Have you ever sat on the bus, texting your BFF about your plans, only to notice the person next to you with their eyes glued to your screen?
A new research project from Google may make that a thing of the past — or at least shame those who take part.
Google researchers Hee Jung Ryu and Florian Schroff have developed a project dubbed "electronic screen protector," which uses the Google Pixel's front-facing camera and gaze-detecting artificial intelligence to tell when more than one person is actively looking at the display. Quartz first spotted the video demonstration, which is public but unlisted, on Ryu's YouTube account.
Video: NIPS 2017 Accompaniment Demo
The demo shows someone sneaking up on an unsuspecting phone user to see what they're doing. The phone then detects the second face and alerts the user that someone else is looking at the screen. The whole thing looks pretty rudimentary, but it's still in the research phase. Ryu and Schroff will present their findings at the Neural Information Processing Systems (NIPS) conference in Long Beach, California, next week.
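To make the idea concrete, here's a minimal sketch of the face-counting half of that demo in Python. To be clear, this uses OpenCV's stock Haar-cascade face detector, not Google's unreleased model, and it skips the gaze check (whether each detected face is actually looking at the screen) that the researchers' system performs:

```python
import cv2

# A rough sketch of the core idea, NOT Google's actual system: count faces
# in camera frames and flag any frame where more than one face is visible.
# The researchers additionally verify gaze; that step is omitted here.

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
cap = cv2.VideoCapture(0)  # 0 = default camera (the front camera on a phone)

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 1:
            # In the demo, the phone overlays an alert at this point.
            print(f"Onlooker alert: {len(faces)} faces in view")
except KeyboardInterrupt:
    pass
finally:
    cap.release()
```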
According to Quartz, the entire algorithm runs on the phone itself and can detect a face across many lighting conditions in just 2 milliseconds. We don't know if this feature will make it to the mass market as part of Android or the Pixel lineup. I'm personally curious whether it could tell when you're intentionally sharing your phone with someone (say, watching a video together) and suppress the alert in that case.
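One plausible (and entirely hypothetical) way to handle that case: a user-toggled sharing mode plus a short grace period, so a second face only triggers the alert when it lingers unexpectedly. The class and parameter names below are my own illustration, not anything from the research project:

```python
import time

class ShareAwareAlert:
    """Hypothetical alert gate: suppress onlooker alerts while the user has
    opted into sharing, and debounce brief glances with a grace period."""

    def __init__(self, grace_seconds: float = 1.5):
        self.grace_seconds = grace_seconds
        self.sharing_mode = False       # user says "we're watching together"
        self._second_face_since = None  # when a second face first appeared

    def update(self, face_count: int, now: float) -> bool:
        """Return True if an onlooker alert should fire for this frame."""
        if self.sharing_mode or face_count <= 1:
            self._second_face_since = None
            return False
        if self._second_face_since is None:
            self._second_face_since = now
        # Only alert once the extra face has lingered past the grace period.
        return now - self._second_face_since >= self.grace_seconds

gate = ShareAwareAlert()
# gate.sharing_mode = True  # e.g., toggled while a video is playing
print(gate.update(face_count=2, now=time.monotonic()))  # False: within grace
```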
Credit: Hee Jung Ryu