As researchers work to increase human-computer interactivity, the lines between the real and digital worlds are blurring. Augmented reality (AR), still in its infancy, may be set to explode. As the founders of Bubbli, a startup developing an AR iPhone app, said in a recent Silicon Valley Blog post by Barry Bazzell: "Once we understand reality through a camera lens … the virtual and real become indistinguishable."
Kevin Kelly, co-founder and senior maverick at Wired magazine, recently pointed out in a keynote speech at TOC 2011 that soon the computers we’re looking at would be looking back at us (the image above is from Kelly’s presentation).
“Soon” turns out to be now: Developers at the Swedish company Tobii Technology have created 20 computers that are controlled by eye movement. Tom Simonite described the technology in a recent post for MIT’s Technology Review:
The two cameras below the laptop’s screen use infrared light to track a user’s pupils. An infrared light source located next to the cameras lights up the user’s face and creates a “glint” in the eyes that can be accurately tracked. The position of those points is used to create a 3-D model of the eyes that is used to calculate what part of the screen the user is looking at; the information is updated 40 times per second.
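The glint-plus-pupil setup Simonite describes is, in broad strokes, the pupil-center-corneal-reflection technique used widely in infrared eye tracking: the offset between the pupil center and the infrared glint changes as the eye rotates, and a calibration step maps that offset to a point on the screen. As a rough illustration only (not Tobii's actual algorithm, which builds a full 3-D eye model), here is a minimal calibration-based mapping, with all names and numbers invented for the sketch:

```python
import numpy as np

def fit_gaze_mapping(pupil_glint_vectors, screen_points):
    """Fit an affine map from pupil-glint offset vectors (camera pixels)
    to screen coordinates, via least squares over calibration samples."""
    V = np.asarray(pupil_glint_vectors, dtype=float)
    S = np.asarray(screen_points, dtype=float)
    A = np.hstack([V, np.ones((len(V), 1))])  # append a bias column
    coeffs, *_ = np.linalg.lstsq(A, S, rcond=None)
    return coeffs  # shape (3, 2): x-weights, y-weights, bias

def gaze_point(coeffs, pupil_glint_vector):
    """Estimate where on the screen the user is looking."""
    v = np.append(np.asarray(pupil_glint_vector, dtype=float), 1.0)
    return v @ coeffs

# Calibration: the user fixates known screen targets while the tracker
# records the pupil-glint offset for each one (values are made up).
calib_vectors = [(-10, -8), (10, -8), (-10, 8), (10, 8)]
calib_targets = [(0, 0), (1920, 0), (0, 1080), (1920, 1080)]
coeffs = fit_gaze_mapping(calib_vectors, calib_targets)

# After calibration, a centered pupil-glint offset should map near
# the center of a 1920x1080 screen.
x, y = gaze_point(coeffs, (0.0, 0.0))
```

In a real system this estimate would be recomputed on every camera frame, which is what the 40-updates-per-second figure in the quote refers to.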
The exciting part here is a comment in the article by Barbara Barclay, general manager of Tobii North America:
We built this conceptual prototype to see how close we are to being ready to use eye tracking for the mass market … We think it may be ready.
Tobii released the following video to explain the system:
This type of increased interaction has potential across industries: intelligent search, AR, personalization, and recommendations, to name just a few channels. Search models built on interaction data gathered directly from the user could also augment the social-signal aggregation that search engine companies currently focus on. Engines could combine what you like with what you see and do.
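To make the search idea concrete, one speculative sketch (entirely hypothetical, not any engine's actual API) is re-ranking results by blending a base relevance score with how long a user's gaze lingered on each result:

```python
def rerank(results, dwell_seconds, weight=0.3):
    """Blend each result's base relevance score with its normalized
    gaze dwell time; higher combined scores rank first.
    All field names and the weighting are illustrative assumptions."""
    max_dwell = max(dwell_seconds.values(), default=0) or 1
    scored = [
        (r["score"] * (1 - weight)
         + weight * dwell_seconds.get(r["id"], 0) / max_dwell, r)
        for r in results
    ]
    return [r for _, r in sorted(scored, key=lambda t: t[0], reverse=True)]

results = [
    {"id": "a", "score": 0.9},
    {"id": "b", "score": 0.8},
]
# The user's eyes lingered far longer on result "b" than on "a".
dwell = {"a": 0.5, "b": 6.0}
ranked = rerank(results, dwell)
```

Here the strong dwell signal on "b" outweighs its slightly lower base score, which is the kind of "what you see" signal eye tracking could feed into ranking.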