In the latest issue of MIT’s Technology Review, there’s an interesting report, An Alternative to the Computer Mouse, on Manu Kumar’s new way to interact with your computer using your eyes. The innovative part comes in the combination of eye and hand: you fix your stare on a given screen region, but that has no effect until you press the appropriate key. Once you do, magic happens. For instance, the region you were looking at gets magnified. Or, if you were reading text near the end of the visible area, it scrolls automatically for you. There’s no secret hardware involved (just a high-resolution webcam and a bunch of infrared light-emitting diodes), although that doesn’t mean the equipment is cheap: Kumar uses a $25,000 monitor! The tricky part turns out to be a good eye-tracking algorithm, because pupils are prone to jitter even when fixated on a point. Kumar’s solution is not yet perfect (tests show that it misbehaves around 20% of the time), but test participants seem to prefer eyes to mice. The system is called EyePoint, and you’ll find some screenshots here, or download a demo video here. The project is part of the Stanford HCI Group’s GUIDe (Gaze-enhanced User Interface Design, if you have to know), which includes other curious applications like EyeExposé (I’ll let you try and imagine what it does).
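To get a feel for the jitter problem, here’s a minimal sketch of one common workaround: smoothing raw gaze samples with a moving average before mapping them to a screen region. The coordinates and window size are made up for illustration, and this is just a generic technique, not EyePoint’s actual (unpublished) algorithm.

```python
from collections import deque

def smooth_gaze(samples, window=5):
    """Return moving-average-smoothed (x, y) gaze points.

    Raw eye-tracker samples wobble around the true fixation point;
    averaging the last `window` samples damps that jitter at the cost
    of a little latency.
    """
    buf = deque(maxlen=window)
    smoothed = []
    for x, y in samples:
        buf.append((x, y))
        sx = sum(p[0] for p in buf) / len(buf)
        sy = sum(p[1] for p in buf) / len(buf)
        smoothed.append((sx, sy))
    return smoothed

# Jittery samples around a nominal fixation at (100, 100):
raw = [(98, 101), (103, 99), (100, 102), (97, 100), (102, 98)]
print(smooth_gaze(raw)[-1])  # prints (100.0, 100.0)
```

The trade-off is typical of gaze interfaces: a bigger window means a steadier cursor but a laggier response, which is one reason a good tracking algorithm is hard to get right.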
Although not as fancy as other exotic input methods we’ve seen in the past, this one looks much more practical: I can perfectly picture myself using EyePoint together with my keyboard and Emacs to browse around during code-editing sessions. I’ll have to wait until prices drop a little and the algorithms get better, but that should only be a matter of time.