Friday, April 30, 2010

GazePad: Low-cost remote webcam eye tracking

Came across the GazeLib low-cost remote eye-tracking project today, which uses ordinary webcams without IR illumination. The accuracy is fairly low, but it's really nice to see another low-cost approach to assistive technology.

"GazeLib is a programming library that makes real-time, low-cost gaze tracking possible. The library provides functions for remote gaze tracking under ambient lighting conditions using a single, low-cost, off-the-shelf webcam. Developers can easily build gaze-tracking applications in only a few lines of code. The GazeLib project focuses on bringing gaze-tracking technology to consumer-grade human-computer interfaces by reducing the price, emphasizing ease of use, increasing extendibility, and enhancing flexibility and mobility."

Monday, April 26, 2010

Freie Universität Berlin presents gaze controlled car

From the Freie Universität Berlin comes a working prototype of a system that allows direct steering of a car by eye movements alone. The prototype was demonstrated in front of a large group of journalists at the former Berlin Tempelhof Airport. Gaze data from a head-mounted SMI eye tracker is fed into the control system of the Spirit of Berlin, a platform for autonomous navigation. Similar to the gaze-controlled robot we presented at CHI09, the platform couples the turning of the wheels to the gaze coordinate space (e.g. look left and the car drives left). Essentially, it is a mapping onto a 2D plane where deviations from the center issue steering commands, and the degree of turning is modulated by the distance from the center. This becomes potentially interesting when coupled with other sensors that in combination offer driver support, for example warning when there is an object in the vehicle's path that the driver has not seen. Not to mention scenarios involving individuals with disabilities and/or machine learning. The work has been carried out under the guidance of professor Raúl Rojas as part of the AutoNOMOS project, which has been running since 2006, inspired by the Stanford autonomous car project.
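The press material does not describe the control code itself, but the mapping described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the function name, dead-zone size and maximum angle are my own assumptions, not details of the Spirit of Berlin system): a normalized horizontal gaze coordinate is compared against the screen center, small deviations are ignored, and larger ones produce a proportionally stronger steering command.

```python
def gaze_to_steering(gaze_x, dead_zone=0.05, max_angle=30.0):
    """Map a normalized horizontal gaze coordinate (0.0 = far left,
    1.0 = far right) to a signed steering angle in degrees.

    Deviations inside the dead zone around the center are ignored so
    the car drives straight; beyond it, the turn angle grows
    proportionally with the distance from the center (look further
    left -> steer harder left). Values are illustrative only.
    """
    deviation = gaze_x - 0.5              # signed distance from screen center
    if abs(deviation) <= dead_zone:
        return 0.0                        # gaze near center: no steering command
    # Scale the deviation beyond the dead zone into [0, 1], then to an angle
    usable = 0.5 - dead_zone
    magnitude = (abs(deviation) - dead_zone) / usable
    angle = max_angle * min(magnitude, 1.0)
    return angle if deviation > 0 else -angle
```

Looking at the far left edge would then command a full left turn, while a glance just off-center does nothing, which is one simple way to keep the vehicle stable despite the jitter inherent in gaze data.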

More info in the press-release.

Wednesday, April 14, 2010

Open-source gaze tracker awarded Research Pearls of ITU Copenhagen

The open-source eye tracker ITU Gaze Tracker, primarily developed by Javier San Agustin, Henrik Skovsgaard and myself, has been awarded the Research Pearls of the IT University of Copenhagen. A presentation will be held at ITU on May 6th at 2pm. The software, released one year ago, has seen more than 5,000 downloads by students and hobbyists around the world. We are rapidly approaching a new release, which will offer better performance and stability for remote tracking and many bug fixes in general. The new version adds support for a whole range of new HD web cameras, which provide vastly improved image quality and finally bring hope for a low-cost, open, flexible and reasonably performing solution. The ambitious goal is to make eye-tracking technology available to everyone, regardless of available resources. Follow the developments at the forum. Additional information is available at the ITU Gaze Group.

"The Open-Source ITU Gaze Tracker"

Gaze tracking offers users with severe disabilities the possibility of interacting with a computer by using eye movements alone, thereby making them more independent. However, many are excluded from access to gaze interaction due to the high prices of commercial systems (above 10.000€). Gaze-tracking systems built from low-cost, off-the-shelf components have the potential to facilitate access to the technology and bring prices down.

The ITU Gaze Tracker is an off-the-shelf system that uses an inexpensive webcam or a video camera to track the user's eye. It is free and open source, offering users the possibility of trying out gaze-interaction technology for a cost as low as 20€, and of adapting and extending the software to suit specific needs.

In this talk we will present the open-source ITU Gaze Tracker and show the different scenarios in which the system has been used and evaluated.

Monday, April 12, 2010

Digital avatars get human-like eye movements

William Steptoe of University College London has had his research on using eye tracking to give digital avatars human-like eye movements covered in an article by New Scientist. It turns out that "on average, the participants were able to identify 88 per cent of truths correctly when the avatars had eye movement, but only 70 per cent without. Spotting lies was harder, but eye movement helped: 48 per cent accuracy compared with 39 per cent without. Steptoe will present the results at the 2010 Conference on Human Factors in Computing Systems in Atlanta, Georgia, next week."

Eye tracking in the wild: Consumer decision-making process at the supermarket

Kerstin Gidlöf from the Lund University Humlab talks about the visual appearance of consumer products in the supermarket and how graphical layout modulates our attention. Perhaps free will is just an illusion; then again, the number of items in my fridge containing faces equals zero. Is it me or the store I'm shopping at?