Wednesday, July 22, 2009
TeleGaze update
Update: The new version includes an automatic "person-following" mode that can be turned on or off through the interface. See the video below.
Gaze Interaction in Immersive Virtual Reality - 3D Eye Tracking in Virtual Worlds
Publications
- Pfeiffer, T. (2008). Towards Gaze Interaction in Immersive Virtual Reality: Evaluation of a Monocular Eye Tracking Set-Up. In Virtuelle und Erweiterte Realität - Fünfter Workshop der GI-Fachgruppe VR/AR, 81-92. Aachen: Shaker Verlag GmbH. [Abstract] [BibTeX] [PDF]
- Pfeiffer, T., Latoschik, M.E. & Wachsmuth, I. (2008). Evaluation of Binocular Eye Trackers and Algorithms for 3D Gaze Interaction in Virtual Reality Environments. Journal of Virtual Reality and Broadcasting, 5(16), December. [Abstract] [BibTeX] [URL] [PDF]
- Pfeiffer, T., Donner, M., Latoschik, M.E. & Wachsmuth, I. (2007). 3D fixations in real and virtual scenarios. Journal of Eye Movement Research, Special issue: Abstracts of the ECEM 2007, 13.
- Pfeiffer, T., Donner, M., Latoschik, M.E. & Wachsmuth, I. (2007). Blickfixationstiefe in stereoskopischen VR-Umgebungen: Eine vergleichende Studie [Gaze fixation depth in stereoscopic VR environments: A comparative study]. In Vierter Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, 113-124. Aachen: Shaker. [Abstract] [BibTeX] [PDF]
A list of all publications is available here.
Wednesday, July 15, 2009
Gaze & Voice recognition game development blog
Keep us posted, Jonathan. Excited to see what you'll come up with!
Update:
The project resulted in the Rabbit Run game which is documented in the following publication:
- O’Donovan, J., Ward, J., Hodgins, S. & Sundstedt, V. (2009). Rabbit Run: Gaze and Voice Based Game Interaction. [PDF]
Monday, July 13, 2009
Oculis Labs' Chameleon prevents over-the-shoulder reading
Patent application
Article by the Baltimore Sun
Monday, June 29, 2009
Video from COGAIN2009
Monday, June 1, 2009
COGAIN 2009 Proceedings now online
Tuesday, May 26, 2009
Toshiba eye tracking for automotive applications
Via Donald Melanson at Engadget:
"We've seen plenty of systems that rely on facial recognition for an interface, but they've so far been a decidedly rarer occurrence when it comes to in-car systems. Toshiba looks set to change that, however, with it now showing off a new system that'll not only let you control the A/C or radio with the glance of your eye, but alert you if you happen to take your eyes off the road for too long. That's done with the aid of a camera mounted above the steering wheel that's used to identify and map out the driver's face, letting the car (or desktop PC in this demonstration) detect everything from head movement and eye direction to eyelid blinks, which Toshiba says could eventually be used to alert drowsy drivers. Unfortunately, Toshiba doesn't have any immediate plans to commercialize the technology, although it apparently busily working to make it more suited for embedded CPUs." (source)
Tuesday, May 19, 2009
Hands-free Interactive Image Segmentation Using Eyegaze (Sadeghi, M. et al., 2009)
Abstract
"This paper explores a novel approach to interactive user-guided image segmentation, using eyegaze information as an input. The method includes three steps: 1) eyegaze tracking for providing user input, such as setting object and background seed pixel selection; 2) an optimization method for image labeling that is constrained or affected by user input; and 3) linking the two previous steps via a graphical user interface for displaying the images and other controls to the user and for providing real-time visual feedback of eyegaze and seed locations, thus enabling the interactive segmentation procedure. We developed a new graphical user interface supported by an eyegaze tracking monitor to capture the user's eyegaze movement and fixations (as opposed to traditional mouse moving and clicking). The user simply looks at different parts of the screen to select which image to segment, to perform foreground and background seed placement and to set optional segmentation parameters. There is an eyegaze-controlled "zoom" feature for difficult images containing objects with narrow parts, holes or weak boundaries. The image is then segmented using the random walker image segmentation method. We performed a pilot study with 7 subjects who segmented synthetic, natural and real medical images. Our results show that getting used the new interface takes about only 5 minutes. Compared with traditional mouse-based control, the new eyegaze approach provided a 18.6% speed improvement for more than 90% of images with high object-background contrast. However, for low contrast and more difficult images it took longer to place seeds using the eyegaze-based "zoom" to relax the required eyegaze accuracy of seed placement." Download paper as pdf.
The custom interface is used to place background (red) and object (green) seeds, which are then used in the segmentation process. The custom fixation detection algorithm triggers a mouse click at the gaze position if 20 of the previous 30 gaze samples lie within a 50-pixel radius, as sketched below.
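A minimal sketch of that dispersion rule; measuring the radius from the centroid of the sample window is my assumption, as the 20-of-30-within-50-pixels criterion is all the paper states:

```python
from collections import deque
from math import hypot

# Dispersion-based fixation trigger: 20 of the last 30 gaze samples
# within a 50-pixel radius fires a synthesized click.
WINDOW, REQUIRED, RADIUS = 30, 20, 50.0
samples = deque(maxlen=WINDOW)

def on_gaze_sample(x: float, y: float):
    """Feed one gaze sample; return a click position, or None if no fixation yet."""
    samples.append((x, y))
    if len(samples) < WINDOW:
        return None
    cx = sum(px for px, _ in samples) / WINDOW   # centroid of the window
    cy = sum(py for _, py in samples) / WINDOW
    inside = sum(1 for px, py in samples if hypot(px - cx, py - cy) <= RADIUS)
    return (cx, cy) if inside >= REQUIRED else None  # click at the centroid
```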
The results indicate a certain degree of feasibility for gaze-assisted segmentation; however, real-life situations often involve more complex images where object borders are less well defined. This is also reflected in the results, where the CT brain scan represents the difficult category. For an initial study the results are interesting, and it is likely that we will see more gaze interaction within domain-specific applications in the near future.
- Maryam Sadeghi, Geoff Tien, Ghassan Hamarneh, and Stella Atkins. Hands-free Interactive Image Segmentation Using Eyegaze. In SPIE Medical Imaging 2009: Computer-Aided Diagnosis. Proceedings of the SPIE, Volume 7260 (pdf)
Wednesday, May 13, 2009
GaCIT 2009: Summer School on Gaze, Communication, and Interaction Technology
The GaCIT workshop is organized by the graduate school on User-Centered Information Technology at the University of Tampere, Finland (map). The workshop runs from July 27 to 31. I attended last year and found it to be a great week with interesting talks and social events. See the day-by-day coverage of GaCIT 2008.
Topics and speakers:
Introduction to Gaze-based Communication (Howell Istance)
Evaluation of Text Entry Techniques (Scott MacKenzie)
Survey of text entry methods. Models, metrics, and procedures for evaluating text entry methods.
Details of Keyboards and Users Matter (Päivi Majaranta)
Issues specific to soft-keyboard use with eye trackers. Special issues in evaluating text entry techniques with users who rely on eye trackers for communication.
Communication by Eyes without Computers (TBA)
Introduction to eye-based communication using low-tech devices.
Gesture-based Text Entry Techniques (Poika Isokoski)
Overview of studies evaluating techniques such as Dasher, Quikwriting and EdgeWrite in the eye-tracker context.
Low-cost Devices and the Future of Gaze-based Text Entry (John Paulin Hansen)
Low-cost eye tracking and its implications for text entry systems. The future of gaze-based text entry.
Dwell-free Text Entry Techniques (Anke Huckauf)
Introduction to gaze-based techniques that do not use the dwell-time protocol for item selection.
Hi-fi eyetracking with a lo-fi eyetracker: An experimental usability study of an eyetracker built from a standard web camera (Barret, M., 2009)
"Innovation has facilitated sufficient mainstream technology to build eyetrackers from off-the-shelf-components. Prices for standard eyetrackers start at around € 4000. This thesis describes an experimental usabilty study of gazetyping with a new input device built from a standard web camera without hardware modifications. Cost: € 20. Mainstreaming of assistive technologies holds potential for faster innovation, better service, lower prices and increased accessibility. Off-the-shelf-eyetrackers must be usability competitive to standard eyetrackers in order to be adopted, as eyetracking - even with expensive hardware - presents usability issues. Usability is defined as effectiveness, efficiency and user satisfaction (ISO 9242-11, 1998).
Results from the 2 × 2 factorial experiment indicate that the new input device can reach the usability standards of expensive eyetrackers. This study demonstrates that the off-the-shelf eyetracker can achieve efficiency similar to an expensive eyetracker, with no significant effect from any of the tested factors. All four factors have a significant impact on effectiveness. A factor that can eliminate the effectiveness difference between the standard-hardware device and an expensive eyetracker is identified. Another factor can further improve effectiveness.
Two gaze typing systems specifically designed for noisy conditions, e.g. due to bad calibration and jolting, are tested. StarGazer uses a zooming interface and GazeTalk uses large buttons in a static graphical user interface. GazeTalk is significantly more effective than StarGazer. The large on-screen buttons and static interface of GazeTalk with dwell-time activation absorb the noise from the input device, and the typing speeds obtained are comparable to prior research with a regular eyetracker. Click activation has long been shown (Ware & Mikaelian, 1987) to improve the efficiency of gaze-based interaction. This experiment demonstrates that this result applies significantly to off-the-shelf eyetrackers as well. The input device relies on the user to compensate for offset with head movements. The keyboards should support this task with a static graphical user interface." Download thesis as pdf (in Danish)
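To make the two activation schemes concrete, here is a minimal sketch of a gaze-typing key that supports both dwell-time and click activation. The 0.8-second dwell time and the GazeKey API are illustrative assumptions, not values or code from the thesis:

```python
# Hypothetical gaze-typing key contrasting dwell-time selection (gaze rests
# on the key long enough) with click activation (gaze points, a switch selects).
DWELL_TIME_S = 0.8  # assumed dwell threshold, for illustration only

class GazeKey:
    def __init__(self, label: str):
        self.label = label
        self.dwell_start = None  # when the gaze entered this key

    def update(self, gazed_at: bool, now: float, clicked: bool = False):
        """Feed one gaze frame; return the key's label on activation, else None."""
        if not gazed_at:
            self.dwell_start = None           # gaze left the key: reset dwell
            return None
        if clicked:                            # click activation: select immediately
            return self.label
        if self.dwell_start is None:
            self.dwell_start = now             # gaze just arrived: start dwell timer
        if now - self.dwell_start >= DWELL_TIME_S:
            self.dwell_start = None            # reset so the key does not auto-repeat
            return self.label                  # dwell activation
        return None
```

The static-layout point from the thesis shows up here as a design constraint rather than code: because the user compensates for tracker offset with head movements, keys should stay in fixed screen positions so that this compensation remains learnable.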