Wednesday, May 6, 2009

COGAIN 2009 Program announced

This year's Communication by Gaze Interaction (COGAIN) conference will be held on May 26th in Lyngby, Denmark, in connection with VisionDay (a four-day event on computer vision). Registration should be completed on or before May 14th. Download the program as a PDF.

Update: the proceedings can be downloaded as a PDF.


The program for May 26th
  • 08.00 Registration, exhibition, demonstrations, coffee, and rolls
SESSION I
  • 09.00 Welcome and introduction (Lars Pallesen, Rector @ DTU)
  • 09.10 Eye guidance in natural behaviour (B. W. Tatler)
  • 09.50 Achievements and experiences in the course of COGAIN (K. Räihä)
  • 10.30 Coffee, exhibition, demonstrations
SESSION II
  • 11.00 Joys and sorrows in communicating with gaze (A. Lykke-Larsen)
  • 11.30 An introduction to the 17 papers presented in the afternoon
  • 12.00 Lunch, exhibition, demonstrations, posters
SESSION III Track 1
SESSION III Track 2
  • 14.50 Coffee, exhibition, demonstrations, posters

SESSION IV Track 1
  • 15.30 Gameplay experience in a gaze interaction game (L. Nacke, S. Stellmach, D. Sasse & C. A. Lindley)
  • 15.50 Select commands in 3D game environments by gaze gestures (S. Vickers, H. Istance & A. Hyrskykari)
  • 16.10 GazeTrain: A case study of an action oriented gaze-controlled game (L. F. Laursen & B. Ersbøll)
  • 16.30 Detecting Search and Rescue Targets in Moving Aerial Images using Eye-gaze (J. Mardell, M. Witkowski & R. Spence)
  • 16.50 Feasibility Study for the use of Eye-Movements in Estimation of Answer Correctness (M. Nakayama & Y. Hayashi)
SESSION IV Track 2
  • 15.30 Eye Tracker Connectivity (G. Daunys & V. Vysniauskas)
  • 15.50 SW tool supporting customization of eye tracking algorithms (P. Novák & O. Štepánková)
  • 16.10 Multimodal Gaze-Based Interaction (S. Trösterer & J. Dzaack)
  • 16.30 Gaze Visualization Trends and Techniques (S. Stellmach, L. Nacke, R. Dachselt & C. A. Lindley)
  • 19.00 COGAIN2009 dinner at Brede Spisehus

The Dias Eye Tracker (Mardanbeigi, 2009)

Diako Mardanbeigi at the Iran University of Science & Technology introduces the Dias Eye Tracking suite. It is a low-cost solution employing a head-mounted setup and comes with a rather extensive suite of applications. The software offers gaze control for playing games and music, viewing images, and text-to-speech using a dwell keyboard. It also offers basic eye movement recording and visualization, such as scanpaths. The software is built in Visual Basic 6 and implements various eye tracking algorithms, including a rectangular method and RANSAC or least-squares (LSQ) ellipse/circle fitting. Additionally, there is support for tracking one or two glints. The following video demonstrates the hardware and software. Congratulations Diako on this great work!
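To give a flavor of the LSQ circle fitting mentioned above, here is a minimal Python sketch of the classic linear least-squares (Kåsa) circle fit often used for pupil boundary estimation. This is an illustration of the general technique only, not code from the Dias suite, and the function name is our own.

```python
import numpy as np

def fit_circle_lsq(points):
    """Fit a circle to 2D points via linear least squares (Kåsa method).

    Solves x^2 + y^2 = a*x + b*y + c for (a, b, c), then recovers
    center (a/2, b/2) and radius sqrt(c + (a/2)^2 + (b/2)^2).
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = a / 2.0, b / 2.0
    r = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, r

# Example: points sampled from the rim of a synthetic "pupil"
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
pts = np.column_stack([3 + 10 * np.cos(theta), 4 + 10 * np.sin(theta)])
cx, cy, r = fit_circle_lsq(pts)
```

In a real tracker the input points would come from edge detection on the dark pupil region, and an outer RANSAC loop would reject outliers (eyelid or glint edges) before fitting.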


Tuesday, May 5, 2009

Gaze-Augmented Manual Interaction (Bieg, H. J., 2009)

Hans-Joachim Bieg of the HCI Group at the University of Konstanz has investigated gaze-augmented interaction on very large display areas. The prototype runs on the 221" PowerWall using a head-mounted setup and allows users to select and zoom into an item of interest based on gaze position. An earlier video demonstration of the setup can be found here.

"This project will demonstrate a new approach to employing users’ gaze in the context of human-computer interaction. This new approach uses gaze passively in order to improve the speed and precision of manually controlled pointing techniques. Designing such gaze augmented manual techniques requires an understanding of the principles that govern the coordination of hand and eye. This coordination is influenced by situational parameters (task complexity, input device used, etc.), which this project will explore in controlled experiments."

Gaze-augmented interaction on the 221" PowerWall
  • Bieg, H. 2009. Gaze-augmented manual interaction. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 3121-3124. DOI= http://doi.acm.org/10.1145/1520340.1520442
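The passive use of gaze described in the quote can be sketched in a few lines. The sketch below illustrates the general idea (in the spirit of "MAGIC pointing"-style techniques): when a manual pointing movement begins, the cursor jumps near the gaze position so the hand only performs the final precise adjustment. The function name and threshold are our own assumptions, not Bieg's implementation.

```python
def warp_on_manual_start(cursor, gaze, threshold=200.0):
    """Passive gaze-augmented pointing sketch.

    When the user starts moving the input device, warp the cursor to the
    current gaze point if it is far away; otherwise leave manual control
    untouched. Coordinates are (x, y) pixels; threshold is in pixels.
    """
    dx, dy = gaze[0] - cursor[0], gaze[1] - cursor[1]
    if (dx * dx + dy * dy) ** 0.5 > threshold:
        return gaze   # long-distance jump handled by the eyes
    return cursor     # short, precise movement left to the hand
```

On a 221" display this matters because purely manual travel across several meters of screen is slow, while gaze reaches the target region almost instantly.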

Sunday, May 3, 2009

Laval VRchive @ Tokyo Metropolitan University

Hidenori Watanave at the Tokyo Metropolitan University has released a brief video demonstrating gaze interaction for the Laval VRchive. The VRchive is a virtual reality environment for navigating media content. The setup uses a standalone Tobii 1750 tracker and a projector. The simple interface allows gaze control by looking at the top, bottom, left, or right areas of the display, as well as winking to perform clicks. Although this is an early version and the initial experiments were successful, the software is still unstable and needs further improvement.
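The four-region interface described above amounts to classifying each gaze sample by the screen edge it falls nearest. A minimal sketch of that mapping (our own illustration; the function name and margin value are assumptions, not the VRchive code):

```python
def gaze_region(x, y, width, height, margin=0.25):
    """Map a gaze point to 'top', 'bottom', 'left', 'right', or 'center'.

    `margin` is the fraction of the display reserved for each edge band;
    corners resolve to whichever edge the gaze point is closest to.
    """
    # Normalized distance from the gaze point to each screen edge
    d = {
        'left': x / width,
        'right': (width - x) / width,
        'top': y / height,
        'bottom': (height - y) / height,
    }
    side, dist = min(d.items(), key=lambda kv: kv[1])
    return side if dist < margin else 'center'
```

In practice each sample would first be smoothed, and a region would only trigger navigation after the gaze dwells there briefly, to avoid firing commands on stray fixations.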


Friday, May 1, 2009

Low-Cost Gaze Pointing and EMG Clicking

"Some severely disabled people are excluded from using gaze interaction because gaze trackers are usually expensive (above $10.000). In this paper we present a low-cost gaze pointer, which we have tested in combination with a desktop monitor and a wearable display. It is not as accurate as commercial gaze trackers, and walking while pointing with gaze on a wearable display turned out to be particularly difficult. However, in front of a desktop monitor it is precise enough to support communication. Supplemented with a commercial EMG switch it offers a complete hands-free, gaze-and-click control for less than $200."

  • San Agustin, J., Hansen, J. P., Hansen, D. W., and Skovsgaard, H. 2009. Low-cost gaze pointing and EMG clicking. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 3247-3252. DOI= http://doi.acm.org/10.1145/1520340.1520466

Gaze Controlled Driving

This is the paper on using eye trackers for remote robot navigation that I had accepted for the CHI09 conference. It has now appeared on the ACM website. Note that the webcam tracker referred to in the paper is an earlier incarnation of the ITU Gaze Tracker. The main issue while using it is that head movements affect the gaze position and create an offset. This is easier to correct and counterbalance against a static background than against a moving image (while driving!)

Abstract
"We investigate if the gaze (point of regard) can control a remote vehicle driving on a racing track. Five different input devices (on-screen buttons, mouse-pointing low-cost webcam eye tracker and two commercial eye tracking systems) provide heading and speed control on the scene view transmitted from the moving robot. Gaze control was found to be similar to mouse control. This suggests that robots and wheelchairs may be controlled "hands-free" through gaze. Low precision gaze tracking and image transmission delays had noticeable effect on performance."

  • Tall, M., Alapetite, A., San Agustin, J., Skovsgaard, H. H., Hansen, J. P., Hansen, D. W., and Møllenbach, E. 2009. Gaze-controlled driving. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4387-4392. DOI= http://doi.acm.org/10.1145/1520340.1520671
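The heading-and-speed control mentioned in the abstract can be illustrated with a simple mapping from the gaze point on the video view to steering and throttle commands. This is a sketch of the general idea only; the function, axis convention, and dead zone are our own assumptions, not the paper's implementation.

```python
def gaze_to_drive(x, y, width, height, dead_zone=0.1):
    """Map a gaze point on the scene view to (steering, speed) in [-1, 1].

    Horizontal offset from center steers left/right; vertical offset sets
    speed (look up = forward, look down = reverse). A small dead zone
    around the center keeps the robot still during neutral fixations.
    """
    steer = 2.0 * x / width - 1.0    # -1 at left edge, +1 at right edge
    speed = 1.0 - 2.0 * y / height   # +1 at top edge, -1 at bottom edge
    if abs(steer) < dead_zone:
        steer = 0.0
    if abs(speed) < dead_zone:
        speed = 0.0
    return steer, speed
```

A dead zone like this is one way to soften the noise and head-movement offset mentioned above: small errors around the center do not translate into spurious motion.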

Monday, April 27, 2009

ITU Gaze Tracker: Low-cost gaze interaction: ready to deliver the promises (San Agustin, J et al., 2009)

The research paper on the ITU Gaze Tracker that Javier San Agustin presented at CHI09 is now available on the ACM website. It evaluates a previous version of the gaze tracker in two tasks, target acquisition and eye typing, comparing it with a mouse, the SMI iView X RED, and the Tobii 1750.

Abstract
"Eye movements are the only means of communication for some severely disabled people. However, the high prices of commercial eye tracking systems limit the access to this technology. In this pilot study we compare the performance of a low-cost, web cam-based gaze tracker that we have developed with two commercial trackers in two different tasks: target acquisition and eye typing. From analyses on throughput, words per minute and error rates we conclude that a low-cost solution can be as efficient as expensive commercial systems."


  • San Agustin, J., Skovsgaard, H., Hansen, J. P., and Hansen, D. W. 2009. Low-cost gaze interaction: ready to deliver the promises. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4453-4458.
    Download at ACM website.

A brief user's guide to the ITU Gaze Tracker

Today we release a short user's guide for the open source eye tracker we presented some weeks ago. Hopefully it will help first-time users configure the software and understand the limitations of the initial version. Comments and suggestions are appreciated.


Friday, April 17, 2009

IDG Interview with Javier San Agustin

During CHI09 in Boston last week, Nick Barber from the IDG Network stopped by to record an interview with Javier San Agustin, member of the ITU GazeGroup. The video has now surfaced on several IDG sites around the world; clearly there is interest in easy-to-use, low-cost eye tracking. After the initial release of the ITU Gaze Tracker we have set up a community forum at forum.gazegroup.org, with the ambition of connecting users of open source eye tracking. If you would like to be part of the project, please join in promoting and developing an alternative. It's open and accessible to all (platform documentation will be released next week).

Hopefully, ideas and contributions from the community will make the platform take off. The initial release should be considered a beta version, so there are of course further improvements to make: additional cameras need to be verified and bugs in the code need to be handled.

If you experience any issues or have ideas for improvements please post at http://forum.gazegroup.org



Computerworld.com.au

WebWereld.nl

PCAdvisor.co.uk

TechWorld.nl

IDG.no/ComputerWorld

ComputerWorld.dk

ComputerWorld.hu

ARNnet.com.au

Sunday, April 5, 2009

Introducing the ITU GazeTracker

The ITU Gaze Tracker is an open-source eye gaze tracking application that aims to provide a low-cost alternative to commercial gaze tracking systems, thereby making the technology more accessible. It is being developed by the Gaze Group at the IT University of Copenhagen, supported by the Communication by Gaze Interaction Association (COGAIN). The eye tracking software is video-based, and any camera equipped with infrared night vision can be used, such as a video camera or a webcam. The cameras that have been tested with the system can be found in our forum.

Features:
  • Supports head mounted and remote setups
  • Tracks both pupil and glints
  • Supports a wide variety of camera devices
  • Configurable calibration
  • Eye-mouse capabilities
  • UDPServer broadcasting gaze data
  • Full source code provided


We encourage users and developers to test our software with their cameras and provide feedback so we can continue development. The ITU Gaze Tracker is released under the GPLv3 open source license and the full source code is hosted on SourceForge. It's written in C#, using the Emgu CV wrapper for OpenCV image processing (Microsoft .NET 3.5 required). Once the tracker has been started, it can be configured to broadcast gaze data via UDP, which makes the data easy to pick up in your own applications. We provide a sample client implementation in C#.
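Beyond the C# sample client, picking up the UDP broadcast from another language is straightforward. The sketch below is an illustrative Python client; the port number and packet layout are assumptions here and must be matched to the UDP server settings configured in the tracker.

```python
import socket

GAZE_PORT = 6666  # assumed; set this to the port configured in the tracker's UDP settings

def parse_gaze_packet(payload):
    """Parse a gaze packet into (x, y).

    Assumes the x and y coordinates are the last two whitespace-separated
    fields of the datagram; adjust to the actual format the tracker sends.
    """
    fields = payload.decode('ascii', errors='ignore').split()
    return float(fields[-2]), float(fields[-1])

def listen(port=GAZE_PORT, count=10):
    """Receive `count` gaze samples from the tracker's UDP broadcast."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('', port))  # listen on all interfaces
    try:
        for _ in range(count):
            payload, _addr = sock.recvfrom(1024)
            x, y = parse_gaze_packet(payload)
            print(f'gaze at ({x:.1f}, {y:.1f})')
    finally:
        sock.close()
```

Because the data arrives as plain datagrams, a client like this can feed gaze coordinates into any application loop without linking against the tracker itself.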

Open source eye tracking has never been easier. Download the binaries, plug in the camera, and launch the application. Adjust the sliders to match your camera and start the calibration.

Visit the ITU GazeGroup to download the software package. Please get in touch with us at http://forum.gazegroup.org