Tuesday, August 11, 2009

ALS Society of British Columbia announces Engineering Design Awards (Canadian students only)

"The ALS Society of British Columbia has established three Awards to encourage and recognize innovation in technology to substantially improve the quality of life of people living with ALS (Amyotrophic Lateral Sclerosis, also known as Lou Gehrig’s Disease). Students at the undergraduate or graduate level in engineering or a related discipline at a post-secondary institution in British Columbia or elsewhere in Canada are eligible for the Awards. Students may be considered individually or as a team. Mentor Awards may also be given to faculty supervising students who win awards" (see Announcement)


Project ideas:
  • Low-cost eye tracker
    • Issue: Current commercial eye-gaze tracking systems cost thousands to tens of thousands of dollars. The high cost of eye-gaze trackers prevents potential users from accessing eye-gaze tracking tools. The hardware components required for eye-gaze tracking do not justify the price, and a lower-cost alternative is desirable. Webcams may be used for low-cost imaging, along with simple infrared diodes for system lighting. Alternatively, visible-light systems may also be investigated. Open-source eye-gaze tracking software is also available (ed: ITU GazeTracker, OpenEyes, Track Eye, OpenGazer, and MyEye (free, no source)).
    • Goal: The goal of this design project is to develop a low-cost and usable eye-gaze tracking system based on simple commercial off-the-shelf hardware.
    • Deliverables: A working prototype of a functional, low-cost (< $200), eye-gaze tracking system.
  • Eye-glasses compensation
    • Issue: The use of eye-glasses can cause considerable problems in eye-gaze tracking. The issue stems from reflections off the eye-glasses due to the controlled infrared lighting (on- and off-axis light sources) used to highlight features of the face. The key features of interest are the pupils and glints (reflections off the surface of the cornea). Incorrectly identifying the pupils and glints then results in invalid estimation of the point-of-gaze.
    • Goal: The goal of this design project is to develop techniques for either: 1) avoiding image corruption with eye-glasses on a commercial eye-gaze tracker, or 2) developing a controlled lighting scheme to ensure valid pupil and glint identification in the presence of eye-glasses.
    • Deliverables: Two forms of deliverables are possible: 1) A working prototype illustrating functional eye-gaze tracking in the presence of eye-glasses with a commercial eye-gaze tracker, or 2) A working prototype illustrating accurate real-time identification of the pupil and glints using controlled infrared lighting (on and off axis light sources) in the presence of eye-glasses.
  • Innovative selection with ALS and eye gaze
    • Issue: As mobility steadily decreases in the more advanced stages of ALS, alternative techniques for selection are required. Current solutions include head switches, sip-and-puff switches and dwell-time activation, to name a few, depending on the degree of mobility loss. Dwell time requires no mobility other than eye motion; however, this technique suffers from 'lag', in that the user must wait the dwell-time duration for each selection, as well as the 'Midas touch' problem, in which unintended selections occur if the gaze point is stationary for too long.
    • Goal: The goal of this design project is to develop a technique for improved selection with eye-gaze for individuals with only eye-motion available. Possible solutions may involve novel HCI designs for interaction, including various adaptive and predictive technologies, the consideration of contextual cues, and the introduction of ancillary inputs, such as EMG or EEG.
    • Deliverables: A working prototype illustrating eye-motion only selection with a commercial eye-gaze tracking system.
  • Novel and valuable eye-gaze tracking applications and application enhancements
    • Issue: To date, relatively few gaze-tracking applications have been developed. These include relatively simplistic applications such as the tedious typing of words, and even in such systems, little is done to ease the effort required, e.g., systems typically do not allow for the saving and reuse of words and sentences.
    • Goal: The goal of this design project is to develop one or more novel applications or application enhancements that take gaze as input, and that provide new efficiencies or capabilities that could significantly improve the quality of life of those living with ALS.
    • Deliverables: A working prototype illustrating one or more novel applications that take eye-motion as an input. The prototype must be developed and implemented to the extent that the potential efficiencies and/or reductions in effort can be evaluated by persons living with ALS and others on an evaluation panel.

    See the Project Ideas for more information. For contact information see page two of the announcement.

Lund Eye-Tracking Academy (LETA)

Kenneth Holmqvist and his team at the Humanities Lab at Lund University, Sweden, will host another three-day LETA training course in eye tracking and analysis of eye-movement data. This is an excellent opportunity to get hands-on experience using state-of-the-art equipment and setting up experiments. The course runs 23rd-25th September and registration is open.

Course contents
• Pros and cons of head-mounted, remote and contact eye-trackers.
• High sampling speed and detailed precision – who needs it?
• Gaze-overlaid videos vs datafiles – what can you do with them?
• How to set up and calibrate on a variety of subjects on different eye-trackers?
• Glasses, lenses, mascara, and drooping eye-lids – what to do?
• How to work with stimulus programs, and synchronize them with eye-tracking recording?
• How to deal with the consent forms and ethical issues?
• Short introduction to experimental design: Potentials and pitfalls.
• Visualisation of data vs number crunching.
• Fast data analysis of multi-user experiments.
• Fixation durations, saccadic amplitudes, transition diagrams, group similarity measures, and all the other measures – what do they tell us? What are the pitfalls?

Teaching methods
Lectures on selected topics (8h)
Hands-on work in our lab on prespecified experiments: receiving and recording on a subject (9h). Hands-on initial data analysis (3h).

Eye-tracking systems available for this training
2*SMI HED 50 Hz with Polhemus Head-tracking
3*SMI HiSpeed 240/1250 Hz
1*SMI RED-X remote 50 Hz
2*SMI HED-mobile 50/200 Hz

Thursday, August 6, 2009

Päivi Majaranta PhD Thesis on Text Entry by Eye Gaze

The most complete publication on gaze typing is now available, as Päivi Majaranta at the University of Tampere has successfully defended her PhD thesis. It summarizes previous work and discusses/exemplifies important topics such as word prediction, layout, feedback and user aspects. The material is presented in a straightforward manner with a clear structure and excellent illustrations. It will without doubt be useful for anyone who is about to design and develop a gaze-based text entry interface. Congratulations, Päivi, on such a well-written thesis.



Friday, July 31, 2009

SMI RED 250!


Today SMI announced the new RED250 which, as the name suggests, has an impressive 250 Hz sampling rate. It has an accuracy of 0.5 degrees or below (typ.), less than 10 ms latency, and operates within a 60-80 cm head distance. The track-box is 40x40 cm at 70 cm distance, and the system will recover tracking faster than the previous model. No details on pricing yet, but top-of-the-line performance never comes cheap. Get the flyer as pdf.

Survey on gaze visualization in 3D virtual environments

Got an email today from Sophie Stellmach, a PhD student in the User Interface & Software Engineering group at the Otto-von-Guericke University in Germany. She has posted an online survey and would like some input from eye-tracking specialists on 3D gaze visualization.

"In the course of my work I have developed several gaze visualizations for facilitating eye tracking studies in (static) three-dimensional virtual environments. In order to evaluate the potential utility of these techniques, I am conducting an online survey with eye tracking researchers and professionals. I would like to invite you to this survey as I think that your answers are highly valuable for this investigation. The survey should take less than 10 minutes of your time! Your answers will be stored anonymously. You can access the survey under the following link: http://gamescience.bth.se/survey/index.php?sid=27319=en "

Wednesday, July 22, 2009

Telegaze update

Remember the TeleGaze robot developed by Hemin Omer which I wrote about last September? Today there is a new video available showing an updated interface which appears to be somewhat improved; no further information is available.
Update: The new version includes an automatic "person-following" mode which can be turned on or off through the interface. See video below

Gaze Interaction in Immersive Virtual Reality - 3D Eye Tracking in Virtual Worlds

Thies Pfeiffer (blog), working in the A.I. group at the Faculty of Technology, Bielefeld University in Germany, has presented some interesting research on 3D gaze interaction in virtual environments. As the video demonstrates, they have achieved high accuracy for gaze-based pointing and selection. This opens up a wide range of interesting man-machine interaction where digital avatars may mimic natural human behavior. Impressive.



Publications
  • Pfeiffer, T. (2008). Towards Gaze Interaction in Immersive Virtual Reality: Evaluation of a Monocular Eye Tracking Set-Up. In Virtuelle und Erweiterte Realität - Fünfter Workshop der GI-Fachgruppe VR/AR, 81-92. Aachen: Shaker Verlag GmbH. [Abstract] [BibTeX] [PDF]
  • Pfeiffer, T., Latoschik, M.E. & Wachsmuth, I. (2008). Evaluation of Binocular Eye Trackers and Algorithms for 3D Gaze Interaction in Virtual Reality Environments. Journal of Virtual Reality and Broadcasting, 5 (16), dec. [Abstract] [BibTeX] [URL] [PDF]
  • Pfeiffer, T., Donner, M., Latoschik, M.E. & Wachsmuth, I. (2007). 3D fixations in real and virtual scenarios. Journal of Eye Movement Research, Special issue: Abstracts of the ECEM 2007, 13.
  • Pfeiffer, T., Donner, M., Latoschik, M.E. & Wachsmuth, I. (2007). Blickfixationstiefe in stereoskopischen VR-Umgebungen: Eine vergleichende Studie. In Vierter Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, 113-124. Aachen: Shaker. [Abstract] [BibTeX] [PDF]
List of all publications available here.

Wednesday, July 15, 2009

Gaze & Voice recognition game development blog

Jonathan O'Donovan, a master's student in Interactive Entertainment Technology at Trinity College Dublin, has recently started a blog for his thesis. It will combine gaze and voice recognition to develop a new video game. So far the few posts available have mainly concerned the underlying framework, but a proof-of-concept combining gaze and voice is demonstrated. The project will be developed on a Microsoft Windows based platform and utilizes the XNA game development framework for graphics and the Microsoft Speech SDK for voice input. The eye tracker of choice is a Tobii T60 provided by Acuity ETS (Reading, UK). The thesis will be supervised by Veronica Sundstedt at the Trinity College Computer Science department.
Keep us posted, Jonathan; excited to see what you'll come up with!

Update: The project resulted in the Rabbit Run game, which is documented in the following publication:

  • J. O’Donovan, J. Ward, S. Hodgins, V. Sundstedt (2009) Rabbit Run: Gaze and Voice Based Game Interaction (PDF). 

Monday, July 13, 2009

Oculis Labs' Chameleon prevents over-the-shoulder reading

"Two years ago computer security expert Bill Anderson read about scientific research on how the human eye moves as it reads and processes text and images. 'This obscure characteristic... suddenly struck me as (a solution to) a security problem,' says Anderson. With the help of a couple of software developers, Anderson developed a software program called Chameleon that tracks a viewer's gaze patterns and only allows an authorized user to read text on the screen, while everyone else sees gibberish. Chameleon uses gaze-tracking software and camera equipment to track an authorized reader's eyes to show only that one person the correct text. After a 15-second calibration period in which the software learns the viewer's gaze patterns, anyone looking over that user's shoulder sees dummy text that randomly and constantly changes. To tap the broader consumer market, Anderson built a more consumer-friendly version called PrivateEye, which can work with a simple Webcam to blur a user's monitor when he or she turns away. It also detects other faces in the background, and a small video screen pops up to alert the user that someone is looking at the screen. 'There have been inventions in the space of gaze-tracking. There have been inventions in the space of security,' says Anderson. 'But nobody has put the two ideas together, as far as we know.'" (source)

Patent application
Article by Baltimore Sun

Monday, June 29, 2009

Video from COGAIN2009

John Paulin Hansen has posted a video showing some highlights from the annual COGAIN conference. It demonstrates three available gaze interaction solutions: the COGAIN GazeTalk interface, Tobii Technology's MyTobii and Alea Technologies' IG-30. These interfaces rely on dwell-activated on-screen keyboards (i.e., same procedure as last year).


Monday, June 1, 2009

COGAIN 2009 Proceedings now online

There is little reason to doubt the vitality of the COGAIN network. This year's proceedings present an impressive 18 papers spread out over one hundred pages. They cover a wide range of areas, from low-cost eye tracking, text entry, gaze input for gaming and multimodal interaction to environment control, clinical assessments and case studies. Unfortunately I was unable to attend the event this year (recently relocated), but with the hefty proceedings available online there is plenty of material to read through (program and links to authors here). Thanks to Arantxa Villanueva, John Paulin Hansen and Bjarne Kjaer Ersboll for the editorial effort.

Tuesday, May 26, 2009

Toshiba eye tracking for automotive applications

Seen this one coming for a while. Wonder how stable it would be in a real-life scenario...
Via Donald Melanson at Engadget:
"We've seen plenty of systems that rely on facial recognition for an interface, but they've so far been a decidedly rarer occurrence when it comes to in-car systems. Toshiba looks set to change that, however, with it now showing off a new system that'll not only let you control the A/C or radio with the glance of your eye, but alert you if you happen to take your eyes off the road for too long. That's done with the aid of a camera mounted above the steering wheel that's used to identify and map out the driver's face, letting the car (or desktop PC in this demonstration) detect everything from head movement and eye direction to eyelid blinks, which Toshiba says could eventually be used to alert drowsy drivers. Unfortunately, Toshiba doesn't have any immediate plans to commercialize the technology, although it apparently busily working to make it more suited for embedded CPUs." (source)

Tuesday, May 19, 2009

Hands-free Interactive Image Segmentation Using Eyegaze (Sadeghi, M. et al, 2009)

Maryam Sadeghi, a master's student at the Medical Image Analysis Lab at Simon Fraser University in Canada, presents an interesting paper on using eye tracking for gaze-driven image segmentation. The research has been performed in cooperation with Geoff Tien (Ph.D. student), Dr. Hamarneh and Stella Atkins (principal investigators). More information is to be published on this page. Geoff Tien completed his M.Sc. thesis on gaze interaction in March under the title "Building Interactive Eyegaze Menus for Surgery" (abstract); unfortunately, I have not been able to locate an electronic copy of that document.

Abstract
"This paper explores a novel approach to interactive user-guided image segmentation, using eyegaze information as an input. The method includes three steps: 1) eyegaze tracking for providing user input, such as setting object and background seed pixel selection; 2) an optimization method for image labeling that is constrained or affected by user input; and 3) linking the two previous steps via a graphical user interface for displaying the images and other controls to the user and for providing real-time visual feedback of eyegaze and seed locations, thus enabling the interactive segmentation procedure. We developed a new graphical user interface supported by an eyegaze tracking monitor to capture the user's eyegaze movement and fixations (as opposed to traditional mouse moving and clicking). The user simply looks at different parts of the screen to select which image to segment, to perform foreground and background seed placement and to set optional segmentation parameters. There is an eyegaze-controlled "zoom" feature for difficult images containing objects with narrow parts, holes or weak boundaries. The image is then segmented using the random walker image segmentation method. We performed a pilot study with 7 subjects who segmented synthetic, natural and real medical images. Our results show that getting used the new interface takes about only 5 minutes. Compared with traditional mouse-based control, the new eyegaze approach provided a 18.6% speed improvement for more than 90% of images with high object-background contrast. However, for low contrast and more difficult images it took longer to place seeds using the eyegaze-based "zoom" to relax the required eyegaze accuracy of seed placement." Download paper as pdf.

The custom interface is used to place background (red) and object (green) seeds, which are used in the segmentation process. The custom fixation detection algorithm triggers a mouse click at the gaze position if 20 of the previous 30 gaze samples lie within a 50-pixel radius.
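As a rough illustration of that dwell rule, here is a minimal sketch (my own, in Python; not the authors' code) of a sample-window fixation detector that fires a click when 20 of the last 30 gaze samples fall within a 50-pixel radius. The paper does not state the reference point for the radius, so using the window centroid below is an assumption.

```python
from collections import deque
import math

class FixationClickDetector:
    """Sample-window dwell detector: report a click when at least `min_hits`
    of the last `window` gaze samples lie within `radius` pixels of the
    window's centroid (the centroid as reference point is an assumption)."""

    def __init__(self, window=30, min_hits=20, radius=50.0):
        self.samples = deque(maxlen=window)
        self.min_hits = min_hits
        self.radius = radius

    def add_sample(self, x, y):
        """Feed one gaze sample; return (cx, cy) to click at, or None."""
        self.samples.append((x, y))
        if len(self.samples) < self.samples.maxlen:
            return None
        cx = sum(px for px, _ in self.samples) / len(self.samples)
        cy = sum(py for _, py in self.samples) / len(self.samples)
        hits = sum(1 for px, py in self.samples
                   if math.hypot(px - cx, py - cy) <= self.radius)
        if hits >= self.min_hits:
            self.samples.clear()  # reset so one fixation yields one click
            return (cx, cy)
        return None
```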


The results indicate a certain degree of feasibility for gaze-assisted segmentation; however, real-life situations often contain more complex images where the borders of objects are less defined. This is also indicated in the results, where the CT brain scan represents the difficult category. For an initial study the results are interesting, and it is likely that we will see more gaze interaction within domain-specific applications in the near future.


  • Maryam Sadeghi, Geoff Tien, Ghassan Hamarneh, and Stella Atkins. Hands-free Interactive Image Segmentation Using Eyegaze. In SPIE Medical Imaging 2009: Computer-Aided Diagnosis. Proceedings of the SPIE, Volume 7260 (pdf)

Wednesday, May 13, 2009

GaCIT 2009 : Summer School on Gaze, Communication, and Interaction Technology

"The GaCIT summer school offers an intensive one-week camp where doctoral students and researchers can learn and refresh skills and knowledge related to gaze-based text entry under the tutelage of leading experts in the area. The program will include theoretical lectures and hands-on exercises, an opportunity for participants to present their own work, and a social program enabling participants to exchange their experiences in a relaxing and inspiring atmosphere."

The GaCIT workshop is organized by the graduate school on User-Centered Information Technology at the University of Tampere, Finland (map). The workshop runs July 27-31. I attended last year and found it to be a great week with interesting talks and social events. See the day-by-day coverage of GaCIT 2008.

Topics and speakers:
  • Introduction to Gaze-based Communication (Howell Istance)

  • Evaluation of Text Entry Techniques (Scott MacKenzie)
    Survey of text entry methods. Models, metrics, and procedures for evaluating text entry methods.

  • Details of Keyboards and Users Matter (Päivi Majaranta)
    Issues specific to eye-tracker use of soft keyboards, special issues in evaluating text entry techniques with users that use eye trackers for communication.

  • Communication by Eyes without Computers (TBA)
    Introduction to eye-based communication using low-tech devices.

  • Gesture-based Text Entry Techniques (Poika Isokoski)
    Overview of studies evaluating techniques such as Dasher, QuikWrite and EdgeWrite in the eye-tracker context.

  • Low-cost Devices and the Future of Gaze-based Text Entry (John Paulin Hansen)
    Low-cost eye tracking and its implications for text entry systems. Future of gaze-based text entry.

  • Dwell-free text entry techniques (Anke Huckauf)
    Introduction to gaze-based techniques that do not utilize the dwell-time protocol for item selection.

Visit the GaCIT 2009 website for more information.

Hi-fi eyetracking with a lo-fi eyetracker: An experimental usability study of an eyetracker built from a standard web camera (Barret, M., 2009)

Marie Barret, a master's student at the IT University of Copenhagen, has now finished her thesis. It evaluates eye-typing performance using the ITU Gaze Tracker (a low-cost webcam eye tracker) in the StarGazer and GazeTalk interfaces. The thesis is written in Danish (113 pages), but I took the liberty of translating two charts from it, found below. The results will be presented in English at the COGAIN 2009 conference, May 26th (session three, track one, at 1:50 PM). For now, I quote the abstract:

"Innovation has facilitated sufficient mainstream technology to build eyetrackers from off-the-shelf-components. Prices for standard eyetrackers start at around € 4000. This thesis describes an experimental usabilty study of gazetyping with a new input device built from a standard web camera without hardware modifications. Cost: € 20. Mainstreaming of assistive technologies holds potential for faster innovation, better service, lower prices and increased accessibility. Off-the-shelf-eyetrackers must be usability competitive to standard eyetrackers in order to be adopted, as eyetracking - even with expensive hardware - presents usability issues. Usability is defined as effectiveness, efficiency and user satisfaction (ISO 9242-11, 1998).

Results from the 2 * 2 factors experiment significantly indicate how the new input device can reach the usability standards of expensive eyetrackers. This study demonstrates that the off-the-shelf-eyetracker can achieve efficiency similar to an expensive eyetracker with no significant effect from any of the tested factors. All four factors have significant impact on effectiveness. A factor that can eliminate the effectiveness difference between the standard hardware and an expensive eyetracker is identified. Another factor can additionally improve effectiveness.

Two gazetyping systems specifically designed for noisy conditions e.g. due to bad calibration and jolting are tested. StarGazer uses a zooming interface and GazeTalk uses large buttons in a static graphic user interface. GazeTalk is significantly more effective than StarGazer. The large onscreen buttons and static interface of GazeTalk with dwell time activation absorb the noise from the input device and typing speeds obtained are comparable to prior research with a regular eyetracker. Clickactivation has for years (Ware & Mikaelian 1987) proved to improve efficiency of gazebased interaction. This experiment demonstrates that this result significantly applies to off-the-shelf eyetrackers as well. The input device relies on the user to compensate for off-set with head movements. The keyboards should support this task with a static graphic user interface." Download thesis as pdf (in Danish)

Tuesday, May 12, 2009

BBC News: The future of gadget interaction

Dan Simmons at the BBC reports on future technologies from the Science Beyond Fiction 2009 conference in Prague. The news item includes a section on the GazeCom project, which won the 2nd prize for their exhibit "Gaze-contingent displays and interaction". Their website hosts additional demonstrations.

"Gaze tracking is well-established and has been used before now by online advertisers who use it to decide the best place to put an advert. A novel use of the system tracks someone's gaze and brings into focus the area of a video being watched by blurring their peripheral vision.In the future, the whole image could also be panned left or right as the gaze approaches the edge of the screen. Film producers are interested in using the system to direct viewers to particular parts within a movie. However, interacting with software through simply looking will require accurate but unobtrusive eye tracking systems that, so far, remain on the drawing board... The European Commission (EC) is planning to put more cash into such projects. In April it said it would increase its investment in this field from 100m to 170m euros (£89m-£152m) by 2013. " (BBC source ) More information about the EC CORDIS : ICT program.

External link: BBC reporter Dan Simmons tests a system designed to use a driver's peripheral vision to flag up potential dangers on the road, recorded at the Science Beyond Fiction conference in Prague.

The GazeCom project involves a number of European partners; see the project website for the full list.

ETRA 2010 Call for papers

ETRA 2010 will be the sixth biennial symposium in a series that focuses on all aspects of eye movement research across a wide range of disciplines. The goal of ETRA is to bring together computer scientists, engineers and behavioral scientists in support of a common vision of enhancing eye tracking research and applications. ETRA 2010 is being organized in conjunction with the European Communication by Gaze Interaction (COGAIN) research network that specializes in gaze-based interaction for the benefit of people with physical disabilities.

Update: List of accepted and presented papers.

Symposium Themes
  • Advances in Eye Tracking Technology and Data Analysis
    Eye tracking systems, calibration algorithms, data analysis techniques, noise reduction, predictive models, 3D POR measurement, low cost and natural light systems.
  • Visual Attention and Eye Movement Control
    Studies of eye movements in response to natural stimuli, driving studies, web use and usability studies.
  • Eye Tracking Applications
    Gaze-contingent displays, attentive user interfaces, gaze-based interaction techniques, security systems, multimodal interfaces, augmented and mixed reality systems, ubiquitous computing.
  • Special Theme: Eye Tracking and Accessibility
    Eye tracking has proved to be an effective means of making computers more accessible when the use of keyboards and mice is hindered by the task itself (such as driving), or by physical disabilities. We invite submissions that explore new methodological strategies, applications, and results that use eye tracking in assistive technologies for access to desktop applications, for environment and mobility control, and for gaze control of games and entertainment.
Two categories of submissions are being sought – Full Papers and Short Papers.
Full papers must be submitted electronically through the ETRA 2010 website and conform to the ACM SIGGRAPH proceedings category 2 format. Full paper submissions can have a maximum length of eight pages. Full paper submissions should be made in double-blind format, hiding authors' names and affiliations and all references to the authors' previous work. Those wishing to submit a full paper must submit an abstract in advance to facilitate the reviewing process. Accepted papers will be published in the ETRA 2010 proceedings, and the authors will give a 20 minute oral presentation of the paper at the conference.

Short papers may present work that has smaller scope than a full paper or may present late breaking results. These must be submitted electronically through the ETRA 2010 submission website and conform to the ACM SIGGRAPH proceedings category 3 format. Short paper submissions have a maximum length of four pages (but can be as short as a one-page abstract). Given the time constraints of this type of paper, submissions must be made in camera-ready format including authors' names and affiliations. Accepted submissions will be published in the ETRA 2010 proceedings. Authors will present a poster at the conference, and authors of the most highly rated submissions will give a 10 minute presentation of the paper in a Short Papers session. All submissions will be peer-reviewed by members of an international review panel and members of the program committee. Best Paper Awards will be given to the most highly ranked Full Papers and Short Papers.

Full Papers Deadlines
  • Sep. 30th, 2009 Full Papers abstract submission deadline
  • Oct. 7th, 2009 Full Papers submission deadline
  • Nov. 13th, 2009 Acceptance notification
Short Papers Deadlines
  • Dec. 2nd, 2009 Short Papers submission deadline
  • Jan. 8th, 2010 Short Papers acceptance notification
  • Jan. 15th, 2010 All camera ready papers due
More information on the ETRA website.

Thursday, May 7, 2009

Interactive Yarbus at MU, Netherlands

An interactive art exhibition by Christien Meindertsma in the Netherlands opens up for real-time generation of scanpaths to draw images similar to the ones presented in the classic Yarbus paper. The main purpose is to illustrate individual differences in the way we look at objects (such as faces, umbrellas, cups, etc.). These images are then printed directly and become part of the exhibition. The exhibition runs until June 14th (location: Eindhoven).

Scanpath from Yarbus (1967) for comparison.

Wednesday, May 6, 2009

COGAIN 2009 Program announced

This year's Communication By Gaze Interaction conference is held on the 26th of May in Lyngby, Denmark, in connection with the VisionDay (a four-day event on computer vision). Registration for attending should be made on or before May 14th. Download the program as pdf.

Update: the proceedings can be downloaded as pdf.


The program for May 26th
  • 08.00 Registration, exhibition, demonstrations, coffee, and rolls
SESSION I
  • 09.00 Welcome and introduction (Lars Pallesen, Rector @ DTU)
  • 09.10 Eye guidance in natural behaviour (B. W. Tatler)
  • 09.50 Achievements and experiences in the course of COGAIN (K. Raiha)
  • 10.30 Coffee, exhibition, demonstrations
SESSION II
  • 11.00 Joys and sorrows in communicating with gaze (A. Lykke-Larsen)
  • 11.30 An introduction to the 17 papers presented in the afternoon
  • 12.00 Lunch, exhibition, demonstrations, posters
SESSION III Track 1
SESSION III Track 2
14.50 Coffee, exhibition, demonstrations, posters

SESSION IV Track 1
  • 15.30 Gameplay experience in a gaze interaction game (L. Nacke, S. Stellmach, D. Sasse & C. A. Lindley)
  • 15.50 Select commands in 3D game environments by gaze gestures (S. Vickers, H. Istance & A. Hyrskykari)
  • 16.10 GazeTrain: A case study of an action oriented gaze-controlled game (L. F. Laursen & B. Ersbøll)
  • 16.30 Detecting Search and Rescue Targets in Moving Aerial Images using Eye-gaze (J. Mardell, M. Witkowski & R. Spence)
  • 16.50 Feasibility Study for the use of Eye-Movements in Estimation of Answer Correctness (M. Nakayama & Y. Hayashi)
SESSION IV Track 2
  • 15.30 Eye Tracker Connectivity (G. Daunys & V. Vysniauskas)
  • 15.50 SW tool supporting customization of eye tracking algorithms (P. Novák & O. Štepánková)
  • 16.10 Multimodal Gaze-Based Interaction (S. Trösterer & J. Dzaack)
  • 16.30 Gaze Visualization Trends and Techniques (S. Stellmach, L. Nacke, R. Dachselt & C. A. Lindley)
19.00 COGAIN2009 dinner at Brede Spisehus

The Dias Eye Tracker (Mardanbeigi, 2009)

Diako Mardanbeigi at the Iran University of Science & Technology introduces the Dias Eye Tracking suite. It is a low-cost solution employing a head-mounted setup and comes with a rather extensive suite of applications. The software offers gaze control for playing games and music, viewing images, and text-to-speech using a dwell keyboard. It also offers basic eye-movement recording and visualization, such as scanpaths. The software is built in Visual Basic 6 and implements various algorithms for eye tracking, including a rectangular method and RANSAC or least-squares (LSQ) ellipse/circle fitting. Additionally, there is support for tracking one or two glints. The following video demonstrates the hardware and software. Congratulations, Diako, on this great work!
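As an aside, the least-squares (LSQ) circle fitting mentioned above is a standard building block for locating the pupil contour; a generic algebraic (Kasa-style) fit can be sketched in a few lines of NumPy. This is not code from the Dias suite, just an illustration of the technique.

```python
import numpy as np

def fit_circle_lsq(points):
    """Algebraic least-squares circle fit (Kasa method).
    points: (N, 2) array of (x, y) contour points, N >= 3.
    Returns (cx, cy, r). Generic illustration, not the Dias implementation."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    # Model: x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = float(np.sqrt(c + cx ** 2 + cy ** 2))
    return float(cx), float(cy), r
```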


Tuesday, May 5, 2009

Gaze-Augmented Manual Interaction (Bieg, H.J, 2009)

Hans-Joachim Bieg of the HCI Group at the University of Konstanz has investigated gaze-augmented interaction on very large display areas. The prototype runs on the 221" Powerwall, uses a head-mounted setup, and allows users to select and zoom into an item of interest based on gaze position. An earlier video demonstration of the setup can be found here.

"This project will demonstrate a new approach to employing users’ gaze in the context of human-computer interaction. This new approach uses gaze passively in order to improve the speed and precision of manually controlled pointing techniques. Designing such gaze augmented manual techniques requires an understanding of the principles that govern the coordination of hand and eye. This coordination is influenced by situational parameters (task complexity, input device used, etc.), which this project will explore in controlled experiments."

Gaze-augmented interaction on the 221" PowerWall
  • Bieg, H. 2009. Gaze-augmented manual interaction. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 3121-3124. DOI= http://doi.acm.org/10.1145/1520340.1520442

Sunday, May 3, 2009

Laval VRchive @ Tokyo Metropolitan University

Hidenori Watanave at the Tokyo Metropolitan University has released a brief video demonstrating gaze interaction for the Laval VRchive. The VRchive is a virtual reality environment for navigating media content. The setup uses a standalone Tobii 1750 tracker and a projector. The simple interface allows gaze control by looking at the top, bottom, left or right areas of the display, as well as winking to perform clicks. Although this is an early version, the initial experiments were successful; the software is still unstable, however, and needs further improvements.


Friday, May 1, 2009

Low-Cost Gaze Pointing and EMG Clicking

"Some severely disabled people are excluded from using gaze interaction because gaze trackers are usually expensive (above $10.000). In this paper we present a low-cost gaze pointer, which we have tested in combination with a desktop monitor and a wearable display. It is not as accurate as commercial gaze trackers, and walking while pointing with gaze on a wearable display turned out to be particularly difficult. However, in front of a desktop monitor it is precise enough to support communication. Supplemented with a commercial EMG switch it offers a complete hands-free, gaze-and-click control for less than $200."

  • San Agustin, J., Hansen, J. P., Hansen, D. W., and Skovsgaard, H. 2009. Low-cost gaze pointing and EMG clicking. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 3247-3252. DOI= http://doi.acm.org/10.1145/1520340.1520466

Gaze Controlled Driving

This is the paper on using eye trackers for remote robot navigation that I had accepted for the CHI09 conference. It has now appeared on the ACM website. Note that the webcam tracker referred to in the paper is the ITU Gaze Tracker in an earlier incarnation. The main issue while using it is that head movements affect the gaze position and create an offset. This is easier to correct and counterbalance on a static background than on a moving image (while driving!).

Abstract
"We investigate if the gaze (point of regard) can control a remote vehicle driving on a racing track. Five different input devices (on-screen buttons, mouse-pointing low-cost webcam eye tracker and two commercial eye tracking systems) provide heading and speed control on the scene view transmitted from the moving robot. Gaze control was found to be similar to mouse control. This suggests that robots and wheelchairs may be controlled ―hands-free‖ through gaze. Low precision gaze tracking and image transmission delays had noticeable effect on performance."

  • Tall, M., Alapetite, A., San Agustin, J., Skovsgaard, H. H., Hansen, J. P., Hansen, D. W., and Møllenbach, E. 2009. Gaze-controlled driving. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4387-4392. DOI= http://doi.acm.org/10.1145/1520340.1520671
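To make the control mapping more concrete, here is a minimal sketch of one way a gaze point on the transmitted scene view could be turned into heading and speed commands: horizontal offset from the screen centre steers, vertical offset sets speed. This is my own illustration under those assumptions, not the control scheme actually used in the paper.

```python
def gaze_to_drive_command(gaze_x, gaze_y, screen_w, screen_h,
                          max_turn=1.0, max_speed=1.0, dead_zone=0.1):
    """Map a gaze point (pixels) to (turn, speed) commands in [-1, 1].
    Illustrative mapping only, not the scheme from the CHI paper."""
    # Normalise to [-1, 1] with (0, 0) at the screen centre; up is positive.
    nx = (gaze_x - screen_w / 2.0) / (screen_w / 2.0)
    ny = (screen_h / 2.0 - gaze_y) / (screen_h / 2.0)
    # A small central dead zone keeps fixation jitter from moving the robot.
    nx = 0.0 if abs(nx) < dead_zone else max(-1.0, min(1.0, nx))
    ny = 0.0 if abs(ny) < dead_zone else max(-1.0, min(1.0, ny))
    return max_turn * nx, max_speed * ny
```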

Monday, April 27, 2009

ITU Gaze Tracker: Low-cost gaze interaction: ready to deliver the promises (San Agustin, J et al., 2009)

The research paper on the ITU Gaze Tracker that Javier San Agustin presented at CHI09 is now available on the ACM website. It evaluates a previous version of the gaze tracker in two tasks, target acquisition and eye typing, in comparison with a mouse, the SMI iView X RED and the Tobii 1750.

Abstract
"Eye movements are the only means of communication for some severely disabled people. However, the high prices of commercial eye tracking systems limit the access to this technology. In this pilot study we compare the performance of a low-cost, web cam-based gaze tracker that we have developed with two commercial trackers in two different tasks: target acquisition and eye typing. From analyses on throughput, words per minute and error rates we conclude that a low-cost solution can be as efficient as expensive commercial systems."

  • San Agustin, J., Skovsgaard, H., Hansen, J. P., and Hansen, D. W. 2009. Low-cost gaze interaction: ready to deliver the promises. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4453-4458.
    Download at ACM website.

A brief user's guide to the ITU Gaze Tracker

Today we release a short user's guide for the open-source eye tracker we presented some weeks ago. Hopefully it will assist first-time users in configuring the software and understanding the limitations of the initial version. Comments and suggestions are appreciated.


Friday, April 17, 2009

IDG Interview with Javier San Agustin

During CHI09 in Boston last week, Nick Barber from the IDG Network stopped by to record an interview with Javier San Agustin, member of the ITU GazeGroup. The video has now surfaced on several IDG sites around the world; clearly there is an interest in easy-to-use, low-cost eye tracking. After the initial release of the ITU Gaze Tracker we have set up a community forum at forum.gazegroup.org, with the ambition of connecting users of open-source eye tracking. If you would like to be part of the project, please join in promoting and developing an alternative. It's open and accessible to all (platform documentation to be released next week).

Hopefully, ideas and contributions from the community will make the platform take off. The initial release should be considered a beta version, so there are of course additional improvements to make: more cameras need to be verified and bugs in the code need to be handled.

If you experience any issues or have ideas for improvements please post at http://forum.gazegroup.org



  • Computerworld.com.au
  • WebWereld.nl
  • PCAdvisor.co.uk
  • TechWorld.nl
  • IDG.no/ComputerWorld
  • ComputerWorld.dk
  • ComputerWorld.hu
  • ARNnet.com.au

Sunday, April 5, 2009

Introducing the ITU GazeTracker

The ITU Gaze Tracker is an open-source eye gaze tracking application that aims to provide a low-cost alternative to commercial gaze tracking systems and thereby make the technology more accessible. It is being developed by the Gaze Group at the IT University of Copenhagen, supported by the Communication by Gaze Interaction Association (COGAIN). The eye tracking software is video-based, and any camera equipped with infrared night vision can be used, such as a video camera or a webcam. The cameras that have been tested with the system can be found in our forum.

Features:
  • Supports head mounted and remote setups
  • Tracks both pupil and glints
  • Supports a wide variety of camera devices
  • Configurable calibration
  • Eye-mouse capabilities
  • UDPServer broadcasting gaze data
  • Full source code provided


We encourage users and developers to test our software with their cameras and provide feedback so we can continue development. The ITU Gaze Tracker is released under the GPL3 open-source license and the full source code is hosted at SourceForge. It is written in C#, using the Emgu CV wrapper around the OpenCV C++ image processing library (Microsoft .NET 3.5 needed). Once the tracker has been started it can be configured to broadcast gaze data via the UDP protocol, which makes it easy to pick up in your own applications. We provide a sample implementation of a client in C#.
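To give an idea of how simple it is to consume the UDP broadcast from another program, here is a minimal Python listener. The port number and the payload format shown are assumptions for illustration; check the tracker's UDP server settings and the bundled C# sample client for the actual values.

```python
import socket

UDP_PORT = 6666  # assumed port; configure to match the tracker's UDP server settings

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", UDP_PORT))

while True:
    data, addr = sock.recvfrom(1024)
    # Assumed example payload: one gaze sample per datagram, e.g. "x y" in
    # screen coordinates; parse according to the format the tracker sends.
    print("gaze sample from", addr, ":", data.decode("ascii", errors="replace").strip())
```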

Open-source eye tracking has never been easier. Download the binaries, plug in the camera and launch the application. Adjust the sliders to match your camera and start the calibration.

Visit the ITU GazeGroup to download the software package. Please get in touch with us at http://forum.gazegroup.org

Tuesday, March 31, 2009

Radio interview with DR1

Thomas Behrndtz from the Danish Radio (DR1) came by the other day to do an interview on the upcoming ITU gaze interaction platform. It resulted in a five-minute segment on "Videnskaben kort", a radio program on interesting progress in science. Lately we have been working hard on the software package, which is to be released at CHI09 in Boston next week. It includes a number of applications and tools that will be released for free download, including source code under the GPL licence. In short, these are exciting times for low-cost eye tracking and gaze interaction. Stay tuned...

Click on image to hear the radio interview (in Danish/Swedish)

Thursday, March 12, 2009

The Argentinian myEye released

I have been following an interesting project taking place in Argentina during the last half year. Marcelo Laginestra has, through his blog, described the development of a low-cost webcam-based eye tracker. It has now been released for download, free of charge.

The system requirements are modest:
  • CPU: 1.5 GHz or higher
  • RAM: 256 MB DDR RAM or higher (512 MB recommended)
  • Space: at least 100 MB of hard disk space
  • Camera: 640x480 capture resolution or higher (at least 30 fps)
  • O.S.: Microsoft Windows XP SP2
Go to the myEye website to download the software.

I am happy to see that the project came through; kudos for releasing it under Creative Commons.

Keep an eye open for the ITU gaze interaction platform that will be released in conjunction with CHI09 in early April.

Monday, February 16, 2009

ID-U Biometrics: Eye movement based access control

Daphna Palti-Wasserman and Yoram Wasserman at ID-U Biometrics have developed a system that provides secure signatures for access control based on individual eye-movement patterns. The subject's response to a dynamic stimulus provides a unique characteristic: since the stimulus changes, the subject's responses will be different each time, but the pattern of eye movements and the user's eye characteristics remain the same. This results in a "code" that is not typed in and not consciously controlled by the user, which reduces the risk of spoofing. The system is currently at the proof-of-concept stage; the 100% accurate and stable eye tracking that would be required for identification has yet to be achieved (by any eye-tracking platform, that is). However, this method of user identification could be applied in other situations than the ATM (I guess that's why they won the GLOBES start-up competition).


Tuesday, February 10, 2009

COGAIN 2009 (26th May) "Gaze interaction for those who want it most".

"The 5th international COGAIN conference on eye gaze interaction emphasises user needs and future applications of eye tracking technology. Robust gaze interaction methods have been available for some years, with substantial amounts of applications to support communication, learning and entertainment already being used. However, there are still some uncertainties about this new technology among communication specialists and funding institutions. The 5th COGAIN conference will focus on spreading the experiences of people using gaze interaction in their daily life to potential users and specialists who have yet to benefit from it. Case studies from researchers and manufacturers working on new ways of making gaze interaction available for all, as well as integrating eye gaze with other forms of communication technology are also particularly welcome. We also encourage papers and posters which reach beyond the special case of eye control for people with disabilities into mainstream human-computer interaction development, for instance using eye tracking technology to enhance gaming experience and strategic play."

Themes:

  • Gaze-based access to computer applications
  • Gaze and environmental control
  • Gaze and personal mobility control
  • User experience studies
  • Innovations in eyetracking systems
  • Low cost gaze tracking systems
  • Attentive interfaces and inferring user intent from gaze
  • Gaze-based interaction with virtual worlds
  • Gaze and creativity
  • Gaming using gaze as an input modality
  • Gaze interaction with wearable displays
  • Using gaze with other modalities including BCI

"Papers which deal with the use of eye gaze to study the usability of mainstream applications and websites are not normally considered for inclusion in the conference". For more information see the COGAIN 2009 Call for Papers

Important dates:

Paper submission: 28th February. Notification of acceptance: 15th April. The conference will be held on the 26th of May at the Technical University of Denmark (DTU), in connection with the VisionDay event.

Friday, January 30, 2009

SWAET 2009 Announced

The Scandinavian Workshop of Applied Eye-Tracking aims to be a meeting place for graduate students, researchers and others using eye tracking as a measuring tool. It will be held at the University of Stavanger (May 6-7th). Keynote speakers at SWAET 2009 are Dr Benjamin Tatler (University of Dundee) and Prof Jukka Hyönä (University of Turku).

Suggested topics for workshop presentations:
  • Reading in various contexts
  • Psycholinguistics
  • Integration of pictures and language
  • Face-to-face interaction and other social contexts
  • Attention (such as top-down/bottom-up factors)
  • Controlling interfaces with eye-tracking
  • Viewer behaviour towards images and video
  • Vehicle and traffic research
  • Human factors; such as air traffic control, ship navigation and pilots
  • Evaluation of user interfaces
  • Cognitive processes such as navigation, planning, problem solving, mental imagery, memory etc.
If you wish to present your research, you have to submit an abstract no later than March 15th 2009. Decisions on acceptance are given on April 1st.

Registration for the conference is €50 for all delegates except graduate and undergraduate students, who participate free of charge. After April 10th, expect to pay €80 (students €30).

Wednesday, January 21, 2009

Wearable EOG Goggles: Eye-Based Interaction in Everyday Environments

Andreas Bulling in the Wearable Computing Group at the Swiss Federal Institute of Technology (ETH) is working on a new electrooculography (EOG) based eye tracking system. This technique relies on the small but measurable electrical potential of the eye: a set of electrodes is attached to the skin around the eyes, and after signal processing this data can be used for controlling computer interfaces or other devices. The obvious advantage of this method of eye tracking compared to the more traditional corneal-reflection video-based methods is that it is not sensitive to sunlight and may therefore be used outdoors. However, to my knowledge, it provides lower accuracy, which is why most EOG interfaces rely on eye gestures rather than gaze fixations.

"We want to introduce the paradigm of visual perception and investigations on eye movements as new methods to implement novel and complement current context-aware systems. Therefore, we will investigate the potential but also possible limitations of using eye movements to perform context and activity recognition in wearable settings. Besides recognizing individual activities another focus will be put on long-term eye movement analysis." More information.

Recently Andreas got a paper accepted for the CHI 2009 conference in Boston (April 4-9th) where the system will be demonstrated during the interactivity session. Andreas and the team at ETH are planning to investigate attentive user interfaces (AUI) in mobile settings using wearable systems, such as the prototype demonstrated in the video below.

View on YouTube

SMI gets the International Forum Design Award

Congratulations to the guys at SensoMotoric Instruments (SMI) for winning the International Forum 2009 Product Design Award with their iView X™ RED eye tracker.

"The unobtrusive yet elegant design for the stand-alone as well as for the monitor-attached configuration of the eye tracking system convinced the jury. "

The award will be presented on the first day of CeBIT (3rd of March) in Hanover. The system will also be on display for those of you who are attending CeBIT. More information on the International Forum Award.

Saturday, January 3, 2009

The Argentinian Eye Mouse software released (Amaro & Ponieman)

Nicolás Amaro and Nicolás Ponieman at ORT Argentina recently received the Argentine-German Chamber of Industry and Trade Award for Innovation 2008 for their work on a low-cost (webcam) head-mounted, corneal-reflection-based solution. Best of all, the software can be downloaded, which will directly benefit those who are in need but cannot afford the state-of-the-art systems currently on the market. As demonstrated by the video below, it is capable of running grid-based interfaces, so it should be adequate for GazeTalk and similar applications.

View on YouTube

Friday, January 2, 2009

An Unobtrusive Method for Gaze Tracking (N. Chitrik & Y. Schwartzburg)

Nava Chitrik and Yuliy Schwartzburg have, in partial fulfillment of their Senior Design Project requirements, constructed a low-cost approach to remote eye tracking at the Cooper Union for the Advancement of Science and Art, Electrical Engineering Department.

"The line of a person's gaze is known to have many important applications in artificial intelligence (AI) and video conferencing but determining where a user is looking is still a very challenging problem. Traditionally, gaze trackers have been implemented with devices worn around the user's head, but more recent advances in the field use unobtrusive methods, i.e. an external video camera, to obtain information about where a person is looking. We have developed a simplified gaze tracking system using a single camera and a single point source mounted compactly in the view of the user, a large simplification over previous methods which have used a plurality of each. Furthermore, our algorithms are robust enough to allow head motion and our image processing functions are designed to extract data even from low-resolution or noisy video streams. Our system also has the computational advantage of working with very small image sizes, reducing the amount of resources needed for gaze tracking, freeing them up for applications that might utilize this information.

To reiterate: The main differences between this implementation and similar implementations are that this system uses a histogram method as opposed to edge detection to work with very low resolution video extremely quickly. However, it requires an infrared camera and infrared LED's. (Which can be purchased for less than 25 dollars online.)"

View on YouTube
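The write-up does not spell out the histogram method, but a common low-cost variant is to threshold the (infrared-illuminated) grayscale image at a low percentile of its intensity histogram and take the centroid of the dark pixels as the pupil estimate. The sketch below shows that generic idea only; it is not the authors' code.

```python
import numpy as np

def estimate_pupil_center(gray, dark_percentile=5.0):
    """Rough pupil estimate from a grayscale IR image via its intensity histogram.
    Threshold at a low percentile (under IR illumination the pupil is usually
    the darkest blob) and return the centroid of the dark pixels, or None.
    Generic illustration only, not the Cooper Union implementation."""
    img = np.asarray(gray, dtype=np.float32)
    threshold = np.percentile(img, dark_percentile)
    ys, xs = np.nonzero(img <= threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```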