Tuesday, August 11, 2009

ALS Society of British Columbia announces Engineering Design Awards (Canadian students only)

"The ALS Society of British Columbia has established three Awards to encourage and recognize innovation in technology to substantially improve the quality of life of people living with ALS (Amyotrophic Lateral Sclerosis, also known as Lou Gehrig’s Disease). Students at the undergraduate or graduate level in engineering or a related discipline at a post-secondary institution in British Columbia or elsewhere in Canada are eligible for the Awards. Students may be considered individually or as a team. Mentor Awards may also be given to faculty supervising students who win awards" (see Announcement)


Project ideas:
  • Low-cost eye tracker
    • Issue: Current commercial eye-gaze tracking systems cost thousands to tens of thousands of dollars, which prevents many potential users from accessing eye-gaze tracking tools. The hardware components required for eye-gaze tracking do not justify the price, so a lower-cost alternative is desirable. Webcams may be used for low-cost imaging, along with simple infrared diodes for system lighting; alternatively, visible-light systems may be investigated. Open-source eye-gaze tracking software is also available (ed: ITU GazeTracker, OpenEyes, Track Eye, OpenGazer, and MyEye (free, no source)).
    • Goal: The goal of this design project is to develop a low-cost and usable eye-gaze tracking system based on simple commercial off-the-shelf hardware (a minimal webcam pupil-detection sketch follows this list).
    • Deliverables: A working prototype of a functional, low-cost (< $200), eye-gaze tracking system.
  • Eye-glasses compensation
    • Issue: The use of eye-glasses can cause considerable problems in eye-gaze tracking. The issue stems from reflections off the eye-glasses caused by the controlled infrared lighting (on- and off-axis light sources) used to highlight features of the face. The key features of interest are the pupils and the glints (reflections off the surface of the cornea); incorrectly identifying the pupils and glints results in invalid estimation of the point-of-gaze.
    • Goal: The goal of this design project is to develop techniques for either: 1) avoiding image corruption with eye-glasses on a commercial eye-gaze tracker, or 2) developing a controlled lighting scheme that ensures valid identification of the pupils and glints in the presence of eye-glasses (a bright-/dark-pupil differencing sketch follows this list).
    • Deliverables: Two forms of deliverables are possible: 1) a working prototype illustrating functional eye-gaze tracking in the presence of eye-glasses with a commercial eye-gaze tracker, or 2) a working prototype illustrating accurate real-time identification of the pupils and glints using controlled infrared lighting (on- and off-axis light sources) in the presence of eye-glasses.
  • Innovative selection with ALS and eye gaze
    • Issue: As mobility steadily decreases in the more advanced stages of ALS, alternative techniques for selection are required. Current solutions include head switches, sip-and-puff switches, and dwell-time activation, among others, depending on the degree of mobility loss. Dwell-time activation requires no mobility other than eye motion; however, it suffers from ‘lag’, since the user must wait out the dwell duration for each selection, and from the ‘Midas touch’ problem, in which unintended selections occur if the gaze point rests in one place for too long.
    • Goal: The goal of this design project is to develop a technique for improved selection with eye-gaze for individuals with only eye motion available. Possible solutions may involve novel HCI designs for interaction, including various adaptive and predictive technologies, the consideration of contextual cues, and the introduction of ancillary inputs such as EMG or EEG (a minimal dwell-selection loop is sketched after this list).
    • Deliverables: A working prototype illustrating eye-motion only selection with a commercial eye-gaze tracking system.
  • Novel and valuable eye-gaze tracking applications and application enhancements
    • Issue: To date, relatively few gaze-tracking applications have been developed, and those that exist tend to be simplistic, such as the tedious typing of words; even in such systems, little is done to ease the effort required, e.g., they typically do not allow for the saving and reuse of words and sentences.
    • Goal: The goal of this design project is to develop one or more novel applications or application enhancements that take gaze as input, and that provide new efficiencies or capabilities that could significantly improve the quality of life of those living with ALS.
    • Deliverables: A working prototype illustrating one or more novel applications that take eye motion as an input. The prototype must be developed and implemented to the extent that persons living with ALS and others on an evaluation panel can assess the potential efficiencies and/or reductions in effort.
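
For the low-cost tracker idea above, the following is a minimal pupil-detection sketch in Python with OpenCV, not a full gaze estimator (no calibration or mapping to screen coordinates). It assumes a webcam at index 0 with enough IR sensitivity that the pupil shows up as the darkest compact region; the threshold of 40 is illustrative and setup-dependent.

    import cv2

    cap = cv2.VideoCapture(0)  # webcam; IR sensitivity assumed
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (7, 7), 0)
        # Under off-axis lighting the pupil is the darkest compact region.
        _, dark = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            pupil = max(contours, key=cv2.contourArea)  # largest dark blob
            (x, y), r = cv2.minEnclosingCircle(pupil)
            cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
        cv2.imshow("pupil", frame)
        if cv2.waitKey(1) == 27:  # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()

The open-source packages listed above implement the remaining steps: glint detection, calibration, and the mapping from pupil-glint vectors to on-screen gaze coordinates.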
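For the eye-glasses project, a sketch of the classic bright-/dark-pupil differencing idea that on- and off-axis lighting makes possible: with on-axis IR the pupil retro-reflects and appears bright, with off-axis IR it appears dark, so subtracting alternately lit frames isolates the pupil while most of the scene cancels. This is a hedged illustration, not a solution; LED/frame synchronization hardware is assumed, and spectacle reflections that differ between the two lighting conditions are precisely what breaks the cancellation in practice.

    import cv2
    import numpy as np

    def pupil_from_pair(bright: np.ndarray, dark: np.ndarray):
        """Locate the pupil from one bright-pupil/dark-pupil frame pair (grayscale)."""
        diff = cv2.subtract(bright, dark)   # pupil survives; scene mostly cancels
        diff = cv2.GaussianBlur(diff, (5, 5), 0)
        _, mask = cv2.threshold(diff, 50, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None                     # blink, or reflections corrupted the pair
        pupil = max(contours, key=cv2.contourArea)
        (x, y), _r = cv2.minEnclosingCircle(pupil)
        return x, y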
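For the selection project, a minimal dwell-selection loop that makes both problems concrete: the ‘lag’ is DWELL_TIME itself, and the only Midas-touch mitigation is restarting the clock whenever gaze leaves a target. get_gaze and hit_test are hypothetical stand-ins for a tracker API and the UI's hit-testing.

    import time

    DWELL_TIME = 0.8  # seconds of stable gaze needed to select (the "lag")

    def dwell_select(get_gaze, hit_test):
        """Block until the user has dwelt on some target; return that target."""
        current, entered = None, 0.0
        while True:
            target = hit_test(get_gaze())  # which target is gazed at, if any
            now = time.monotonic()
            if target is not current:      # gaze moved: restart the clock
                current, entered = target, now
            elif target is not None and now - entered >= DWELL_TIME:
                return target              # dwell completed: select
            time.sleep(0.01)               # ~100 Hz polling

Shortening DWELL_TIME reduces lag but worsens the Midas touch; the project ideas above (ancillary EMG/EEG inputs, predictive interfaces) aim to escape that trade-off entirely.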

    See the Project Ideas for more information. For contact information see page two of the announcement.

Lund Eye-Tracking Academy (LETA)

Kenneth Holmqvist and his team at the Humanities Lab at Lund University, Sweden will host another three-day LETA training course in eye tracking and analysis of eye-movement data. This is an excellent opportunity to get hands-on experience using state-of-the-art equipment and setting up experiments. The course runs 23-25 September and registration is open.

Course contents
• Pros and cons of head-mounted, remote, and contact eye-trackers.
• High sampling speed and detailed precision – who needs it?
• Gaze-overlaid videos vs. data files – what can you do with them?
• How to set up and calibrate a variety of subjects on different eye-trackers?
• Glasses, lenses, mascara, and drooping eye-lids – what to do?
• How to work with stimulus programs and synchronize them with eye-tracking recordings?
• How to deal with consent forms and ethical issues?
• Short introduction to experimental design: Potentials and pitfalls.
• Visualisation of data vs number crunching.
• Fast data analysis of multi-user experiments.
• Fixation durations, saccadic amplitudes, transition diagrams, group similarity measures, and all the other measures – what do they tell us? What are the pitfalls?

Teaching methods
Lectures on selected topics (8h)
Hands-on work in our lab on pre-specified experiments: receiving and recording a subject (9h). Hands-on initial data analysis (3h).

Eye-tracking systems available for this training
2*SMI HED 50 Hz with Polhemus Head-tracking
3*SMI HiSpeed 240/1250 Hz
1*SMI RED-X remote 50 Hz
2*SMI HED-mobile 50/200 Hz

Thursday, August 6, 2009

Päivi Majaranta PhD Thesis on Text Entry by Eye Gaze

The most complete publication on gaze typing is now available, as Päivi Majaranta at the University of Tampere has successfully defended her PhD thesis. It summarizes previous work and discusses and exemplifies important topics such as word prediction, layout, feedback, and user aspects. The material is presented in a straightforward manner with a clear structure and excellent illustrations. It will without doubt be useful for anyone who is about to design and develop a gaze-based text entry interface. Congratulations, Päivi, on such a well-written thesis.



Friday, July 31, 2009

SMI RED 250!


Today SMI announced the new RED250 which, as the name suggests, has an impressive 250 Hz sampling rate. It has an accuracy of 0.5 degrees or below (typical), less than 10 ms latency, and operates at a 60-80 cm head distance. The track-box is 40 x 40 cm at 70 cm distance, and the system recovers tracking faster than the previous model. No details on pricing yet, but top-of-the-line performance never comes cheap. Get the flyer as a PDF.

Survey on gaze visualization in 3D virtual environments

Got an email today from Sophie Stellmach, a PhD student in the User Interface & Software Engineering group at the Otto-von-Guericke University in Germany. She has posted an online survey and would like some input from eye-tracking specialists on 3D gaze visualization.

"In the course of my work I have developed several gaze visualizations for facilitating eye tracking studies in (static) three-dimensional virtual environments. In order to evaluate the potential utility of these techniques, I am conducting an online survey with eye tracking researchers and professionals. I would like to invite you to this survey as I think that your answers are highly valuable for this investigation. The survey should take less than 10 minutes of your time! Your answers will be stored anonymously. You can access the survey under the following link: http://gamescience.bth.se/survey/index.php?sid=27319=en "

Wednesday, July 22, 2009

TeleGaze update

Remember the TeleGaze robot developed by Hemin Omer, which I wrote about last September? Today there is a new video available showing an updated interface which appears to be somewhat improved; no further information is available.
Update: The new version includes an automatic "person-following" mode which can be turned on or off through the interface. See the video below.

Gaze Interaction in Immersive Virtual Reality - 3D Eye Tracking in Virtual Worlds

Thies Pfeiffer (blog), working in the A.I. group at the Faculty of Technology, Bielefeld University in Germany, has presented some interesting research on 3D gaze interaction in virtual environments. As the video demonstrates, they have achieved high accuracy for gaze-based pointing and selection. This opens up a wide range of interesting man-machine interactions where digital avatars may mimic natural human behavior. Impressive.
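
To make concrete what gaze-based pointing in a 3D scene involves, here is a toy ray-casting sketch (not Pfeiffer's actual method): assuming the tracker already yields a gaze ray in world coordinates, selection reduces to intersecting that ray with scene geometry, here simplified to spheres. All names are illustrative.

    import numpy as np

    def pick(origin, direction, spheres):
        """Return the nearest (center, radius) sphere hit by the gaze ray."""
        d = direction / np.linalg.norm(direction)
        best, best_t = None, np.inf
        for center, radius in spheres:
            oc = origin - center
            b = np.dot(oc, d)
            disc = b * b - (np.dot(oc, oc) - radius * radius)
            if disc >= 0:
                t = -b - np.sqrt(disc)     # distance to nearest intersection
                if 0 < t < best_t:
                    best, best_t = (center, radius), t
        return best

The hard part addressed in the publications below lies upstream of this: recovering a reliable 3D gaze ray, or fixation depth with binocular tracking, in the first place.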



Publications
  • Pfeiffer, T. (2008). Towards Gaze Interaction in Immersive Virtual Reality: Evaluation of a Monocular Eye Tracking Set-Up. In Virtuelle und Erweiterte Realität - Fünfter Workshop der GI-Fachgruppe VR/AR, 81-92. Aachen: Shaker Verlag GmbH. [Abstract] [BibTeX] [PDF]
  • Pfeiffer, T., Latoschik, M.E. & Wachsmuth, I. (2008). Evaluation of Binocular Eye Trackers and Algorithms for 3D Gaze Interaction in Virtual Reality Environments. Journal of Virtual Reality and Broadcasting, 5 (16), December. [Abstract] [BibTeX] [URL] [PDF]
  • Pfeiffer, T., Donner, M., Latoschik, M.E. & Wachsmuth, I. (2007). 3D fixations in real and virtual scenarios. Journal of Eye Movement Research, Special issue: Abstracts of the ECEM 2007, 13.
  • Pfeiffer, T., Donner, M., Latoschik, M.E. & Wachsmuth, I. (2007). Blickfixationstiefe in stereoskopischen VR-Umgebungen: Eine vergleichende Studie. In Vierter Workshop Virtuelle und Erweiterte Realität der GI-Fachgruppe VR/AR, 113-124. Aachen: Shaker. [Abstract] [BibTeX] [PDF]
List of all publications available here.

Wednesday, July 15, 2009

Gaze & Voice recognition game development blog

Jonathan O'Donovan, a master's student in Interactive Entertainment Technology at Trinity College Dublin, has recently started a blog for his thesis project, which combines gaze and voice recognition in a new video game. So far the few posts available have mainly concerned the underlying framework, but a proof of concept combining gaze and voice is demonstrated. The project is developed on a Microsoft Windows platform and uses the XNA game development framework for graphics and the Microsoft Speech SDK for voice input. The eye tracker of choice is a Tobii T60 provided by Acuity ETS (Reading, UK). The thesis is supervised by Veronica Sundstedt at the Trinity College computer science department.
Keep us posted, Jonathan; excited to see what you'll come up with!
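
To give a feel for the interaction pattern (sketched here in Python rather than the C#/XNA stack the project actually uses): gaze continuously nominates an object, and a recognized voice command acts on whatever is nominated at that moment. get_gaze, hit_test, recognize, and act are hypothetical stand-ins for the tracker, engine, and speech APIs.

    import time

    def game_loop(get_gaze, hit_test, recognize, act):
        """Gaze nominates an object; a voice command acts on it."""
        looked_at = None
        while True:
            looked_at = hit_test(get_gaze()) or looked_at  # gaze nominates
            command = recognize()          # non-blocking poll; None if silent
            if command and looked_at is not None:
                act(command, looked_at)    # voice triggers the action
            time.sleep(1 / 60)             # one 60 Hz "frame"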





Update: 
The project resulted in the Rabbit Run game, which is documented in the following publication:

  • J. O’Donovan, J. Ward, S. Hodgins, V. Sundstedt (2009) Rabbit Run: Gaze and Voice Based Game Interaction (PDF). 

Monday, July 13, 2009

Oculis Labs' Chameleon prevents over-the-shoulder reading

"Two years ago computer security expert Bill Anderson read about scientific research on how the human eye moves as it reads and processes text and images. 'This obscure characteristic... suddenly struck me as (a solution to) a security problem,' says Anderson. With the help of a couple of software developers, Anderson developed a software program called Chameleon that tracks a viewer's gaze patterns and only allows an authorized user to read text on the screen, while everyone else sees gibberish. Chameleon uses gaze-tracking software and camera equipment to track an authorized reader's eyes to show only that one person the correct text. After a 15-second calibration period in which the software learns the viewer's gaze patterns, anyone looking over that user's shoulder sees dummy text that randomly and constantly changes. To tap the broader consumer market, Anderson built a more consumer-friendly version called PrivateEye, which can work with a simple Webcam to blur a user's monitor when he or she turns away. It also detects other faces in the background, and a small video screen pops up to alert the user that someone is looking at the screen. 'There have been inventions in the space of gaze-tracking. There have been inventions in the space of security,' says Anderson. 'But nobody has put the two ideas together, as far as we know.'" (source)

Patent application
Article by Baltimore Sun

Monday, June 29, 2009

Video from COGAIN2009

John Paulin Hansen has posted a video showing some highlights from the annual COGAIN conference. It demonstrates three available gaze interaction solutions: the COGAIN GazeTalk interface, Tobii Technology's MyTobii, and Alea Technologies' IG-30. These interfaces rely on dwell-activated on-screen keyboards (i.e., same procedure as last year).