Thursday, October 8, 2009

DoCoMo EOG update

While eye movement detection using EOG is nothing new, the latest demonstration by Japan's NTT DoCoMo illustrates recent developments in the field. The innovation here is the form factor, which is quite impressive. Typically, EOG is detected using electrodes placed around the eyes, as in Andreas Bulling's prototype demonstrated at CHI '09 in Boston. Now it can be done using tiny sensors inside the ear. Just compare it to the prototype demonstrated last year!
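The signal itself is straightforward to work with: the standing corneo-retinal potential shifts the voltage measured between electrodes as the eyes rotate, so a saccade shows up as a fast step in the recording. As a rough illustration, here is a minimal Python sketch that flags horizontal saccades by thresholding the signal's derivative; the sampling rate, threshold and synthetic data are all assumptions, and DoCoMo's actual processing is of course not public.

```python
# Hedged sketch: detecting horizontal saccades in a single EOG channel.
# Assumes a pre-amplified, digitized signal `eog` sampled at `fs` Hz;
# the threshold value is illustrative, not taken from any real system.
import numpy as np

def detect_saccades(eog, fs, threshold_uv_per_s=2000.0):
    """Return sample indices where the EOG derivative exceeds a threshold.

    A saccade appears as a fast step in the corneo-retinal potential,
    so a simple first-derivative threshold separates it from slow drift.
    """
    velocity = np.gradient(eog) * fs              # microvolts per second
    active = np.abs(velocity) > threshold_uv_per_s
    onsets = np.flatnonzero(np.diff(active.astype(int)) == 1)
    directions = np.sign(velocity[onsets + 1])    # +1 right, -1 left (by convention)
    return onsets, directions

# Example with a synthetic 1-second recording containing one rightward saccade
fs = 250
eog = np.zeros(fs)
eog[100:] += 150.0                                # 150 uV step at t = 0.4 s
onsets, directions = detect_saccades(eog, fs)
print(onsets, directions)
```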

Thanks Roman for the links!

Monday, September 28, 2009

Wearable Augmented Reality System using Gaze Interaction (Park, Lee & Choi)

Came across this paper on a wearable system that employs a small eye tracker and a head-mounted display for augmented reality. I've previously posted a video of the same system. It's a future technology with great potential; only imagination sets the limit here. There is a lot of progress in image/object recognition and location awareness taking place right now (with all the associated non-trivial problems to solve!).


Abstract
"Undisturbed interaction is essential to provide immersive AR environments. There have been a lot of approaches to interact with VEs (virtual environments) so far, especially in hand metaphor. When the user's hands are being used for hand-based work such as maintenance and repair, necessity of alternative interaction technique has arisen. In recent research, hands-free gaze information is adopted to AR to perform original actions in concurrence with interaction [3, 4]. There has been little progress on that research, still at a pilot study in a laboratory setting. In this paper, we introduce such a simple WARS (wearable augmented reality system) equipped with an HMD, scene camera, eye tracker. We propose 'Aging' technique improving traditional dwell-time selection, demonstrate AR gallery – dynamic exhibition space with wearable system."
  • Park, H. M., Lee, S. H., and Choi, J. S. 2008. Wearable augmented reality system using gaze interaction. In Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR '08), September 15-18, 2008. IEEE Computer Society, Washington, DC, 175-176. DOI= http://dx.doi.org/10.1109/ISMAR.2008.4637353

Friday, September 18, 2009

The EyeWriter project

For some time I've been following the EyeWriter project, which aims at enabling Tony, who has ALS, to draw graffiti using eye gaze alone. The open source eye tracker is available on Google Code and is based on C++, OpenFrameworks and OpenCV. The current version supports basic pupil tracking based on image thresholding and blob detection, but they are aiming for remote tracking using IR glints. Keep up the great work, guys!
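For the curious, the thresholding-and-blob-detection step is simple enough to sketch. Below is a rough Python/OpenCV equivalent of the idea (the project itself is written in C++ on OpenFrameworks, so this is not their code); the threshold, kernel size and webcam index are assumptions that would need tuning for a real setup.

```python
# Minimal sketch of pupil tracking via thresholding and blob detection.
# All constants are illustrative; real setups need per-camera tuning.
import cv2
import numpy as np

def find_pupil(gray):
    """Estimate the pupil centre in a grayscale eye image."""
    # The pupil is the darkest large region: dark pixels become white blobs
    _, binary = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    # Clean up specular noise before blob extraction
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)   # largest dark blob = pupil candidate
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # centroid (x, y)

cap = cv2.VideoCapture(0)                       # any webcam aimed at the eye
ok, frame = cap.read()
if ok:
    print(find_pupil(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)))
cap.release()
```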

The Eyewriter from Evan Roth on Vimeo.

eyewriter tracking software walkthrough from thesystemis on Vimeo.

More information can be found at http://fffff.at/eyewriter/

Monday, September 14, 2009

GaZIR: Gaze-based Zooming Interface for Image Retrieval (Kozma L., Klami A., Kaski S., 2009)

From the Helsinki Institute for Information Technology, Finland, comes a research prototype called GaZIR for gaze-based image retrieval, built by Laszlo Kozma, Arto Klami and Samuel Kaski. The GaZIR prototype uses a light-weight logistic regression model to predict relevance from eye movement data (such as viewing time, revisit counts and fixation length), all occurring online in real time. The system is built around the PicSOM (paper) retrieval engine, which is based on tree-structured self-organizing maps (TS-SOMs). When provided with a set of reference images, the PicSOM engine goes online to download a set of similar images (based on color, texture or shape).
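To make the prediction step concrete, here is a hedged Python sketch of relevance estimation with logistic regression over per-image gaze features. The feature set and the tiny training set are invented placeholders; GaZIR learns its estimator in a separate training phase against known relevance labels.

```python
# Hedged sketch of the relevance-prediction step: logistic regression
# over per-image gaze features. Feature names and data are placeholders,
# not taken from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows: one viewed image each. Columns (assumed): total viewing time (s),
# number of revisits, mean fixation duration (s).
X_train = np.array([[0.2, 0, 0.15],
                    [1.4, 2, 0.30],
                    [0.1, 0, 0.10],
                    [2.1, 3, 0.35]])
y_train = np.array([0, 1, 0, 1])      # 0 = irrelevant, 1 = relevant

model = LogisticRegression()
model.fit(X_train, y_train)

# At browsing time, every image on a ring gets a relevance probability,
# which the retrieval engine can use as implicit feedback.
gaze_features = np.array([[1.0, 1, 0.25]])
print(model.predict_proba(gaze_features)[0, 1])   # P(relevant)
```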

Abstract
"We introduce GaZIR, a gaze-based interface for browsing and searching for images. The system computes on-line predictions of relevance of images based on implicit feedback, and when the user zooms in, the images predicted to be the most relevant are brought out. The key novelty is that the relevance feedback is inferred from implicit cues obtained in real-time from the gaze pattern, using an estimator learned during a separate training phase. The natural zooming interface can be connected to any content-based information retrieval engine operating on user feedback. We show with experiments on one engine that there is sufficient amount of information in the gaze patterns to make the estimated relevance feedback a viable choice to complement or even replace explicit feedback by pointing-and-clicking."


Fig1. "Screenshot of the GaZIR interface. Relevance feedback gathered from outer rings influences the images retrieved for the inner rings, and the user can zoom in to reveal more rings."

Fig2. "Precision-recall and ROC curves for user-independent relevance prediction model. The predictions (solid line) are clearly above the baseline of random ranking (dash-dotted line), showing that relevance of images can be predicted from eye movements. The retrieval accuracy is also above the baseline provided by a naive model making a binary relevance judgement based on whether the image was viewed or not (dashed line), demonstrating the gain from more advanced gaze modeling."

Fig 3. "Retrieval performance in real user experiments. The bars indicate the proportion of relevant images shown during the search in six different search tasks for three different feedback methods. Explicit denotes the standard point-and-click feedback, predicted means implicit feedback inferred from gaze, and random is the baseline of providing random feedback. In all cases both actual feedback types outperform the baseline, but the relative performance of explicit and implicit feedback depends on the search task."
  • László Kozma, Arto Klami, and Samuel Kaski: GaZIR: Gaze-based Zooming Interface for Image Retrieval. To appear in Proceedings of the 11th Conference on Multimodal Interfaces and the Sixth Workshop on Machine Learning for Multimodal Interaction (ICMI-MLMI), Boston, MA, USA, November 2-6, 2009. (abstract, pdf)

Friday, September 11, 2009

An Adaptive Algorithm for Fixation, Saccade, and Glissade Detection in Eye-Tracking Data (Nyström M. & Holmqvist K, 2009)

From Marcus Nyström and Kenneth Holmqvist at the Lund University Humanities Lab (HumLab) in Sweden comes an interesting paper on a novel algorithm that is capable of detecting glissades (aka dynamic overshoot) in eye-tracking data. These are wobbling eye movements often found at the end of saccades, and they have previously been considered errors in saccadic programming with limited value. Whatever their function is, the phenomenon does exist and should be accounted for. The paper reports glissades following half of all saccades during reading and scene viewing, with an average duration of 24 ms. This work is important as it extends the default categorization of eye movements, e.g. fixations, saccades, smooth pursuit and blinks.

The algorithm is based on velocity-based saccade detection, is driven by the data itself, and contains only a limited number of subjective settings. It introduces a number of improvements, such as thresholds for peak and saccade onset/offset detection, adaptive threshold adjustment based on local noise levels, physical constraints on eye movements to exclude noise and jitter, and new recommendations for minimum allowed fixation and saccade durations. Note that the data was obtained using a high-speed 1250 Hz SMI system; how the algorithm performs on a typical remote tracker running at 50-250 Hz remains to be seen.
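The core of the data-driven part is an iteratively adapted peak-velocity threshold. Here is a simplified Python sketch of that idea, using a mean-plus-six-standard-deviations update over the sub-threshold samples; the constants and synthetic data are illustrative, and the full algorithm adds onset/offset detection and glissade classification on top of this step.

```python
# Simplified sketch of an adaptive peak-velocity threshold: start from a
# generous value and iterate toward mean + 6*SD of the samples below it
# until it converges. Constants and data are illustrative only.
import numpy as np

def adaptive_threshold(velocity, initial=100.0, tol=1.0):
    """Iteratively estimate a saccade peak-velocity threshold (deg/s)."""
    threshold = initial
    while True:
        noise = velocity[velocity < threshold]       # presumed fixation samples
        new_threshold = noise.mean() + 6.0 * noise.std()
        if abs(new_threshold - threshold) < tol:
            return new_threshold
        threshold = new_threshold

# Synthetic velocity trace: fixation noise plus one 300 deg/s saccade peak
rng = np.random.default_rng(0)
velocity = np.abs(rng.normal(5.0, 3.0, 1250))
velocity[600:610] = 300.0
t = adaptive_threshold(velocity)
print(t, np.flatnonzero(velocity > t))               # threshold and saccade samples
```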

Wednesday, September 9, 2009

PsychNology Journal: Gaze control for work and play

"PsychNology Journal (ISSN 1720-7525) is a quadrimestral, international, peer-reviewed journal on the relationship between humans and technology. The name 'PsychNology' emphasizes its multidisciplinary interest in all issues related to the human adoption and development of technologies. Its broad scope allows to host in a sole venue advances and ideas that would otherwise remain confined within separate communities or disciplines. PNJ is an independent, electronic publication that leaves the copyright to authors, and provides wide accessibility to their papers through the Internet and several indexing and abstracting services including PsycInfo and EBSCO."

The PsychNology Journal special edition on Gaze Control for Work and Play is now available online. It contains some of the highlights from the COGAIN conference last year in an extended journal format. For the COGAIN people this is old news; for the rest it's hopefully interesting stuff. The NeoVisus prototype I presented in Prague should have appeared as well, but unfortunately I did not have the time to make the necessary changes. More information on the scrollable keyboard, and on text entry by gaze in general, is available in Päivi's excellent Ph.D. thesis. Also, rumor has it that Javier San Agustin's Ph.D. thesis on gaze interaction and a low-cost alternative is getting closer to D-day. We're all looking forward to it, hang in there mate =)

Thursday, August 20, 2009

A geometric approach to remote eye tracking (Villanueva et al, 2009)

Came across this paper today. It's good news and a great achievement, especially since consumer products for recording high-definition video over a plain USB port have begun to appear. For example, the upcoming Microsoft Lifecam Cinema HD provides 1,280 x 720 at 30 frames per second. It is to be released on September 9th at a reasonable US$80. Hopefully it will allow a simple modification to remove the infrared-blocking filter. Things are looking better and better for low-cost eye tracking; keep up the excellent work, it will make a huge difference for all of us.

Abstract
"This paper presents a principled analysis of various combinations of image features to determine their suitability for remote eye tracking. It begins by reviewing the basic theory underlying the connection between eye image and gaze direction. Then a set of approaches is proposed based on different combinations of well-known features and their behaviour is evaluated, taking into account various additional criteria such as free head movement, and minimum hardware and calibration requirements. The paper proposes a final method based on multiple glints and the pupil centre; the method is evaluated experimentally. Future trends in eye tracking technology are also discussed."


The algorithms were implemented in C++ running on a Windows PC equipped with a Pentium 4 processor at 3 GHz and 1 GB of RAM. The camera of choice delivers 15 frames per second at 1280 x 1024. The optimal distance from the screen is 60 cm, which is rather typical for remote eye trackers. This provides a track-box volume of 20 x 20 x 20 cm. Within this area the algorithms produce an average accuracy of 1.57 degrees. A 1 degree accuracy may be achieved if the head is in the same position as it was during calibration. Moving the head parallel to the monitor plane increases the error by 0.2-0.4 degrees, while moving closer or further away introduces a larger error of 1-1.5 degrees (mainly due to camera focus range). Note that no temporal filtering was used in the reporting. All in all, these results are not far from what typical remote systems produce.


The limitation of 15 fps stems from the frame rate of the camera; the software itself is able to process more than 50 images per second on the specified machine, which leaves it to our imagination what frame rates could be achieved with a fast four-core Intel Core i7 processor.
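The paper's geometric model, built from multiple glints and the pupil centre, is well beyond a quick snippet. For contrast, the common interpolation-based alternative is easy to show: fit a low-order polynomial from pupil-centre-minus-glint vectors to screen coordinates during calibration. A minimal sketch with made-up calibration data (this is explicitly not the paper's method):

```python
# Interpolation baseline, NOT the geometric approach from the paper:
# fit a second-order polynomial from pupil-minus-glint vectors (x, y)
# to screen coordinates, using a 9-point calibration. All data assumed.
import numpy as np

def design(v):
    x, y = v[:, 0], v[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

# Calibration: pupil-glint difference vectors and the known screen targets
pg = np.array([[-0.8, -0.6], [0.0, -0.6], [0.8, -0.6],
               [-0.8,  0.0], [0.0,  0.0], [0.8,  0.0],
               [-0.8,  0.6], [0.0,  0.6], [0.8,  0.6]])
targets = np.array([[x, y] for y in (100, 500, 900) for x in (160, 960, 1760)],
                   dtype=float)

# Least-squares fit of the polynomial coefficients
coeffs, *_ = np.linalg.lstsq(design(pg), targets, rcond=None)

# Runtime: map a new pupil-glint vector to an on-screen gaze point
print(design(np.array([[0.4, -0.3]])) @ coeffs)
```

The appeal of the geometric approach over this kind of mapping is precisely the free head movement reported above: a purely interpolated model degrades quickly once the head leaves the calibration position.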


  • A. Villanueva, G. Daunys, D. Hansen, M. Böhme, R. Cabeza, A. Meyer, and E. Barth, "A geometric approach to remote eye tracking," Universal Access in the Information Society. [Online]. Available: http://dx.doi.org/10.1007/s10209-009-0149-0

Tuesday, August 18, 2009

COGAIN Student Competition Results

Lasse Farnung Laursen, a Ph.D. student with the Department of Informatics and Mathematical Modeling at the Technical University of Denmark, won this year's COGAIN student competition with his leisure application GazeTrain.

"GazeTrain (illustrated in the screenshot below) is an action oriented puzzle game, that can be controlled by eye movements. In GazeTrain you must guide a train by placing track tiles in front of it. As you guide the train, you must collect various cargo and drop them off at the nearest city thereby earning money. For further details regarding how to play the game, we encourage you to read the tutorial accessible from the main menu. The game is quite customizable as the dwell time and several other parameters can be adjusted to best suit your play-style." (Source)

The GazeTrain game.

The runners-up, sharing second place, were:

Music Editor, developed by Ainhoa Yera Gil, Public University of Navarre, Spain. Music Editor is a gaze-operated application that allows the user to compose, edit and play music by eye movements. The reviewers appreciated that "a user can not only play but can actually create something" and that "Music Editor is well suited for gaze control".

Gaze Based Sudoku, developed by Juha Hjelm and Mari Pesonen, University of Tampere, Finland. The game can be operated by eye movements and it has three difficulty levels. Reviewers especially appreciated how "the separation between viewing and controlling and between sudoku grid and number selection panel is solved" and that the game "has no time constraints" so it is "relaxing" to play.

Tuesday, August 11, 2009

ALS Society of British Columbia announces Engineering Design Awards (Canadian students only)

"The ALS Society of British Columbia has established three Awards to encourage and recognize innovation in technology to substantially improve the quality of life of people living with ALS (Amyotrophic Lateral Sclerosis, also known as Lou Gehrig’s Disease). Students at the undergraduate or graduate level in engineering or a related discipline at a post-secondary institution in British Columbia or elsewhere in Canada are eligible for the Awards. Students may be considered individually or as a team. Mentor Awards may also be given to faculty supervising students who win awards" (see Announcement)


Project ideas:
  • Low-cost eye tracker
    • Issue: Current commercial eye-gaze tracking systems cost thousands to tens of thousands of dollars. The high cost of eye-gaze trackers prevents potential users from accessing eye-gaze tracking tools. The hardware components required for eye-gaze tracking do not justify the price, and a lower-cost alternative is desirable. Webcams may be used for low-cost imaging, along with simple infrared diodes for system lighting. Alternatively, visible-light systems may also be investigated. Open-source eye-gaze tracking software is also available (ed: ITU GazeTracker, OpenEyes, Track Eye, OpenGazer and MyEye (free, no source)).
    • Goal: The goal of this design project is to develop a low-cost and usable eye-gaze tracking system based on simple commercial off-the-shelf hardware.
    • Deliverables: A working prototype of a functional, low-cost (< $200), eye-gaze tracking system.
  • Eye-glasses compensation
    • Issue: The use of eye-glasses can cause considerable problems in eye-gaze tracking. The issue stems from reflections off the eye-glasses due to the use of controlled infrared lighting (on and off axis light sources) used to highlight features of the face. The key features of interest are the pupils and glints (or reflections of the surface of the cornea). Incorrectly identifying the pupils and glints then results in invalid estimation of the point-of-gaze.
    • Goal: The goal of this design project is to develop techniques for either: 1) avoiding image corruption with eye-glasses on a commercial eye-gaze tracker, or 2) developing a controlled lighting scheme to ensure valid pupil and glints identification are identified in the presence of eye-glasses.
    • Deliverables: Two forms of deliverables are possible: 1) A working prototype illustrating functional eye-gaze tracking in the presence of eye-glasses with a commercial eye-gaze tracker, or 2) A working prototype illustrating accurate real-time identification of the pupil and glints using controlled infrared lighting (on and off axis light sources) in the presence of eye-glasses.
  • Innovative selection with ALS and eye gaze
    • Issue: As mobility steadily decreases in the more advanced stages of ALS, alternative techniques for selection are required. Current solutions include head switches, sip-and-puff switches and dwell-time activation, to name a few, depending on the degree of mobility loss. The use of dwell time requires no mobility other than eye motion; however, this technique suffers from 'lag', in that the user must wait the dwell-time duration for each selection, as well as from the 'Midas touch' problem, in which unintended selections occur if the gaze point is stationary for too long (see the dwell-timer sketch after this list).
    • Goal: The goal of this design project is to develop a technique for improved selection with eye-gaze for individuals with only eye motion available. Possible solutions may involve novel HCI designs for interaction, including various adaptive and predictive technologies, the consideration of contextual cues, and the introduction of ancillary inputs such as EMG or EEG.
    • Deliverables: A working prototype illustrating eye-motion only selection with a commercial eye-gaze tracking system.
  • Novel and valuable eye-gaze tracking applications and application enhancements
    • Issue: To date, relatively few gaze-tracking applications have been developed. These include relatively simplistic applications such as the tedious typing of words, and even in such systems, little is done to ease the effort required, e.g., systems typically do not allow for the saving and reuse of words and sentences.
    • Goal: The goal of this design project is to develop one or more novel applications or application enhancements that take gaze as input, and that provide new efficiencies or capabilities that could significantly improve the quality of life of those living with ALS.
    • Deliverables: A working prototype illustrating one or more novel applications that take eye-motion as an input. The prototype must be developed and implemented to the extent that an evaluation of the potential efficiencies and/or reductions in effort can be evaluated by persons living with ALS and others on an evaluation panel.

    See the Project Ideas for more information. For contact information see page two of the announcement.
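The dwell-time trade-off described in the third project idea is easy to see in code. In this minimal sketch (all parameters assumed), a selection fires only after gaze has stayed within a small radius for the full dwell duration: the enforced wait is the 'lag', and the fact that any sufficiently stationary gaze eventually fires is the Midas touch problem.

```python
# Hedged sketch of plain dwell-time selection; thresholds are illustrative.
# The lag is the `dwell` wait itself; the Midas touch problem is that any
# sufficiently stationary gaze fires a selection, intended or not.
import math, time

class DwellSelector:
    def __init__(self, dwell=1.0, radius=40.0):
        self.dwell = dwell          # seconds gaze must be held
        self.radius = radius        # pixels of tolerated gaze jitter
        self.anchor = None          # where the current dwell started
        self.start = 0.0

    def update(self, x, y, now=None):
        """Feed one gaze sample; returns the (x, y) selection or None."""
        now = time.monotonic() if now is None else now
        if self.anchor and math.dist(self.anchor, (x, y)) <= self.radius:
            if now - self.start >= self.dwell:
                self.anchor = None                   # reset after firing
                return (x, y)
        else:
            self.anchor, self.start = (x, y), now    # gaze moved: restart timer
        return None

selector = DwellSelector()
for t in range(30):                                  # 30 samples, ~50 ms apart
    hit = selector.update(400, 300, now=t * 0.05)
    if hit:
        print("selected at", hit, "after", t * 0.05, "s")
```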

Lund Eye-Tracking Academy (LETA)

Kenneth Holmqvist and his team at the Humanities Lab at Lund University, Sweden, will host another three-day LETA training course in eye tracking and analysis of eye movement data. This is an excellent opportunity to get hands-on experience using state-of-the-art equipment and setting up experiments. The course runs 23rd-25th September and registration is open.

Course contents
• Pros and cons of head-mounted, remote and contact eye-trackers.
• High sampling speed and detailed precision – who needs it?
• Gaze-overlaid videos vs datafiles – what can you do with them?
• How to set up and calibrate on a variety of subjects on different eye-trackers?
• Glasses, lenses, mascara, and drooping eye-lids – what to do?
• How to work with stimulus programs, and synchronize them with eye-tracking recording?
• How to deal with the consent forms and ethical issues?
• Short introduction to experimental design: Potentials and pitfalls.
• Visualisation of data vs number crunching.
• Fast data analysis of multi-user experiments.
• Fixation durations, saccadic amplitudes, transition diagrams, group similarity measures, and all the other measures – what do they tell us? What are the pitfalls?

Teaching methods
Lectures on selected topics (8h)
Hands-on work in our lab on prespecified experiments: receiving and recording on a subject (9h). Hands-on initial data analysis (3h).

Eye-tracking systems available for this training
2*SMI HED 50 Hz with Polhemus Head-tracking
3*SMI HiSpeed 240/1250 Hz
1*SMI RED-X remote 50 Hz
2*SMI HED-mobile 50/200 Hz