- László Kozma, Arto Klami, and Samuel Kaski: GaZIR: Gaze-based Zooming Interface for Image Retrieval. To appear in Proceedings of the 11th Conference on Multimodal Interfaces and The Sixth Workshop on Machine Learning for Multimodal Interaction (ICMI-MLMI), Boston, MA, USA, November 2-6, 2009. (abstract, pdf)
Monday, September 14, 2009
GaZIR: Gaze-based Zooming Interface for Image Retrieval (Kozma L., Klami A., Kaski S., 2009)
From the Helsinki Institute for Information Technology, Finland, comes GaZIR, a research prototype for gaze-based image retrieval built by László Kozma, Arto Klami and Samuel Kaski. GaZIR uses a lightweight logistic regression model to predict relevance from eye movement data (such as viewing time, revisit counts and fixation length), all computed online in real time. The system is built around the PicSOM (paper) retrieval engine, which is based on tree-structured self-organizing maps (TS-SOMs). When provided with a set of reference images, the PicSOM engine goes online to download a set of similar images (based on color, texture or shape).
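The general mechanism is easy to sketch. Below is a minimal, hypothetical Python example of an online logistic regression that maps per-image gaze features to a relevance probability; the feature set is the one mentioned above, but the values, labels and scikit-learn setup are my own illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the kind of model GaZIR describes: an online
# logistic regression mapping per-image gaze features to a relevance
# probability. Data and setup are illustrative, not the authors' code.
import numpy as np
from sklearn.linear_model import SGDClassifier

# One row per viewed image: [total viewing time (s), revisit count,
# mean fixation duration (s)] -- the features mentioned in the post.
X_train = np.array([
    [2.1, 3, 0.35],   # long, repeated viewing -> relevant
    [0.3, 1, 0.15],   # brief glance           -> irrelevant
    [1.8, 2, 0.40],
    [0.2, 0, 0.10],
])
y_train = np.array([1, 0, 1, 0])  # explicit relevance labels for training

# log-loss + SGD gives a logistic regression that supports online updates
model = SGDClassifier(loss="log_loss", random_state=0)
model.partial_fit(X_train, y_train, classes=[0, 1])

# At run time, each newly viewed image gets a relevance probability
# that the retrieval engine can use as implicit feedback.
x_new = np.array([[1.5, 2, 0.30]])
print("P(relevant) =", model.predict_proba(x_new)[0, 1])
```

The `partial_fit` interface is what makes the "on-line in real time" part plausible: the model can be nudged after every zoom level rather than retrained from scratch.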
Fig 1. "Screenshot of the GaZIR interface. Relevance feedback gathered from outer rings influences the images retrieved for the inner rings, and the user can zoom in to reveal more rings."
Fig 2. "Precision-recall and ROC curves for user-independent relevance prediction model. The predictions (solid line) are clearly above the baseline of random ranking (dash-dotted line), showing that relevance of images can be predicted from eye movements. The retrieval accuracy is also above the baseline provided by a naive model making a binary relevance judgement based on whether the image was viewed or not (dashed line), demonstrating the gain from more advanced gaze modeling."
Fig 3. "Retrieval performance in real user experiments. The bars indicate the proportion of relevant images shown during the search in six different search tasks for three different feedback methods. Explicit denotes the standard point-and-click feedback, predicted means implicit feedback inferred from gaze, and random is the baseline of providing random feedback. In all cases both actual feedback types outperform the baseline, but the relative performance of explicit and implicit feedback depends on the search task."
Friday, September 11, 2009
An Adaptive Algorithm for Fixation, Saccade, and Glissade Detection in Eye-Tracking Data (Nyström M. & Holmqvist K, 2009)
From Marcus Nyström and Kenneth Holmqvist at the Lund University Humanities Lab (HumLab) in Sweden comes an interesting paper on a novel algorithm capable of detecting glissades (aka dynamic overshoot) in eye tracker data. These are wobbling eye movements often found at the end of saccades; they have previously been considered errors in saccadic programming with limited value. Whatever their function, the phenomenon does exist and should be accounted for. The paper reports glissades following half of all saccades during reading and scene viewing, with an average duration of 24 ms. This work is important as it extends the default categorization of eye movements (fixation, saccade, smooth pursuit and blink). The algorithm is based on velocity-based saccade detection and is driven by the data itself, with only a limited number of subjective settings. It contains a number of improvements, such as thresholds for peak and saccade onset/offset detection, adaptive threshold adjustment based on local noise levels, physical constraints on eye movements to exclude noise and jitter, and new recommendations for minimum allowed fixation and saccade durations. Note that the data was obtained with a high-speed 1250 Hz SMI system; how the algorithm performs on a typical remote tracker running at 50-250 Hz remains to be determined.
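To give a flavor of the adaptive part, here is a minimal Python sketch of the core idea as I read it: the velocity threshold is repeatedly re-estimated as the mean plus six standard deviations of the samples below it, so it settles at the local noise level. Parameter values and the toy data are illustrative only; the authors' Matlab code (linked below) is the reference implementation.

```python
# Minimal sketch of the adaptive velocity-threshold idea: re-estimate the
# saccade detection threshold from the samples below it until convergence,
# so it adapts to local noise. Not the authors' implementation.
import numpy as np

def adaptive_saccade_threshold(velocity, init_threshold=100.0,
                               tol=1.0, max_iter=50):
    """Iteratively re-estimate the saccade velocity threshold (deg/s)."""
    pt = init_threshold
    for _ in range(max_iter):
        below = velocity[velocity < pt]            # presumed noise/fixation samples
        new_pt = below.mean() + 6.0 * below.std()  # mean + 6 SD of that noise
        if abs(new_pt - pt) < tol:                 # converged to local noise level
            return new_pt
        pt = new_pt
    return pt

# Toy trace: fixation noise around 10 deg/s with two saccadic bursts
rng = np.random.default_rng(0)
velocity = np.abs(rng.normal(10.0, 5.0, 1000))
velocity[300:310] = 400.0
velocity[700:708] = 350.0

pt = adaptive_saccade_threshold(velocity)
saccade_samples = velocity > pt
print(f"threshold: {pt:.1f} deg/s, saccadic samples: {saccade_samples.sum()}")
```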
- Nyström, M. & Holmqvist, K., "An Adaptive Algorithm for Fixation, Saccade, and Glissade Detection in Eye-Tracking Data", Behavior Research Methods.
- Download Matlab source code for the algorithm (version 1.0)
Wednesday, September 9, 2009
Psychnology Journal: Gaze control for work and play
"PsychNology Journal (ISSN 1720-7525) is a quadrimestral, international, peer-reviewed journal on the relationship between humans and technology. The name 'PsychNology' emphasizes its multidisciplinary interest in all issues related to the human adoption and development of technologies. Its broad scope allows to host in a sole venue advances and ideas that would otherwise remain confined within separate communities or disciplines. PNJ is an independent, electronic publication that leaves the copyright to authors, and provides wide accessibility to their papers through the Internet and several indexing and abstracting services including PsycInfo and EBSCO."
The PsychNology Journal special edition on gaze control for work and play is now available online. It contains some of the highlights from the COGAIN conference last year in an extended journal format. For the COGAIN people this is old news; for the rest it's hopefully interesting stuff. The NeoVisus prototype I presented in Prague should have appeared as well, but unfortunately I did not have the time to make the necessary changes. More information on the scrollable keyboard, and on text entry by gaze in general, is available in Päivi's excellent Ph.D. thesis. Also, rumor has it that Javier San Agustin's Ph.D. thesis on gaze interaction and low-cost alternatives is getting closer to D-day. We're all looking forward to it, hang in there mate =)
- Predicting preference from fixations
Mackenzie G. Glaholt, Mei-Chun Wu, Eyal M. Reingold
- Scrollable Keyboards for Casual Eye Typing
Oleg Špakov, Päivi Majaranta
- Hands Free Interaction with Virtual Information in a Real Environment: Eye Gaze as an Interaction Tool in an Augmented Reality System
Susanna Nilsson, Torbjörn Gustafsson, Per Carleberg
- Gaze beats mouse: A case study on a gaze-controlled breakout
Michael Dorr, Laura Pomarjanschi, Erhardt Barth
- Evaluation of the Potential of Gaze Input for Game Interaction
Javier San Agustin, Julio C. Mateo, John Paulin Hansen, Arantxa Villanueva
Thursday, August 20, 2009
A geometric approach to remote eye tracking (Villanueva et al, 2009)
Came across this paper today. It's good news and a great achievement, especially since consumer products for recording high-definition video over a plain USB port have begun to appear. For example, the upcoming Microsoft Lifecam Cinema HD provides 1,280 x 720 at 30 frames per second and is to be released on September 9th at a reasonable US$80. Hopefully it will allow a simple modification to remove the infrared-blocking filter. Things are looking better and better for low-cost eye tracking. Keep up the excellent work; it will make a huge difference for all of us.
Abstract
"This paper presents a principled analysis of various combinations of image features to determine their suitability for remote eye tracking. It begins by reviewing the basic theory underlying the connection between eye image and gaze direction. Then a set of approaches is proposed based on different combinations of well-known features and their behaviour is valuated, taking into account various additional criteria such as free head movement, and minimum hardware and calibration requirements. The paper proposes a final method based on multiple glints and the pupil centre; the method is evaluated experimentally. Future trends in eye tracking technology are also discussed."
The algorithms were implemented in C++ on a Windows PC equipped with a Pentium 4 processor at 3 GHz and 1 GB of RAM. The camera of choice delivers 15 frames per second at 1280 x 1024. The optimal distance from the screen is 60 cm, which is rather typical for remote eye trackers. This provides a track-box volume of 20 x 20 x 20 cm. Within this volume the algorithms produce an average accuracy of 1.57 degrees. An accuracy of 1 degree may be achieved if the head is in the same position as it was during calibration. Moving the head parallel to the monitor plane increases the error by 0.2-0.4 degrees, while moving closer or further away introduces a larger error of 1-1.5 degrees (mainly due to the camera's focus range). Note that no temporal filtering was used in the reported results. All in all, these results are not far from what typical remote systems produce.
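As a quick back-of-the-envelope check (my own arithmetic, not from the paper), angular accuracy converts to on-screen error as error = distance x tan(angle), so at the 60 cm viewing distance the reported figures come out at roughly 1.6 cm and 1.0 cm on screen:

```python
# Convert the reported angular accuracy into on-screen error at the
# 60 cm viewing distance, using error = distance * tan(angle).
import math

distance_cm = 60.0
for deg in (1.57, 1.0):
    err_cm = distance_cm * math.tan(math.radians(deg))
    print(f"{deg:.2f} deg at {distance_cm:.0f} cm -> {err_cm:.2f} cm on screen")
# 1.57 deg -> ~1.64 cm; 1.00 deg -> ~1.05 cm
```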
The limitation of 15 fps stems from the frame rate of the camera; the software itself is able to process more than 50 images per second on the specified machine, which leaves it to our imagination what frame rates might be achieved with a fast quad-core Intel Core i7 processor.
- A. Villanueva, G. Daunys, D. Hansen, M. Böhme, R. Cabeza, A. Meyer, and E. Barth, "A geometric approach to remote eye tracking," Universal Access in the Information Society. [Online]. Available: http://dx.doi.org/10.1007/s10209-009-0149-0
Tuesday, August 18, 2009
COGAIN Student Competition Results
Lasse Farnung Laursen, a Ph.D. student with the Department of Informatics and Mathematical Modeling at the Technical University of Denmark, won this year's COGAIN student competition with the leisure application GazeTrain.
"GazeTrain (illustrated in the screenshot below) is an action oriented puzzle game, that can be controlled by eye movements. In GazeTrain you must guide a train by placing track tiles in front of it. As you guide the train, you must collect various cargo and drop them off at the nearest city thereby earning money. For further details regarding how to play the game, we encourage you to read the tutorial accessible from the main menu. The game is quite customizable as the dwell time and several other parameters can be adjusted to best suit your play-style." (Source)
"GazeTrain (illustrated in the screenshot below) is an action oriented puzzle game, that can be controlled by eye movements. In GazeTrain you must guide a train by placing track tiles in front of it. As you guide the train, you must collect various cargo and drop them off at the nearest city thereby earning money. For further details regarding how to play the game, we encourage you to read the tutorial accessible from the main menu. The game is quite customizable as the dwell time and several other parameters can be adjusted to best suit your play-style." (Source)
The GazeTrain game.
Runners-up, sharing second place, were:
Music Editor, developed by Ainhoa Yera Gil, Public University of Navarre, Spain. Music Editor is a gaze-operated application that allows the user to compose, edit and play music by eye movements. The reviewers appreciated that "a user can not only play but can actually create something" and that "Music Editor is well suited for gaze control".
Gaze Based Sudoku, developed by Juha Hjelm and Mari Pesonen, University of Tampere, Finland. The game can be operated by eye movements and it has three difficulty levels. Reviewers especially appreciated how "the separation between viewing and controlling and between sudoku grid and number selection panel is solved" and that the game "has no time constraints" so it is "relaxing" to play.
Tuesday, August 11, 2009
ALS Society of British Columbia announces Engineering Design Awards (Canadian students only)
"The ALS Society of British Columbia has established three Awards to encourage and recognize innovation in technology to substantially improve the quality of life of people living with ALS (Amyotrophic Lateral Sclerosis, also known as Lou Gehrig’s Disease). Students at the undergraduate or graduate level in engineering or a related discipline at a post-secondary institution in British Columbia or elsewhere in Canada are eligible for the Awards. Students may be considered individually or as a team. Mentor Awards may also be given to faculty supervising students who win awards" (see Announcement)
Project ideas:
- Low-cost eye tracker
- Issue: Current commercial eye-gaze tracking systems cost thousands to tens of thousands of dollars. The high cost of eye-gaze trackers prevents potential users from accessing eye-gaze tracking tools. The hardware components required for eye-gaze tracking do not justify the price, and a lower-cost alternative is desirable. Webcams may be used for low-cost imaging, along with simple infrared diodes for system lighting. Alternatively, visible-light systems may also be investigated. Open-source eye-gaze tracking software is also available. (ed: ITU GazeTracker, OpenEyes, Track Eye, OpenGazer and MyEye (free, no source); a minimal pupil-detection sketch is included after this list.)
- Goal: The goal of this design project is to develop a low-cost and usable eye-gaze tracking system based on simple commercial off-the-shelf hardware.
- Deliverables: A working prototype of a functional, low-cost (< $200), eye-gaze tracking system.
- Eye-glasses compensation
- Issue: The use of eye-glasses can cause considerable problems in eye-gaze tracking. The issue stems from reflections off the eye-glasses due to the controlled infrared lighting (on- and off-axis light sources) used to highlight features of the face. The key features of interest are the pupils and glints (reflections off the surface of the cornea). Incorrectly identifying the pupils and glints results in invalid estimation of the point-of-gaze.
- Goal: The goal of this design project is to develop techniques for either: 1) avoiding image corruption with eye-glasses on a commercial eye-gaze tracker, or 2) developing a controlled lighting scheme to ensure that the pupil and glints are validly identified in the presence of eye-glasses.
- Deliverables: Two forms of deliverables are possible: 1) A working prototype illustrating functional eye-gaze tracking in the presence of eye-glasses with a commercial eye-gaze tracker, or 2) A working prototype illustrating accurate real-time identification of the pupil and glints using controlled infrared lighting (on and off axis light sources) in the presence of eye-glasses.
- Innovative selection with ALS and eye gaze
- Issue: As mobility steadily decreases in the more advanced stages of ALS, alternative techniques for selection are required. Current solutions include head switches, sip-and-puff switches and dwell-time activation, depending on the degree of mobility loss, to name a few. The use of dwell time requires no mobility other than eye motion; however, this technique suffers from 'lag', as the user must wait the dwell-time duration for each selection, and from the 'Midas touch' problem, in which unintended selections occur if the gaze point is stationary for too long. (A simple dwell-time selection sketch is included after this list.)
- Goal: The goal of this design project is to develop a technique for improved selection with eye-gaze for individuals with only eye-motion available. Possible solutions may involve novel HCI designs for interaction, including various adaptive and predictive technologies, the consideration of contextual cues, and the introduction of ancillary inputs, such as EMG or EEG.
- Deliverables: A working prototype illustrating eye-motion only selection with a commercial eye-gaze tracking system.
- Novel and valuable eye-gaze tracking applications and application enhancements
- Issue: To date, relatively few gaze-tracking applications have been developed. These include relatively simplistic applications such as the tedious typing of words, and even in such systems, little is done to ease the effort required, e.g., systems typically do not allow for the saving and reuse of words and sentences.
- Goal: The goal of this design project is to develop one or more novel applications or application enhancements that take gaze as input, and that provide new efficiencies or capabilities that could significantly improve the quality of life of those living with ALS.
- Deliverables: A working prototype illustrating one or more novel applications that take eye-motion as an input. The prototype must be developed and implemented to the extent that an evaluation of the potential efficiencies and/or reductions in effort can be evaluated by persons living with ALS and others on an evaluation panel.
See the Project Ideas for more information. For contact information see page two of the announcement.
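Since the first idea mentions webcams and open-source software, here is the minimal pupil-detection sketch promised above: a dark pupil is found in each frame by thresholding and contour fitting with OpenCV. The threshold value and camera index are assumptions that depend entirely on the setup; a usable tracker (e.g. the ITU GazeTracker) adds glint detection, calibration and filtering on top of a step like this.

```python
# Hedged sketch of dark-pupil detection from a webcam: threshold the
# image, take the largest dark blob as the pupil, fit a circle.
import cv2

cap = cv2.VideoCapture(0)  # any webcam; IR illumination improves contrast
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # The pupil is the darkest large blob; the threshold is scene-dependent
    _, binary = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        pupil = max(contours, key=cv2.contourArea)  # largest dark region
        (x, y), r = cv2.minEnclosingCircle(pupil)
        cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
    cv2.imshow("pupil", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```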
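And for the selection idea, the dwell-time sketch promised above: a target is activated only once gaze has rested on it for the full dwell duration, which makes the lag/Midas-touch trade-off concrete. The class, timings and simulated gaze stream are purely illustrative, not a proposed solution.

```python
# Illustrative dwell-time selection: a target fires only after gaze has
# stayed on it for the dwell duration; moving the gaze resets the timer.
import time

DWELL_S = 0.8  # typical dwell times run from a few hundred ms up to ~1 s

class DwellSelector:
    def __init__(self, dwell_s=DWELL_S):
        self.dwell_s = dwell_s
        self.current = None
        self.entered = 0.0

    def update(self, target, now=None):
        """Feed the target under the gaze point each frame.
        Returns the target once dwell time has elapsed, else None."""
        now = time.monotonic() if now is None else now
        if target != self.current:          # gaze moved: restart the timer
            self.current, self.entered = target, now
            return None
        if target is not None and now - self.entered >= self.dwell_s:
            self.entered = now + 1e9        # block repeats until gaze leaves
            return target
        return None

# Simulated gaze stream: the selector fires only after a sustained fixation
sel = DwellSelector()
for t, tgt in [(0.0, "A"), (0.3, "A"), (0.5, "B"), (0.9, "B"), (1.4, "B")]:
    hit = sel.update(tgt, now=t)
    if hit:
        print(f"t={t:.1f}s: selected {hit}")
```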
Lund Eye-Tracking Academy (LETA)
Kenneth Holmqvist and his team at the Humanities Lab at Lund University, Sweden, will host another three-day LETA training course in eye tracking and analysis of eye movement data. This is an excellent opportunity to get hands-on experience using state-of-the-art equipment and setting up experiments. The course is held September 23rd-25th and registration is open.
Course contents
• Pros and cons of head-mounted, remote and contact eye-trackers.
• High sampling speed and detailed precision – who needs it?
• Gaze-overlaid videos vs. data files – what can you do with them?
• How to set up and calibrate a variety of subjects on different eye-trackers?
• Glasses, lenses, mascara, and drooping eye-lids – what to do?
• How to work with stimulus programs, and synchronize them with eye-tracking recording?
• How to deal with the consent forms and ethical issues?
• Short introduction to experimental design: Potentials and pitfalls.
• Visualisation of data vs number crunching.
• Fast data analysis of multi-user experiments.
• Fixation durations, saccadic amplitudes, transition diagrams, group similarity measures, and all the other measures – what do they tell us? What are the pitfalls?
Teaching methods
Lectures on selected topics (8h)
Hands-on work in our lab on prespecified experiments: receiving and recording a subject (9h). Hands-on initial data analysis (3h).
Eye-tracking systems available for this training
2*SMI HED 50 Hz with Polhemus Head-tracking
3*SMI HiSpeed 240/1250 Hz
1*SMI RED-X remote 50 Hz
2*SMI HED-mobile 50/200 Hz
Thursday, August 6, 2009
Päivi Majaranta PhD Thesis on Text Entry by Eye Gaze
The most complete publication on gaze typing is now available, as Päivi Majaranta at the University of Tampere has successfully defended her Ph.D. thesis. It summarizes previous work and discusses and exemplifies important topics such as word prediction, layout, feedback and user aspects. The material is presented in a straightforward manner with a clear structure and excellent illustrations. It will without doubt be useful for anyone who is about to design and develop a gaze-based text entry interface. Congratulations, Päivi, on such a well-written thesis.
- Majaranta, P. (2009) Text Entry by Eye Gaze. Dissertations in Interactive Technology, number 11, University of Tampere (ISBN 978-951-44-7786-7). Also available in Acta Electronica Universitatis Tamperensis; 869 (978-951-44-7787-4).
Friday, July 31, 2009
SMI RED 250!
Today SMI announced the new RED250 which, as the name suggests, has an impressive 250 Hz sampling rate. It has an accuracy of 0.5 degrees or below (typ.), less than 10 ms latency, and operates at a head distance of 60-80 cm. The track-box is 40 x 40 cm at 70 cm distance, and the system will recover tracking faster than the previous model. No details on pricing yet, but top-of-the-line performance never comes cheap. Get the flyer as PDF.
Survey on gaze visualization in 3D virtual environments
Got an email today from Sophie Stellmach, a Ph.D. student in the User Interface & Software Engineering group at the Otto-von-Guericke University in Germany. She has posted an online survey and would like input from eye tracking specialists on 3D gaze visualization.
"In the course of my work I have developed several gaze visualizations for facilitating eye tracking studies in (static) three-dimensional virtual environments. In order to evaluate the potential utility of these techniques, I am conducting an online survey with eye tracking researchers and professionals. I would like to invite you to this survey as I think that your answers are highly valuable for this investigation. The survey should take less than 10 minutes of your time! Your answers will be stored anonymously. You can access the survey under the following link: http://gamescience.bth.se/survey/index.php?sid=27319=en "
"In the course of my work I have developed several gaze visualizations for facilitating eye tracking studies in (static) three-dimensional virtual environments. In order to evaluate the potential utility of these techniques, I am conducting an online survey with eye tracking researchers and professionals. I would like to invite you to this survey as I think that your answers are highly valuable for this investigation. The survey should take less than 10 minutes of your time! Your answers will be stored anonymously. You can access the survey under the following link: http://gamescience.bth.se/survey/index.php?sid=27319=en "