Friday, April 30, 2010

GazeLib: Low-cost remote webcam eye tracking

Came across the GazeLib low-cost remote eye tracking project today, which uses ordinary webcams without IR illumination. The accuracy is rather low, but it's really nice to see another low-cost approach to assistive technology.

"GazeLib is a programming library which making real-time low-cost gaze tracking becomes possible. The library provide functions performing remote gaze tracking under ambient lighting condition using a single, low cost, off-the-shelf webcam. Developers can easily build gaze tracking technologies implemented applications in only few lines of code. GazeLib project focuses on promoting gaze tracking technology to consumer-grade human computer interfaces by reducing the price, emphasizing ease-of-use, increasing the extendibility, and enhancing the flexibility and mobility."



Monday, April 26, 2010

Freie Universität Berlin presents gaze-controlled car

From the Freie Universität Berlin comes a working prototype of a system that allows direct steering of a car by eye movements alone. The prototype was demonstrated in front of a large group of journalists at the former Berlin Tempelhof Airport. Gaze data from a head-mounted SMI eye tracker is fed into the control system of the Spirit of Berlin, a platform for autonomous navigation. Similar to the gaze-controlled robot we presented at CHI09, the platform couples the turning of the wheels to the gaze coordinate space (e.g. look left and the car steers left). Essentially, gaze is mapped onto a 2D plane where deviations from the center issue steering commands, and the degree of turning is modulated by the distance from the center. This becomes potentially interesting when coupled with other sensors that in combination offer driver support, for example braking when an object the driver has not seen appears in the vehicle's path. Not to mention scenarios involving individuals with disabilities and/or machine learning. The work has been carried out under the guidance of professor Raúl Rojas as part of the AutoNOMOS project, which has been running since 2006, inspired by the Stanford autonomous car project.
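
As a rough illustration of that center-deviation mapping (my own sketch, not code from the AutoNOMOS project), steering commands could be derived along these lines, with a dead zone so that gaze jitter around the center keeps the car straight:

```python
def gaze_to_steering(gaze_x, frame_width, dead_zone=0.1, max_angle=30.0):
    """Map a horizontal gaze coordinate to a steering angle in degrees.

    Deviation from the center of the gaze plane sets the direction
    (negative = left), and the size of the deviation modulates the
    degree of turning. All parameter values are illustrative guesses.
    """
    # Normalize to [-1, 1], where 0 is the center of the gaze plane.
    deviation = (gaze_x - frame_width / 2.0) / (frame_width / 2.0)
    # Ignore small deviations so fixations near the center keep the car straight.
    if abs(deviation) < dead_zone:
        return 0.0
    # Scale the remaining deviation linearly onto the steering range.
    sign = 1.0 if deviation > 0 else -1.0
    scaled = (abs(deviation) - dead_zone) / (1.0 - dead_zone)
    return sign * scaled * max_angle

# Example: gaze 3/4 of the way toward the right edge of a 640-pixel-wide plane.
print(gaze_to_steering(480, 640))  # roughly 13.3 degrees to the right
```

A real deployment would of course smooth the gaze signal and limit the rate of change before any command reaches the wheels; the linear mapping above is only the interaction layer.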

More info in the press release.

Wednesday, April 14, 2010

Open-source gaze tracker awarded Research Pearls of ITU Copenhagen

The open-source eye tracker ITU Gaze Tracker, primarily developed by Javier San Agustin, Henrik Skovsgaard and myself, has been awarded the Research Pearls of the IT University of Copenhagen. A presentation will be held at ITU on May 6th at 2pm. The software, released one year ago, has seen more than 5000 downloads by students and hobbyists around the world. It is rapidly approaching a new release which will offer better performance and stability for remote tracking, plus many bug fixes in general. The new version adds support for a whole range of new HD web cameras, which provide vastly improved image quality and finally bring hope for a low-cost, open, flexible and reasonably well-performing solution. The ambitious goal is to make eye tracking technology available to everyone, regardless of available resources. Follow the developments at the forum. Additional information is available at the ITU Gaze Group.

"The Open-Source ITU Gaze Tracker"

Abstract:
Gaze tracking offers users the possibility of interacting with a computer by using eye movements alone, thereby making them more independent. However, some people (for example users with a severe disability) are excluded from access to gaze interaction due to the high prices of commercial systems (above 10,000€). Gaze tracking systems built from low-cost and off-the-shelf components have the potential of facilitating access to the technology and bringing prices down.

The ITU Gaze Tracker is an off-the-shelf system that uses an inexpensive web cam or a video camera to track the user's eye. It is free and open-source, offering users the possibility of trying out gaze interaction technology for a cost as low as 20€, and of adapting and extending the software to suit specific needs.

In this talk we will present the open-source ITU Gaze Tracker and show the different scenarios in which the system has been used and evaluated.

Monday, April 12, 2010

Digital avatars get human-like eye movements

William Steptoe of University College London got his research on using eye tracking to give digital avatars human-like eye movements covered in an article by New Scientist. It turns out that "on average, the participants were able to identify 88 per cent of truths correctly when the avatars had eye movement, but only 70 per cent without. Spotting lies was harder, but eye movement helped: 48 per cent accuracy compared with 39 per cent without. Steptoe will present the results at the 2010 Conference on Human Factors in Computing Systems in Atlanta, Georgia, next week."

Eye tracking in the wild: Consumer decision-making process at the supermarket

Kerstin Gidlöf from the Lund University Humlab talks about the visual appearance of consumer products in the supermarket and how the graphical layout modulates our attention. Perhaps free will is just an illusion; still, the number of items in my fridge with faces on them equals zero. Is it me or the store I'm shopping at?

Monday, March 29, 2010

Text 2.0 gaze-assisted reading

From the German Research Center for Artificial Intelligence comes a new demonstration of a gaze-based reading system, Text 2.0, which utilizes eye tracking to make the reading experience more dynamic and interactive. For example, the system can display images relevant to what you're reading about, or filter out less relevant information if you're skimming through the content. The research is funded by the Stiftung Rheinland-Pfalz für Innovation. On the group's website you can also find an interesting project called PEEP, which allows developers to connect eye trackers to Processing for aesthetically stunning visualizations; it forms the core of the Text 2.0 platform. Check out the videos.
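
To give a feel for how skimming could be told apart from reading, here is a small sketch of my own (the thresholds are illustrative guesses, not values from the Text 2.0 work): short fixations combined with long forward jumps suggest skimming rather than careful reading.

```python
def classify_reading(fixations, min_fix_ms=180, max_jump_px=60):
    """Crude reading-vs-skimming classifier over a window of fixations.

    fixations: list of (x, y, duration_ms) tuples in reading order.
    Skimming tends to show shorter fixations and longer horizontal
    jumps between them than careful reading does.
    """
    if len(fixations) < 2:
        return "unknown"
    durations = [d for (_, _, d) in fixations]
    jumps = [abs(x2 - x1)
             for (x1, _, _), (x2, _, _) in zip(fixations, fixations[1:])]
    avg_duration = sum(durations) / len(durations)
    avg_jump = sum(jumps) / len(jumps)
    if avg_duration >= min_fix_ms and avg_jump <= max_jump_px:
        return "reading"   # e.g. show images related to the current passage
    return "skimming"      # e.g. filter out less relevant text
```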




More information:
Zdf.de: Wenn das Auge die Seite umblättert? ("When the eye turns the page?")
Wired: Eye-Tracking Tablets and the Promise of Text 2.0
More demos at the group's website

Low-cost eye tracking and pong gaming from Imperial College London

A group of students at Imperial College London have developed a low-cost head-mounted tracker which they use to play Pong. The work is carried out under the supervision of Aldo Faisal in his lab.

"
We built an eyetracking system using mass-marketed off-the shelf components at 1/1000 of that cost, i.e. for less then 30 GBP. Once we made such a system that cheap we started thinking of it as a user interface for everyday use for impaired people.. The project was enable by realising that certain mass-marketed web cameras for video game consoles offer impressive performance approaching that of much more expensive research grade cameras.



"From this starting point research in our group has focussed on two parts so far:


1. The TED software, which is composed of two components that can run on two different computers (connected by wireless internet) or on the same computer. The first component is the TED server (Linux-based), which interfaces directly with the cameras, processes the high-speed video feed, and makes the data available (over the internet) to the client software. The client forms the second component; it is written in Java (i.e. it runs on any computer: Windows, Mac, Unix, ...) and provides mouse control via eye movements, the “Pong” video game, as well as configuration and calibration functions.

This two-part solution allows the cameras to be connected to a cost-effective netbook (e.g. on a wheelchair) while allowing control of other computers over the internet (e.g. in the living room, office and kitchen). This software suite, as well as part of the low-level camera driver, was implemented by Ian Beer, Aaron Berk, Oliver Rogers and Timothy Treglown for their undergraduate project in the lab.

Note: the “Pong” video game has a two-player mode, allowing two people to play against each other using two eye trackers, or eye tracker vs. keyboard. It is very easy to use; just look where you want the pong paddle to move...

2. The camera-spectacles (visible in most press photos), as well as two-camera software (Windows-based) able to track eye movements in 3D (i.e. direction and distance) for wheelchair control. These have been built and developed by William Abbott (Dept. of Bioengineering)."
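
To make that two-part architecture more concrete, here is a toy sketch of my own of a gaze server streaming samples to a networked client; the port number and JSON message format are made up and are not the actual TED protocol.

```python
import json
import socket
import time

def serve_gaze(host="0.0.0.0", port=5555):
    """Server side: push gaze samples to a connected client as JSON lines.

    In the real system this would process the high-speed camera feed;
    here we fake a drifting gaze estimate.
    """
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)
    conn, _ = srv.accept()
    x, y = 0, 240
    while True:
        x = (x + 5) % 640  # stand-in for a real gaze estimate
        sample = {"t": time.time(), "x": x, "y": y}
        conn.sendall((json.dumps(sample) + "\n").encode())
        time.sleep(1 / 60.0)  # pretend 60 Hz tracking

def read_gaze(host, port=5555):
    """Client side: consume samples to drive a cursor, paddle, etc."""
    cli = socket.create_connection((host, port))
    for line in cli.makefile():
        sample = json.loads(line)
        yield sample["x"], sample["y"]
```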

Further reading:

Imperial College London press release: Playing “Pong” with the blink of an eye
The Engineer: Eye-movement game targets disabled
Engadget (German): Neurotechnologie: Pong mit Augenblinzeln gespielt in London ("Neurotechnology: Pong played with eye blinks in London")

Friday, March 26, 2010

ETRA 2010 Proceedings now online

The proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (ETRA 2010), held in Austin, Texas, March 22-24, are now online. Some kind soul (MrGaze?) decided to do the world a favor by uploading and keyword-tagging the papers on the Slideshare website, which is indexed by Google and other search engines. The wealth of information ensures days of interesting reading; several of the short papers and posters would have been interesting to hear a talk on, but as always time is short.

Paper acceptance rate: 18 of 58 submissions (31%)

Conference chairs
Carlos Hitoshi Morimoto, University of São Paulo, Brazil
Howell Istance, De Montfort University, UK

Program chairs
Aulikki Hyrskykari, University of Tampere, Finland
Qiang Ji, Rensselaer Polytechnic Institute


Table of Contents

Front matter (cover, title page, table of contents, preface)
Back matter (committees and reviewers, industrial supporters, cover image credits, author index)

SESSION: Keynote address

An eye on input: research challenges in using the eye for computer input control
I. Scott MacKenzie

SESSION: Long papers 1 -- Advances in eye tracking technology

Homography normalization for robust gaze estimation in uncalibrated setups
Dan Witzner Hansen, Javier San Agustin, Arantxa Villanueva

Head-mounted eye-tracking of infants' natural interactions: a new method
John M. Franchak, Kari S. Kretch, Kasey C. Soska, Jason S. Babcock, Karen E. Adolph (awarded best paper)

User-calibration-free remote gaze estimation system
Dmitri Model, Moshe Eizenman

SESSION: Short papers 1 -- Eye tracking applications and data analysis

Eye movement as an interaction mechanism for relevance feedback in a content-based image retrieval system
Yun Zhang, Hong Fu, Zhen Liang, Zheru Chi, Dagan Feng

Content-based image retrieval using a combination of visual features and eye tracking data
Zhen Liang, Hong Fu, Yun Zhang, Zheru Chi, Dagan Feng

Gaze scribing in physics problem solving
David Rosengrant

Have you seen any of these men?: looking at whether eyewitnesses use scanpaths to recognize suspects in photo lineups
Sheree Josephson, Michael E. Holmes

Estimation of viewer's response for contextual understanding of tasks using features of eye-movements
Minoru Nakayama, Yuko Hayashi

Biometric identification via an oculomotor plant mathematical model
Oleg V. Komogortsev, Sampath Jayarathna, Cecilia R. Aragon, Mechehoul Mahmoud

POSTER SESSION: Short papers 2 -- Poster presentations

Saliency-based decision support
Roxanne L. Canosa

Qualitative and quantitative scoring and evaluation of the eye movement classification algorithms
Oleg V. Komogortsev, Sampath Jayarathna, Do Hyong Koh, Sandeep Munikrishne Gowda

An interactive interface for remote administration of clinical tests based on eye tracking
A. Faro, D. Giordano, C. Spampinato, D. De Tommaso, S. Ullo

Visual attention for implicit relevance feedback in a content based image retrieval
A. Faro, D. Giordano, C. Pino, C. Spampinato

Evaluation of a low-cost open-source gaze tracker
Javier San Agustin, Henrik Skovsgaard, Emilie Mollenbach, Maria Barret, Martin Tall, Dan Witzner Hansen, John Paulin Hansen

An open source eye-gaze interface: expanding the adoption of eye-gaze in everyday applications
Craig Hennessey, Andrew T. Duchowski

Using eye tracking to investigate important cues for representative creature motion
Meredith McLendon, Ann McNamara, Tim McLaughlin, Ravindra Dwivedi

Eye and pointer coordination in search and selection tasks
Hans-Joachim Bieg, Lewis L. Chuang, Roland W. Fleming, Harald Reiterer, Heinrich H. Bülthoff

Pies with EYEs: the limits of hierarchical pie menus in gaze control
Mario H. Urbina, Maike Lorenz, Anke Huckauf

Measuring vergence over stereoscopic video with a remote eye tracker
Brian C. Daugherty, Andrew T. Duchowski, Donald H. House, Celambarasan Ramasamy

Group-wise similarity and classification of aggregate scanpaths
Thomas Grindinger, Andrew T. Duchowski, Michael Sawyer

Inferring object relevance from gaze in dynamic scenes
Melih Kandemir, Veli-Matti Saarinen, Samuel Kaski

Advanced gaze visualizations for three-dimensional virtual environments
Sophie Stellmach, Lennart Nacke, Raimund Dachselt

The use of eye tracking for PC energy management
Vasily G. Moshnyaga

Low-latency combined eye and head tracking system for teleoperating a robotic head in real-time
Stefan Kohlbecher, Klaus Bartl, Stanislavs Bardins, Erich Schneider

Visual search in the (un)real world: how head-mounted displays affect eye movements, head movements and target detection
Tobit Kollenberg, Alexander Neumann, Dorothe Schneider, Tessa-Karina Tews, Thomas Hermann, Helge Ritter, Angelika Dierker, Hendrik Koesling

Visual span and other parameters for the generation of heatmaps
Pieter Blignaut

Robust optical eye detection during head movement
Jeffrey B. Mulligan, Kevin N. Gabayan

What you see is where you go: testing a gaze-driven power wheelchair for individuals with severe multiple disabilities
Erik Wästlund, Kay Sponseller, Ola Pettersson

A depth compensation method for cross-ratio based eye tracking
Flavio L. Coutinho, Carlos H. Morimoto

Estimating cognitive load using remote eye tracking in a driving simulator
Oskar Palinko, Andrew L. Kun, Alexander Shyrokov, Peter Heeman

Small-target selection with gaze alone
Henrik Skovsgaard, Julio C. Mateo, John M. Flach, John Paulin Hansen

Measuring situation awareness of surgeons in laparoscopic training
Geoffrey Tien, M. Stella Atkins, Bin Zheng, Colin Swindells

Quantification of aesthetic viewing using eye-tracking technology: the influence of previous training in apparel design
Juyeon Park, Emily Woods, Marilyn DeLong

Estimating 3D point-of-regard and visualizing gaze trajectories under natural head movements
Kentaro Takemura, Yuji Kohashi, Tsuyoshi Suenaga, Jun Takamatsu, Tsukasa Ogasawara

Natural scene statistics at stereo fixations
Yang Liu, Lawrence K. Cormack, Alan C. Bovik

Development of eye-tracking pen display based on stereo bright pupil technique
Michiya Yamamoto, Takashi Nagamatsu, Tomio Watanabe

Pupil center detection in low resolution images
Detlev Droege, Dietrich Paulus

Using vision and voice to create a multimodal interface for Microsoft Word 2007
T. R. Beelders, P. J. Blignaut

Single gaze gestures
Emilie Møllenbach, Martin Lillholm, Alastair Gail, John Paulin Hansen

Learning relevant eye movement feature spaces across users
Zakria Hussain, Kitsuchart Pasupa, John Shawe-Taylor

Towards task-independent person authentication using eye movement signals
Tomi Kinnunen, Filip Sedlak, Roman Bednarik

Gaze-based web search: the impact of interface design on search result selection
Yvonne Kammerer, Wolfgang Beinhauer

Eye tracking with the adaptive optics scanning laser ophthalmoscope
Scott B. Stevenson, Austin Roorda, Girish Kumar

Listing's and Donders' laws and the estimation of the point-of-gaze
Elias D. Guestrin, Moshe Eizenman

SESSION: Long papers 2 -- Scanpath representation and comparison methods

Visual scanpath representation
Joseph H. Goldberg, Jonathan I. Helfman

A vector-based, multidimensional scanpath similarity measure
Halszka Jarodzka, Kenneth Holmqvist, Marcus Nyström

Scanpath comparison revisited
Andrew T. Duchowski, Jason Driver, Sheriff Jolaoso, William Tan, Beverly N. Ramey, Ami Robbins

SESSION: Long papers 3 -- Analysis and interpretation of eye movements

Scanpath clustering and aggregation
Joseph H. Goldberg, Jonathan I. Helfman

Match-moving for area-based analysis of eye movements in natural tasks
Wayne J. Ryan, Andrew T. Duchowski, Ellen A. Vincent, Dina Battisto

Interpretation of geometric shapes: an eye movement study
Miquel Prats, Steve Garner, Iestyn Jowers, Alison McKay, Nieves Pedreira

SESSION: Short papers 3 -- Advances in eye tracking technology

User-calibration-free gaze tracking with estimation of the horizontal angles between the visual and the optical axes of both eyes
Takashi Nagamatsu, Ryuichi Sugano, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka

Gaze estimation method based on an aspherical model of the cornea: surface of revolution about the optical axis of the eye
Takashi Nagamatsu, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka, Michiya Yamamoto

The pupillometric precision of a remote video eye tracker
Jeff Klingner

Contingency evaluation of gaze-contingent displays for real-time visual field simulations
Margarita Vinnikov, Robert S. Allison

SemantiCode: using content similarity and database-driven matching to code wearable eyetracker gaze data
Daniel F. Pontillo, Thomas B. Kinsman, Jeff B. Pelz

Context switching for fast key selection in text entry applications
Carlos H. Morimoto, Arnon Amir

SESSION: Long papers 4 -- Analysis and understanding of visual tasks

Fixation-aligned pupillary response averaging
Jeff Klingner

Understanding the benefits of gaze enhanced visual search
Pernilla Qvarfordt, Jacob T. Biehl, Gene Golovchinsky, Tony Dunningan

Image ranking with implicit feedback from eye movements
David R. Hardoon, Kitsuchart Pasupa

SESSION: Long papers 5 -- Gaze interfaces and interactions

How the interface design influences users' spontaneous trustworthiness evaluations of web search results: comparing a list and a grid interface
Yvonne Kammerer, Peter Gerjets

Space-variant spatio-temporal filtering of video for gaze visualization and perceptual learning
Michael Dorr, Halszka Jarodzka, Erhardt Barth

Alternatives to single character entry and dwell time selection on eye typing
Mario H. Urbina, Anke Huckauf

SESSION: Long papers 6 -- Eye tracking and accessibility

Designing gaze gestures for gaming: an investigation of performance
Howell Istance, Aulikki Hyrskykari, Lauri Immonen, Santtu Mansikkamaa, Stephen Vickers

ceCursor, a contextual eye cursor for general pointing in windows environments
Marco Porta, Alice Ravarelli, Giovanni Spagnoli

BlinkWrite2: an improved text entry method using eye blinks
Behrooz Ashtiani, I. Scott MacKenzie