Friday, March 26, 2010

ETRA 2010 Proceedings now online

The proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (ETRA 2010), held in Austin, Texas, March 22-24, 2010, are now online. Some kind soul (MrGaze?) decided to do the world a favor by uploading and keyword-tagging the papers on Slideshare, which is indexed by Google and other search engines. The wealth of information ensures days of interesting reading; several of the short papers and posters would have been interesting to hear a talk on, but as always, time is short.

Paper acceptance rate: 18 of 58 submissions (31%)

Conference chairs
Carlos Hitoshi Morimoto, University of São Paulo, Brazil
Howell Istance, De Montfort University, UK

Program chairs
Aulikki Hyrskykari, University of Tampere, Finland
Qiang Ji, Rensselaer Polytechnic Institute


Table of Contents

Front matter (cover, title page, table of contents, preface)

Back matter (committees and reviewers, industrial supporters, cover image credits, author index)

SESSION: Keynote address

An eye on input: research challenges in using the eye for computer input control
I. Scott MacKenzie. PDF (1.52 MB).

SESSION: Long papers 1 -- Advances in eye tracking technology

Homography normalization for robust gaze estimation in uncalibrated setups
Dan Witzner Hansen, Javier San Agustin, Arantxa Villanueva. PDF (942 KB).

Head-mounted eye-tracking of infants' natural interactions: a new method
John M. Franchak, Kari S. Kretch, Kasey C. Soska, Jason S. Babcock, Karen E. Adolph (awarded best paper). PDF (3.68 MB).

User-calibration-free remote gaze estimation system
Dmitri Model, Moshe Eizenman. PDF (452 KB).

SESSION: Short papers 1 -- Eye tracking applications and data analysis

Eye movement as an interaction mechanism for relevance feedback in a content-based image retrieval system
Yun Zhang, Hong Fu, Zhen Liang, Zheru Chi, Dagan Feng. PDF (1.20 MB).

Content-based image retrieval using a combination of visual features and eye tracking data
Zhen Liang, Hong Fu, Yun Zhang, Zheru Chi, Dagan Feng. PDF (877 KB).

Gaze scribing in physics problem solving
David Rosengrant. PDF (268 KB).

Have you seen any of these men?: looking at whether eyewitnesses use scanpaths to recognize suspects in photo lineups
Sheree Josephson, Michael E. Holmes. PDF (660 KB).

Estimation of viewer's response for contextual understanding of tasks using features of eye-movements
Minoru Nakayama, Yuko Hayashi. PDF (152 KB).

Biometric identification via an oculomotor plant mathematical model
Oleg V. Komogortsev, Sampath Jayarathna, Cecilia R. Aragon, Mechehoul Mahmoud. PDF (301 KB).

POSTER SESSION: Short papers 2 -- Poster presentations

Saliency-based decision support
Roxanne L. Canosa. PDF (177 KB).

Qualitative and quantitative scoring and evaluation of the eye movement classification algorithms
Oleg V. Komogortsev, Sampath Jayarathna, Do Hyong Koh, Sandeep Munikrishne Gowda. PDF (383 KB).

An interactive interface for remote administration of clinical tests based on eye tracking
A. Faro, D. Giordano, C. Spampinato, D. De Tommaso, S. Ullo. PDF (769 KB).

Visual attention for implicit relevance feedback in a content based image retrieval
A. Faro, D. Giordano, C. Pino, C. Spampinato. PDF (4.98 MB).

Evaluation of a low-cost open-source gaze tracker
Javier San Agustin, Henrik Skovsgaard, Emilie Møllenbach, Maria Barrett, Martin Tall, Dan Witzner Hansen, John Paulin Hansen. PDF (287 KB).

An open source eye-gaze interface: expanding the adoption of eye-gaze in everyday applications
Craig Hennessey, Andrew T. Duchowski. PDF (390 KB).

Using eye tracking to investigate important cues for representative creature motion
Meredith McLendon, Ann McNamara, Tim McLaughlin, Ravindra Dwivedi. PDF (661 KB).

Eye and pointer coordination in search and selection tasks
Hans-Joachim Bieg, Lewis L. Chuang, Roland W. Fleming, Harald Reiterer, Heinrich H. Bülthoff. PDF (934 KB).

Pies with EYEs: the limits of hierarchical pie menus in gaze control
Mario H. Urbina, Maike Lorenz, Anke Huckauf. PDF (957 KB).

Measuring vergence over stereoscopic video with a remote eye tracker
Brian C. Daugherty, Andrew T. Duchowski, Donald H. House, Celambarasan Ramasamy. PDF (1.78 MB).

Group-wise similarity and classification of aggregate scanpaths
Thomas Grindinger, Andrew T. Duchowski, Michael Sawyer. PDF (2.82 MB).

Inferring object relevance from gaze in dynamic scenes
Melih Kandemir, Veli-Matti Saarinen, Samuel Kaski. PDF (240 KB).

Advanced gaze visualizations for three-dimensional virtual environments
Sophie Stellmach, Lennart Nacke, Raimund Dachselt. PDF (7.33 MB).

The use of eye tracking for PC energy management
Vasily G. Moshnyaga. PDF (413 KB).

Low-latency combined eye and head tracking system for teleoperating a robotic head in real-time
Stefan Kohlbecher, Klaus Bartl, Stanislavs Bardins, Erich Schneider. PDF (326 KB).

Visual search in the (un)real world: how head-mounted displays affect eye movements, head movements and target detection
Tobit Kollenberg, Alexander Neumann, Dorothe Schneider, Tessa-Karina Tews, Thomas Hermann, Helge Ritter, Angelika Dierker, Hendrik Koesling. PDF (479 KB).

Visual span and other parameters for the generation of heatmaps
Pieter Blignaut. PDF (1.01 MB).

Robust optical eye detection during head movement
Jeffrey B. Mulligan, Kevin N. Gabayan. PDF (325 KB).

What you see is where you go: testing a gaze-driven power wheelchair for individuals with severe multiple disabilities
Erik Wästlund, Kay Sponseller, Ola Pettersson. PDF (611 KB).

A depth compensation method for cross-ratio based eye tracking
Flavio L. Coutinho, Carlos H. Morimoto. PDF (311 KB).

Estimating cognitive load using remote eye tracking in a driving simulator
Oskar Palinko, Andrew L. Kun, Alexander Shyrokov, Peter Heeman. PDF (402 KB).

Small-target selection with gaze alone
Henrik Skovsgaard, Julio C. Mateo, John M. Flach, John Paulin Hansen. PDF (274 KB).

Measuring situation awareness of surgeons in laparoscopic training
Geoffrey Tien, M. Stella Atkins, Bin Zheng, Colin Swindells. PDF (810 KB).

Quantification of aesthetic viewing using eye-tracking technology: the influence of previous training in apparel design
Juyeon Park, Emily Woods, Marilyn DeLong. PDF (3.98 MB).

Estimating 3D point-of-regard and visualizing gaze trajectories under natural head movements
Kentaro Takemura, Yuji Kohashi, Tsuyoshi Suenaga, Jun Takamatsu, Tsukasa Ogasawara. PDF (443 KB).

Natural scene statistics at stereo fixations
Yang Liu, Lawrence K. Cormack, Alan C. Bovik. PDF (250 KB).

Development of eye-tracking pen display based on stereo bright pupil technique
Michiya Yamamoto, Takashi Nagamatsu, Tomio Watanabe. PDF (988 KB).

Pupil center detection in low resolution images
Detlev Droege, Dietrich Paulus. PDF (499 KB).

Using vision and voice to create a multimodal interface for Microsoft Word 2007
T. R. Beelders, P. J. Blignaut. PDF (237 KB).

Single gaze gestures
Emilie Møllenbach, Martin Lillholm, Alastair Gale, John Paulin Hansen. PDF (279 KB).

Learning relevant eye movement feature spaces across users
Zakria Hussain, Kitsuchart Pasupa, John Shawe-Taylor. PDF (788 KB).

Towards task-independent person authentication using eye movement signals
Tomi Kinnunen, Filip Sedlak, Roman Bednarik. PDF (373 KB).

Gaze-based web search: the impact of interface design on search result selection
Yvonne Kammerer, Wolfgang Beinhauer. PDF (347 KB).

Eye tracking with the adaptive optics scanning laser ophthalmoscope
Scott B. Stevenson, Austin Roorda, Girish Kumar. PDF (1.35 MB).

Listing's and Donders' laws and the estimation of the point-of-gaze
Elias D. Guestrin, Moshe Eizenman. PDF (304 KB).

SESSION: Long papers 2 -- Scanpath representation and comparison methods

Visual scanpath representation
Joseph H. Goldberg, Jonathan I. Helfman. PDF (1.68 MB).

A vector-based, multidimensional scanpath similarity measure
Halszka Jarodzka, Kenneth Holmqvist, Marcus Nyström. PDF (425 KB).

Scanpath comparison revisited
Andrew T. Duchowski, Jason Driver, Sheriff Jolaoso, William Tan, Beverly N. Ramey, Ami Robbins. PDF (1.34 MB).

SESSION: Long papers 3 -- Analysis and interpretation of eye movements

Scanpath clustering and aggregation
Joseph H. Goldberg, Jonathan I. Helfman. PDF (636 KB).

Match-moving for area-based analysis of eye movements in natural tasks
Wayne J. Ryan, Andrew T. Duchowski, Ellen A. Vincent, Dina Battisto. PDF (10.41 MB).

Interpretation of geometric shapes: an eye movement study
Miquel Prats, Steve Garner, Iestyn Jowers, Alison McKay, Nieves Pedreira. PDF (1.73 MB).

SESSION: Short papers 3 -- Advances in eye tracking technology

User-calibration-free gaze tracking with estimation of the horizontal angles between the visual and the optical axes of both eyes
Takashi Nagamatsu, Ryuichi Sugano, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka. PDF (573 KB).

Gaze estimation method based on an aspherical model of the cornea: surface of revolution about the optical axis of the eye
Takashi Nagamatsu, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka, Michiya Yamamoto. PDF (269 KB).

The pupillometric precision of a remote video eye tracker
Jeff Klingner. PDF (3.80 MB).

Contingency evaluation of gaze-contingent displays for real-time visual field simulations
Margarita Vinnikov, Robert S. Allison. PDF (226 KB).

SemantiCode: using content similarity and database-driven matching to code wearable eyetracker gaze data
Daniel F. Pontillo, Thomas B. Kinsman, Jeff B. Pelz. PDF (2.34 MB).

Context switching for fast key selection in text entry applications
Carlos H. Morimoto, Arnon Amir. PDF (1.24 MB).

SESSION: Long papers 4 -- Analysis and understanding of visual tasks

Fixation-aligned pupillary response averaging
Jeff Klingner. PDF (935 KB).

Understanding the benefits of gaze enhanced visual search
Pernilla Qvarfordt, Jacob T. Biehl, Gene Golovchinsky, Tony Dunnigan. PDF (694 KB).

Image ranking with implicit feedback from eye movements
David R. Hardoon, Kitsuchart Pasupa. PDF (409 KB).

SESSION: Long papers 5 -- Gaze interfaces and interactions

How the interface design influences users' spontaneous trustworthiness evaluations of web search results: comparing a list and a grid interface
Yvonne Kammerer, Peter Gerjets. PDF (349 KB).

Space-variant spatio-temporal filtering of video for gaze visualization and perceptual learning
Michael Dorr, Halszka Jarodzka, Erhardt Barth. PDF (188 KB).

Alternatives to single character entry and dwell time selection on eye typing
Mario H. Urbina, Anke Huckauf. PDF (802 KB).

SESSION: Long papers 6 -- Eye tracking and accessibility

Designing gaze gestures for gaming: an investigation of performance
Howell Istance, Aulikki Hyrskykari, Lauri Immonen, Santtu Mansikkamaa, Stephen Vickers. PDF (760 KB).

ceCursor, a contextual eye cursor for general pointing in windows environments
Marco Porta, Alice Ravarelli, Giovanni Spagnoli. PDF (884 KB).

BlinkWrite2: an improved text entry method using eye blinks
Behrooz Ashtiani, I. Scott MacKenzie. PDF (1.50 MB).

Friday, March 19, 2010

In the Eye of the Beholder: A Survey of Models for Eyes and Gaze (Hansen & Ji, 2010)

The following paper by Dan Witzner Hansen of the IT University of Copenhagen and Qiang Ji of Rensselaer Polytechnic Institute surveys and summarizes most existing methods for eye tracking, explaining how they operate and what the pros and cons of each method are. It is one of the most comprehensive publications I've seen on the topic and a delight to read. It was featured in the March issue of IEEE Transactions on Pattern Analysis and Machine Intelligence.

Abstract
"Despite active research and significant progress in the last 30 years, eye detection and tracking remains challenging due to the individuality of eyes, occlusion, variability in scale, location, and light conditions. Data on eye location and details of eye movements have numerous applications and are essential in face detection, biometric identification, and particular human-computer interaction tasks. This paper reviews current progress and state of the art in video-based eye detection and tracking in order to identify promising techniques as well as issues to be further addressed. We present a detailed review of recent eye models and techniques for eye detection and tracking. We also survey methods for gaze estimation and compare them based on their geometric properties and reported accuracies. This review shows that, despite their apparent simplicity, the development of a general eye detection technique involves addressing many challenges, requires further theoretical developments, and is consequently of interest to many other domains in computer vision and beyond."

Comparison of gaze estimation methods with respective prerequisites and reported accuracies


Eye Detection models

  • Dan Witzner Hansen, Qiang Ji, "In the Eye of the Beholder: A Survey of Models for Eyes and Gaze," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 3, pp. 478-500, March 2010, doi:10.1109/TPAMI.2009.30. Download as PDF.
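A staple among the regression-based gaze estimation methods the survey compares is pupil-center corneal-reflection (PCCR) mapping: during calibration, a low-order polynomial is fitted from pupil-glint difference vectors to screen coordinates. As a rough illustration only (the function names and the quadratic basis below are my own choices, not code or notation from the paper), such a mapping can be fitted with ordinary least squares:

```python
import numpy as np

def fit_gaze_mapping(pg_vectors, screen_points):
    """Fit a second-order polynomial mapping from pupil-glint
    difference vectors (x, y) to screen coordinates (sx, sy)
    via least squares over the calibration samples."""
    pg = np.asarray(pg_vectors, dtype=float)
    x, y = pg[:, 0], pg[:, 1]
    # Design matrix with quadratic terms: 1, x, y, xy, x^2, y^2
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float),
                                 rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def estimate_gaze(coeffs, pg_vector):
    """Map a single pupil-glint vector to an estimated screen point."""
    x, y = pg_vector
    phi = np.array([1.0, x, y, x * y, x**2, y**2])
    return phi @ coeffs
```

With a 3x3 grid of calibration targets (nine samples for six basis functions), the fit is overdetermined and noise in individual samples is averaged out, which is why most commercial trackers use at least a 9-point calibration.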

Thursday, March 18, 2010

GM Automotive heads-up display

General Motors today presented a new automotive heads-up display system, developed in conjunction with Carnegie Mellon University and the University of Southern California. It employs a number of sensors that, coupled with object and pattern recognition, could assist the driver by projecting information directly onto the windshield. For example, the system could aid navigation by highlighting road signs and emphasizing the lanes and edges of the road in difficult driving conditions (rain, snow, fog). Inside the car, the system uses an eye tracking solution provided by the Swedish firm Smart Eye. Their Smart Eye Pro 5.4 employs several cameras (three in the demonstration, up to six) and infrared illumination to provide six-degrees-of-freedom head tracking and 2D eye tracking, both with a (reported) 0.5 degree accuracy. The firm claims the system provides "immunity to difficult light conditions, including darkness and rapidly varying sunlight", although to what extent this holds for direct sunlight on the face remains to be seen. Technical issues tend to be overcome in time, and it is exciting to see eye tracking being considered for everyday applications in a not-so-distant future. They are not the only ones working on this right now.




Sources:
GM Media "GM Reimagines Head-Up Display Technology"
Engadget.com "GM shows off sensor-laden windshield, new heads-up display prototype"
TG Daily "GM develops HUD system for vehicle windshields"

Tuesday, January 26, 2010

ETRA 2k10 Program Announced!

The awaited program for this year's Eye Tracking Research and Applications (ETRA) symposium, held in Austin, Texas, March 22nd-24th, has been announced. The biennial get-together for leading research on eye movements, targeting computer scientists, engineers and behavioral researchers, is organized in conjunction with the European Communication By Gaze Interaction (COGAIN) association, which brings a certain focus on gaze-based interaction for individuals with physical motor disabilities. This year's keynote will be given by Scott MacKenzie, Associate Professor of Computer Science and Engineering at York University, Canada.

The long papers section contains 18 entries reflecting the various areas of eye gaze research: eye tracking technology, data analysis, visualization, cognitive studies, and interaction & control. In addition, a long list of short papers and a full poster session will ensure a worthwhile event for anyone interested in eye movement research.

Update: Official detailed program now available.
Update: The papers are now available online.

Looking forward to meeting you there!

Eye tracking & technical achievements

Full papers:
  • Homography Normalization for Robust Gaze Estimation in Uncalibrated Setups
    Dan Witzner Hansen, Javier San Agustin, and Arantxa Villanueva. Full paper.

  • Head-Mounted Eye-Tracking of Infants’ Natural Interactions: A New Method
    John Franchak, Kari Kretch, Kasey Soska, Jason Babcock, and Karen Adolph. Full paper.

  • User-Calibration-Free Remote Gaze Estimation System
    Dmitri Model and Moshe Eizenman. Full paper.
Short papers:
  • The Pupillometric Precision of a Remote Video Eye Tracker
    Jeff Klingner. Short paper.

  • Biometric Identification via an Oculomotor Plant Mathematical Model
    Oleg Komogortsev, Sampath Jayarathna, Cecilia Aragon, and Mechehoul Mahmoud. Short paper.

  • SemantiCode: Using Content Similarity and Database-driven Matching to Code Wearable Eyetracker Gaze Data
    Daniel Pontillo, Thomas Kinsman, and Jeff Pelz. Short paper.

  • Gaze Estimation Method based on an Aspherical Model of the Cornea: Surface of Revolution about the Optical Axis of the Eye
    Takashi Nagamatsu, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka, and Michiya Yamamoto. Short paper.

  • User-calibration-free Gaze Tracking with Estimation of the Horizontal Angles between the Visual and the Optical Axes of Both Eyes
    Takashi Nagamatsu, Ryuichi Sugano, Yukina Iwamoto, Junzo Kamahara, and Naoki Tanaka. Short paper.

Posters:

  • Evaluation of a Low-Cost Open-Source Gaze Tracker
    John Hansen, Dan Witzner Hansen, Emilie Møllenbach, Martin Tall, Javier San Agustin, Maria Barrett, and Henrik Skovsgaard. Poster.

  • Measuring Vergence Over Stereoscopic Video with a Remote Eye Tracker
    Brian Daugherty, Andrew Duchowski, Donald House, and Celambarasan Ramasamy. Poster.

  • Learning Relevant Eye Movement Feature Spaces Across Users
    Zakria Hussain, Kitsuchart Pasupa, and John Shawe-Taylor. Poster.

  • Interactive Interface for Remote Administration of Clinical Tests Based on Eye Tracking
    Alberto Faro, Daniela Giordano, Concetto Spampinato, Davide De Tommaso, and Simona Ullo. Poster.

  • Robust Optical Eye Detection During Head Movement
    Jeffrey Mulligan and Kevin Gabayan. Poster.

  • Estimating 3D Point-of-regard and Visualizing Gaze Trajectories under Natural Head Movements
    Kentaro Takemura, Yuji Kohashi, Tsuyoshi Suenaga, Jun Takamatsu, and Tsukasa Ogasawara. Poster.

  • Eye Tracking with the Adaptive Optics Scanning Laser Ophthalmoscope
    Scott Stevenson, Austin Roorda, and Girish Kumar. Poster.

  • A Depth Compensation Method for Cross-Ratio Based Eye Tracking
    Flavio L. Coutinho and Carlos H. Morimoto. Poster.

  • Pupil Center Detection in Low Resolution Images
    Detlev Droege and Dietrich Paulus. Poster.

  • Development of Eye-Tracking Pen Display Based on Stereo Bright Pupil Technique
    Michiya Yamamoto, Takashi Nagamatsu, and Tomio Watanabe. Poster.

  • The Use of Eye Tracking for PC Energy Management
    Vasily Moshnyaga

  • Listing's and Donders' Laws and the Estimation of the Point-of-Gaze
    Elias Guestrin and Moshe Eizenman

Data processing & eye movement detection

Full papers:
  • A Vector-Based, Multi-Dimensional Scanpath Similarity Measure
    Halszka Jarodzka, Kenneth Holmqvist, and Marcus Nyström. Full paper.

  • Match-Moving for Area-Based Analysis of Eye Movements in Natural Tasks
    Andrew Duchowski, Wayne Ryan, Ellen Vincent, and Dina Battisto. Full paper.

  • Fixation-Aligned Pupillary Response Averaging
    Jeff Klingner. Full paper.

Posters:

  • Qualitative and Quantitative Scoring and Evaluation of the Eye Movement Classification Algorithms
    Oleg Komogortsev, Sampath Jayarathna, Do Hyong Koh, and Sandeep Munikrishne Gowda. Poster.

  • Group-Wise Similarity and Classification of Aggregate Scanpaths
    Thomas Grindinger, Andrew Duchowski, and Michael Sawyer. Poster.

Visualization

Full papers:
  • Visual Scanpath Representation
    Joseph Goldberg and Jonathan Helfman. Full paper.

  • Scanpath Comparison Revisited
    Andrew Duchowski, Jason Driver, Sheriff Jolaoso, Beverly Ramey, Ami Robbins, and William Tan. Full paper.

  • Scanpath Clustering and Aggregation
    Joseph Goldberg and Jonathan Helfman. Full paper.

  • Space-Variant Spatio-Temporal Filtering of Video for Gaze Visualization and Perceptual Learning
    Michael Dorr, Halszka Jarodzka, and Erhardt Barth. Full paper.

Posters:

  • Advanced Gaze Visualizations for Three-dimensional Virtual Environments
    Sophie Stellmach, Lennart Nacke, and Raimund Dachselt. Poster.

  • Visual Span and Other Parameters for the Generation of Heatmaps
    Pieter Blignaut. Poster.

Cognitive studies & HCI

Full papers:
  • Interpretation of Geometric Shapes - An Eye Movement Study
    Miquel Prats, Iestyn Jowers, Nieves Pedreira, Steve Garner, and Alison McKay. Full paper.

  • Understanding the Benefits of Gaze Enhanced Visual Search
    Pernilla Qvarfordt, Jacob Biehl, Gene Golovchinsky, and Tony Dunnigan. Full paper.

  • Image Ranking with Implicit Feedback from Eye Movements
    David Hardoon and Kitsuchart Pasupa. Full paper.

  • How the Interface Design Influences Users’ Spontaneous Trustworthiness Evaluations of Web Search Results: Comparing a List and a Grid Interface
    Yvonne Kammerer and Peter Gerjets. Full paper.

Short papers:

  • Have You Seen Any of These Men? Looking at Whether Eyewitnesses Use Scanpaths to Recognize Suspects in Photo Lineups
    Sheree Josephson and Michael Holmes. Short paper.

  • Contingency Evaluation of Gaze-Contingent Displays for Real-Time Visual Field Simulations
    Margarita Vinnikov and Robert Allison. Short paper.

  • Estimation of Viewer's Response for Contextual Understanding of Tasks Using Features of Eye-movements
    Minoru Nakayama and Yuko Hayashi. Short paper.
Posters:
  • Gaze-based Web Search: The Impact of Interface Design on Search Result Selection
    Yvonne Kammerer and Wolfgang Beinhauer. Poster

  • Visual Search in the (Un)Real World: How Head-Mounted Displays Affect Eye Movements, Head Movements and Target Detection
    Tobit Kollenberg, Alexander Neumann, Dorothe Schneider, Tessa-Karina Tews, Thomas Hermann, Helge Ritter, Angelika Dierker, and Hendrik Koesling. Poster.

  • Quantification of Aesthetic Viewing Using Eye-Tracking Technology: The Influence of Previous Training in Apparel Design
    Juyeon Park, Marilyn DeLong, and Emily Woods. Poster.

  • Visual Attention for Implicit Relevance Feedback in a Content Based Image Retrieval
    Concetto Spampinato, Alberto Faro, Daniela Giordano, and Carmelo Pino. Poster.

  • Eye and Pointer Coordination in Search and Selection Tasks
    Hans-Joachim Bieg, Lewis Chuang, Roland Fleming, Harald Reiterer, and Heinrich Bülthoff. Poster.

  • Natural Scene Statistics at Stereo Fixations
    Yang Liu, Lawrence Cormack, and Alan Bovik. Poster.

  • Measuring Situation Awareness of Surgeons in Laparoscopic Training
    Geoffrey Tien, Bin Zheng, Stella Atkins, and Colin Swindells

  • Saliency-Based Decision Support
    Roxanne Canosa. Poster.

  • Inferring Object Relevance from Gaze in Dynamic Scenes
    Melih Kandemir, Veli-Matti Saarinen, and Samuel Kaski. Poster.

  • Using Eye Tracking to Investigate Important Cues for Representative Creature Motion
    Meredith McLendon, Ann McNamara, Tim McLaughlin, and Ravindra Dwivedi

  • Estimating Cognitive Load Using Remote Eye Tracking in a Driving Simulator
    Oskar Palinko, Andrew Kun, Alexander Shyrokov, and Peter Heeman

Computer and machine control

Full papers:
  • Alternatives to Single Character Entry and Dwell Time Selection on Eye Typing
    Mario Urbina and Anke Huckauf. Full paper.

  • Designing Gaze Gestures for Gaming: an Investigation of Performance
    Howell Istance, Aulikki Hyrskykari, Lauri Immonen, Santtu Mansikkamaa, and Stephen Vickers. Full paper.

  • ceCursor, a Contextual Eye Cursor for General Pointing in Windows Environments
    Marco Porta, Alice Ravarelli, and Giovanni Spagnoli. Full paper.

  • BlinkWrite2: An Improved Text Entry Method Using Eye Blinks
    Behrooz Ashtiani and Scott MacKenzie. Full paper.

Short papers:

  • Eye Movement as an Interaction Mechanism for Relevance Feedback in a Content-Based Image Retrieval System
    Yun Zhang, Hong Fu, Zhen Liang, Zheru Chi, and Dagan Feng. Short paper.

  • Gaze Scribing in Physics Problem Solving
    David Rosengrant. Short paper.

  • Content-based Image Retrieval Using a Combination of Visual Features and Eye Tracking Data
    Zhen Liang, Hong Fu, Yun Zhang, Zheru Chi, and Dagan Feng. Short paper.

  • Context Switching for Fast Key Selection in Text Entry Applications
    Carlos H. Morimoto and Arnon Amir. Short paper.

Posters:

  • Small-Target Selection with Gaze Alone
    Henrik Skovsgaard, Julio Mateo, John Flach, and John Paulin Hansen. Poster
  • What You See is Where You Go: Testing a Gaze-Driven Power Wheelchair for Individuals with Severe Multiple Disabilities
    Erik Wästlund, Kay Sponseller, and Ola Pettersson. Poster.

  • Single Gaze Gestures
    Emilie Møllenbach, Alastair Gale, Martin Lillholm, and John Paulin Hansen. Poster.

  • Using Vision and Voice to Create a Multimodal Interface for Microsoft Word 2007
    Tanya Beelders and Pieter Blignaut. Poster.

  • Towards Task-Independent Person Authentication Using Eye Movement Signals
    Tomi Kinnunen, Filip Sedlak, and Roman Bednarik

  • An Open Source Eye-gaze Interface: Expanding the Adoption of Eye-Gaze in Everyday Applications
    Craig Hennessey and Andrew Duchowski. Poster.

  • Pies with EYEs: The Limits of Hierarchical Pie Menus in Gaze Control
    Mario Urbina, Maike Lorenz, and Anke Huckauf. Poster.
  • Low-Latency Combined Eye and Head Tracking System for Teleoperating a Robotic Head in Real-Time
    Stefan Kohlbecher, Klaus Bartl, Stanislavs Bardins, and Erich Schneider. Poster.

Friday, January 8, 2010

Mobile Dias Eye Tracker

Remember the Dias eye tracker that I wrote about last May? Today Diako Mardanbeigi, from Tehran, Iran, presents a new version of the Dias eye tracker that is low-cost, wireless and fully mobile. I'll let the video demonstration below speak for itself. Rumor has it that Diako has been in contact with the ITU GazeGroup about a potential continuation of his research. Time will tell.



"This is a low cost mobile eye tracker with a wireless and Light weight head mounted hardware. This system gathers eye movements and estimates the point of gaze during the performance of daily tasks. It can let you to assess the visual behavior of the person online and in real-time when he is doing a specific task. A mobile eye tracker has a wide variety of applications in several fields such as human factors, market research, consumer shopping behavior, sports, driving, reading, safety & training. "

Friday, December 11, 2009

PhD Defense: Off-the-Shelf Gaze Interaction

Javier San Agustin will defend his PhD thesis, "Off-the-Shelf Gaze Interaction", at the IT University of Copenhagen on the 8th of January, from 13:00 to (at most) 17:00. The program consists of a one-hour presentation, followed by a discussion with the committee, formed by Andrew Duchowski, Bjarne Kjær Ersbøll, and Arne John Glenstrup. Afterwards, a traditional reception with snacks and drinks will be held.

Update: The thesis is now available as PDF, 179 pages, 3.6MB.

Abstract of the thesis:


People with severe motor-skill disabilities are often unable to use standard input devices such as a mouse or a keyboard to control a computer and they are, therefore, in strong need for alternative input devices. Gaze tracking offers them the possibility to use the movements of their eyes to interact with a computer, thereby making them more independent. A big effort has been put toward improving the robustness and accuracy of the technology, and many commercial systems are nowadays available in the market.

Despite the great improvements that gaze tracking systems have undergone in the last years, high prices have prevented gaze interaction from becoming mainstream. The use of specialized hardware, such as industrial cameras or infrared light sources, increases the accuracy of the systems, but also the price, which prevents many potential users from having access to the technology. Furthermore, the different components are often required to be placed in specific locations, or are built into the monitor, thus decreasing the flexibility of the setup.

Gaze tracking systems built from low-cost and off-the-shelf components have the potential to facilitate access to the technology and bring the prices down. Such systems are often more flexible, as the components can be placed in different locations, but also less robust, due to the lack of control over the hardware setup and the lower quality of the components compared to commercial systems.

The work developed for this thesis deals with some of the challenges introduced by the use of low-cost and off-the-shelf components for gaze interaction. The main contributions are:
  • Development and performance evaluation of the ITU Gaze Tracker, an off-the-shelf gaze tracker that uses an inexpensive webcam or video camera to track the user's eye. The software is readily available as open source, offering the possibility to try out gaze interaction for a low price and to analyze, improve and extend the software by modifying the source code.
  • A novel gaze estimation method based on homographic mappings between planes. No knowledge about the hardware configuration is required, allowing for a flexible setup where camera and light sources can be placed at any location.
  • A novel algorithm to detect the type of movement that the eye is performing, i.e. fixation, saccade or smooth pursuit. The algorithm is based on eye velocity and movement pattern, and allows to smooth the signal appropriately for each kind of movement to remove jitter due to noise while maximizing responsiveness.
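To give a feel for the third contribution, here is a minimal sketch of velocity-based classification, in the spirit of the classic velocity-threshold (I-VT) scheme extended with a pursuit band. This is not the thesis algorithm (which also considers movement patterns), and the threshold values below are illustrative defaults, not ones taken from the thesis:

```python
import numpy as np

def classify_samples(gaze, sample_rate_hz,
                     saccade_thresh=100.0, pursuit_thresh=10.0):
    """Label each gaze sample as 'fixation', 'pursuit' or 'saccade' by
    comparing its point-to-point angular velocity (deg/s) against two
    thresholds. `gaze` is a sequence of (x, y) gaze angles in degrees."""
    g = np.asarray(gaze, dtype=float)
    # Point-to-point velocity in degrees per second
    vel = np.linalg.norm(np.diff(g, axis=0), axis=1) * sample_rate_hz
    vel = np.concatenate([[0.0], vel])  # first sample has no predecessor
    labels = np.where(vel >= saccade_thresh, "saccade",
             np.where(vel >= pursuit_thresh, "pursuit", "fixation"))
    return labels.tolist()
```

Knowing the movement type then lets a tracker smooth each segment differently, e.g. heavy averaging within fixations to suppress jitter, but no smoothing across a saccade, which would add latency.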

Tuesday, December 8, 2009

Scandinavian Workshop on Applied Eye-tracking (SWAET) 2010.

The first call for papers for the annual Scandinavian Workshop on Applied Eye-Tracking (SWAET), organized by Kenneth Holmqvist and the team at the Lund University Humanities Laboratory, has just been announced. SWAET 2010 will be held in Lund, Sweden, May 5th-7th. The invited speakers are Gerry Altmann from the Dept. of Psychology at the University of York, UK, and Ignace Hooge from the Dept. of Psychology at Utrecht University, the Netherlands.

Visit the SWAET website for more information.

Update: Download the abstracts (pdf, 1Mb)

Tuesday, November 24, 2009

Remote tracker and 6DOF using a webcam

The following video clips demonstrate a Master's thesis project from the AGH University of Science and Technology in Cracow, Poland. The method provides six-degrees-of-freedom head tracking and 2D eye tracking using a simple, low resolution (640x480) webcam. Under the hood it is based on Lucas-Kanade optical flow and POSIT. A great start, as the head tracking seems relatively stable. Imagine it with IR illumination, a camera with slightly higher resolution and a narrow-angle lens, and, of course, pupil and glint tracking algorithms for calibrated gaze estimation.
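For the curious, the POSIT half of that pipeline is surprisingly compact: given a handful of known 3D model points (e.g. facial landmarks) and their tracked pixel positions, it recovers the full 6DOF pose from a single camera. Below is a numpy sketch of the classic DeMenthon-Davis iteration; this is my own illustration of the algorithm, not the project's code:

```python
import numpy as np

def posit(model_points, image_points, focal, iterations=50):
    """Estimate object pose (R, T) from 2D-3D correspondences with the
    classic POSIT algorithm (DeMenthon & Davis): pose from scaled
    orthographic projection, iteratively corrected for perspective.
    Needs >= 4 non-coplanar model points; the first model point is the
    reference and must be (0, 0, 0) for the translation formula below.
    Image points are in pixels relative to the principal point."""
    M = np.asarray(model_points, dtype=float)
    p = np.asarray(image_points, dtype=float)
    A = M[1:] - M[0]            # model vectors from the reference point
    B = np.linalg.pinv(A)       # "object matrix" (pseudo-inverse)
    eps = np.zeros(len(A))      # perspective correction terms
    for _ in range(iterations):
        w = 1.0 + eps
        x = p[1:, 0] * w - p[0, 0]
        y = p[1:, 1] * w - p[0, 1]
        I, J = B @ x, B @ y     # scaled first two rows of the rotation
        s1, s2 = np.linalg.norm(I), np.linalg.norm(J)
        i, j = I / s1, J / s2
        k = np.cross(i, j)
        k /= np.linalg.norm(k)
        s = (s1 + s2) / 2.0     # scale of the orthographic projection
        Z0 = focal / s          # depth of the reference point
        eps = A @ k / Z0        # refine corrections for the next round
    R = np.vstack([i, j, k])
    T = np.array([p[0, 0] / s, p[0, 1] / s, Z0])
    return R, T
```

Each pass solves a scaled-orthographic pose in closed form and then re-estimates how much each point's depth deviates from the reference depth; for objects small relative to their distance the loop converges in a handful of iterations, which is what makes it attractive for real-time webcam head tracking.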


Monday, November 23, 2009

ITU GazeTracker in the wild

I came across these two YouTube videos from students out there using the ITU GazeTracker in their HCI projects. By now the software has been downloaded 3,000 times and the forum has seen close to three hundred posts. It's been a good start, and better yet, a new version is in the making. It offers a complete network API for third-party applications, improved tracking performance, better camera control and a number of bug fixes (thanks for your feedback). It will be released when it's ready.







Thanks for posting the videos!

Wednesday, October 21, 2009

Nokia near-eye display gaze interaction update

The Nokia near-eye gaze interaction platform that I tried in Finland last year has been further improved. The cap used to support the weight has been replaced with a sturdy frame, and the overall prototype seems lighter and now incorporates headphones. The new gaze-based navigation interface supports photo browsing based on the Image Space application, allowing location-based access to user-generated content. See the concept video at the bottom for their futuristic vision, and the Nokia Research website for more. The prototype will be displayed at the International Symposium on Mixed and Augmented Reality conference in Orlando, October 19-22.