Thursday, March 18, 2010

GM Automotive heads-up display

General Motors today presents a new automotive head-up display (HUD) system developed in conjunction with Carnegie Mellon University and the University of Southern California. It employs a number of sensors that, coupled with object and pattern recognition, can assist the driver by projecting information directly onto the windshield. For example, the system can aid navigation by highlighting road signs and emphasizing the lanes and edges of the road in difficult driving conditions (rain, snow, fog). Inside the car, the system uses an eye tracking solution provided by the Swedish firm Smart Eye. Their Smart Eye Pro 5.4 employs several cameras (three in the demonstration, up to six) and infrared illumination to provide 6-degrees-of-freedom head tracking and 2D eye tracking, both with a (reported) 0.5-degree accuracy. The firm reports that the system provides "immunity to difficult light conditions, including darkness and rapidly varying sunlight", although to what extent this holds for direct sunlight on the face remains to be seen. Such technical issues are likely to be overcome in time, and it's exciting to see eye tracking being considered for everyday applications in the not-so-distant future. They are not the only ones working on this right now.

Sources:
GM Media "GM Reimagines Head-Up Display Technology"
Engadget.com "GM shows off sensor-laden windshield, new heads-up display prototype"
TG Daily "GM develops HUD system for vehicle windshields"

Tuesday, January 26, 2010

ETRA 2k10 Program Announced!

The awaited program for this year's Eye Tracking Research and Applications (ETRA) symposium, held in Austin, Texas, March 22nd-24th, has been announced. The biennial get-together for leading research on eye movements, targeting computer scientists, engineers and behavioral researchers, is organized in conjunction with the European Communication By Gaze Interaction (COGAIN) association, which brings a certain focus on gaze-based interaction for individuals with physical motor control disabilities. This year's keynote will be given by Scott MacKenzie, Associate Professor of Computer Science and Engineering at York University, Canada.

The long papers section contains 18 entries reflecting the various areas of eye gaze research, from eye tracking and data analysis to visualization, cognitive studies, and interaction & control. In addition, a long list of short papers and a full poster section will ensure a worthwhile event for anyone interested in eye movement research.

Update: Official detailed program now available.
Update: The papers are now available online.

Looking forward to meeting you there!

Eye tracking & technical achievements

Full papers:
  • Homography Normalization for Robust Gaze Estimation in Uncalibrated Setups
    Dan Witzner Hansen, Javier San Agustin, and Arantxa Villanueva. Full paper.

  • Head-Mounted Eye-Tracking of Infants’ Natural Interactions: A New Method
    John Franchak, Kari Kretch, Kasey Soska, Jason Babcock, and Karen Adolph. Full paper.

  • User-Calibration-Free Remote Gaze Estimation System
    Dmitri Model and Moshe Eizenman. Full paper.
Short papers:
  • The Pupillometric Precision of a Remote Video Eye Tracker
    Jeff Klingner. Short paper.

  • Biometric Identification via an Oculomotor Plant Mathematical Model
    Oleg Komogortsev, Sampath Jayarathna, Cecilia Aragon, and Mechehoul Mahmoud. Short paper.

  • SemantiCode: Using Content Similarity and Database-driven Matching to Code Wearable Eyetracker Gaze Data
    Daniel Pontillo, Thomas Kinsman, and Jeff Pelz. Short paper.

  • Gaze Estimation Method based on an Aspherical Model of the Cornea: Surface of Revolution about the Optical Axis of the Eye
    Takashi Nagamatsu, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka, and Michiya Yamamoto. Short paper.

  • User-calibration-free Gaze Tracking with Estimation of the Horizontal Angles between the Visual and the Optical Axes of Both Eyes
    Takashi Nagamatsu, Ryuichi Sugano, Yukina Iwamoto, Junzo Kamahara, and Naoki Tanaka. Short paper.

Posters:

  • Evaluation of a Low-Cost Open-Source Gaze Tracker
    John Hansen, Dan Witzner Hansen, Emilie Møllenbach, Martin Tall, Javier San Agustin, Maria Barrett, and Henrik Skovsgaard. Poster.

  • Measuring Vergence Over Stereoscopic Video with a Remote Eye Tracker
    Brian Daugherty, Andrew Duchowski, Donald House, and Celambarasan Ramasamy. Poster.

  • Learning Relevant Eye Movement Feature Spaces Across Users
    Zakria Hussain, Kitsuchart Pasupa, and John Shawe-Taylor. Poster.

  • Interactive Interface for Remote Administration of Clinical Tests Based on Eye Tracking
    Alberto Faro, Daniela Giordano, Concetto Spampinato, Davide De Tommaso, and Simona Ullo. Poster.

  • Robust Optical Eye Detection During Head Movement
    Jeffrey Mulligan and Kevin Gabayan. Poster.

  • Estimating 3D Point-of-regard and Visualizing Gaze Trajectories under Natural Head Movements
    Kentaro Takemura, Yuji Kohashi, Tsuyoshi Suenaga, Jun Takamatsu, and Tsukasa Ogasawara. Poster.

  • Eye Tracking with the Adaptive Optics Scanning Laser Ophthalmoscope
    Scott Stevenson, Austin Roorda, and Girish Kumar. Poster.

  • A Depth Compensation Method for Cross-Ratio Based Eye Tracking
    Flavio L. Coutinho and Carlos H. Morimoto. Poster.

  • Pupil Center Detection in Low Resolution Images
    Detlev Droege and Dietrich Paulus. Poster.

  • Development of Eye-Tracking Pen Display Based on Stereo Bright Pupil Technique
    Michiya Yamamoto, Takashi Nagamatsu, and Tomio Watanabe. Poster.

  • The Use of Eye Tracking for PC Energy Management
    Vasily Moshnyaga

  • Listing's and Donders' Laws and the Estimation of the Point-of-Gaze
    Elias Guestrin and Moshe Eizenman

Data processing & eye movement detection

Full papers:
  • A Vector-Based, Multi-Dimensional Scanpath Similarity Measure
    Halszka Jarodzka, Kenneth Holmqvist, and Marcus Nyström. Full paper.

  • Match-Moving for Area-Based Analysis of Eye Movements in Natural Tasks
    Andrew Duchowski, Wayne Ryan, Ellen Vincent, and Dina Battisto. Full paper.

  • Fixation-Aligned Pupillary Response Averaging
    Jeff Klingner. Full paper.

Posters:

  • Qualitative and Quantitative Scoring and Evaluation of the Eye Movement Classification Algorithms
    Oleg Komogortsev, Sampath Jayarathna, Do Hyong Koh, and Sandeep Munikrishne Gowda. Poster.

  • Group-Wise Similarity and Classification of Aggregate Scanpaths
    Thomas Grindinger, Andrew Duchowski, and Michael Sawyer. Poster.

Visualization

Full papers:
  • Visual Scanpath Representation
    Joseph Goldberg and Jonathan Helfman. Full paper.

  • Scanpath Comparison Revisited
    Andrew Duchowski, Jason Driver, Sheriff Jolaoso, Beverly Ramey, Ami Robbins, and William Tan. Full paper.

  • Scanpath Clustering and Aggregation
    Joseph Goldberg and Jonathan Helfman. Full paper.

  • Space-Variant Spatio-Temporal Filtering of Video for Gaze Visualization and Perceptual Learning
    Michael Dorr, Halszka Jarodzka, and Erhardt Barth. Full paper.

Posters:

  • Adapted Gaze Visualizations for Three-dimensional Virtual Environments
    Sophie Stellmach, Lennart Nacke, and Raimund Dachselt. Poster.

  • Visual Span and Other Parameters for the Generation of Heatmaps
    Pieter Blignaut. Poster.

Cognitive studies & HCI

Full papers:
  • Interpretation of Geometric Shapes - An Eye Movement Study
    Miquel Prats, Iestyn Jowers, Nieves Pedreira, Steve Garner, and Alison McKay. Full paper.

  • Understanding the Benefits of Gaze Enhanced Visual Search
    Pernilla Qvarfordt, Jacob Biehl, Gene Golovchinksy, and Tony Dunnigan. Full paper.

  • Image Ranking with Implicit Feedback from Eye Movements
    David Hardoon and Kitsuchart Pasupa. Full paper.

  • How the Interface Design Influences Users’ Spontaneous Trustworthiness Evaluations of Web Search Results: Comparing a List and a Grid Interface
    Yvonne Kammerer and Peter Gerjets. Full paper.

Short papers:

  • Have You Seen Any of These Men? Looking at Whether Eyewitnesses Use Scanpaths to Recognize Suspects in Photo Lineups
    Sheree Josephson and Michael Holmes. Short paper.

  • Contingency Evaluation of Gaze-Contingent Displays for Real-Time Visual Field Simulations
    Margarita Vinnikov and Robert Allison. Short paper.

  • Estimation of Viewer's Response for Contextual Understanding of Tasks of Using Features of Eye-movements
    Minoru Nakayama and Yuko Hayashi. Short paper.
Posters:
  • Gaze-based Web Search: The Impact of Interface Design on Search Result Selection
    Yvonne Kammerer and Wolfgang Beinhauer. Poster.

  • Visual Search in the (Un)Real World: How Head-Mounted Displays Affect Eye Movements, Head Movements and Target Detection
    Tobit Kollenberg, Alexander Neumann, Dorothe Schneider, Tessa-Karina Tews, Thomas Hermann, Helge Ritter, Angelika Dierker, and Hendrik Koesling. Poster.

  • Quantification of Aesthetic Viewing Using Eye-Tracking Technology: The Influence of Previous Training in Apparel Design
    Juyeon Park, Marilyn DeLong, and Emily Woods. Poster.

  • Visual Attention for Implicit Relevance Feedback in a Content Based Image Retrieval
    Concetto Spampinato, Alberto Faro, Daniela Giordano, and Carmelo Pino. Poster.

  • Eye and Pointer Coordination in Search and Selection Tasks
    Hans-Joachim Bieg, Lewis Chuang, Roland Fleming, Harald Reiterer, and Heinrich Bülthoff. Poster.

  • Natural Scene Statistics at Stereo Fixations
    Yang Liu, Lawrence Cormack, and Alan Bovik. Poster.

  • Measuring Situation Awareness of Surgeons in Laparoscopic Training
    Geoffrey Tien, Bin Zheng, Stella Atkins, and Colin Swindells

  • Saliency-Based Decision Support
    Roxanne Canosa. Poster.

  • Inferring Object Relevance from Gaze in Dynamic Scenes
    Melih Kandemir, Veli-Matti Saarinen, and Samuel Kaski. Poster.

  • Using Eye Tracking to Investigate Important Cues for Representative Creature Motion
    Meredith McLendon, Ann McNamara, Tim McLaughlin, and Ravindra Dwivedi

  • Estimating Cognitive Load Using Remote Eye Tracking in a Driving Simulator
    Oskar Palinko, Andrew Kun, Alexander Shyrokov, and Peter Heeman

Computer and machine control

Full papers:
  • Alternatives to Single Character Entry and Dwell Time Selection on Eye Typing
    Mario Urbina and Anke Huckauf. Full paper.

  • Designing Gaze Gestures for Gaming: an Investigation of Performance
    Howell Istance, Aulikki Hyrskykari, Lauri Immonen, Santtu Mansikkamaa, and Stephen Vickers. Full paper.

  • ceCursor, a Contextual Eye Cursor for General Pointing in Windows Environments
    Marco Porta, Alice Ravarelli, and Giovanni Spagnoli. Full paper.

  • BlinkWrite2: An Improved Text Entry Method Using Eye Blinks
    Behrooz Ashtiani and Scott MacKenzie. Full paper.

Short papers:

  • Eye Movement as an Interaction Mechanism for Relevance Feedback in a Content-Based Image Retrieval System
    Yun Zhang, Hong FU, Zhen Liang, Zheru Chi, and Dagand Feng. Short paper.

  • Gaze Scribing in Physics Problem Solving
    David Rosengrant. Short paper.

  • Content-based Image Retrieval Using a Combination of Visual Features and Eye Tracking Data
    Zhen Liang, Hong FU, Yun Zhang, Zheru Chi, and Dagan Feng. Short paper.

  • Context Switching for Fast Key Selection in Text Entry Applications
    Carlos H. Morimoto and Arnon Amir. Short paper.

Posters:

  • Small-Target Selection with Gaze Alone
    Henrik Skovsgaard, Julio Mateo, John Flach, and John Paulin Hansen. Poster.

  • What You See is Where You Go: Testing a Gaze-Driven Power Wheelchair for Individuals with Severe Multiple Disabilities
    Erik Wästlund, Kay Sponseller, and Ola Pettersson. Poster.

  • Single Gaze Gestures
    Emilie Møllenbach, Alastair Gale, Martin Lillholm, and John Paulin Hansen. Poster.

  • Using Vision and Voice to Create a Multimodal Interface for Microsoft Word 2007
    Tanya Beelders and Pieter Blignaut. Poster.

  • Towards Task-Independent Person Authentication Using Eye Movement Signals
    Tomi Kinnunen, Filip Sedlak, and Roman Bednarik

  • An Open Source Eye-gaze Interface: Expanding the Adoption of Eye-Gaze in Everyday Applications
    Craig Hennessey and Andrew Duchowski. Poster.

  • Pies with EYEs: The Limits of Hierarchical Pie Menus in Gaze Control
    Mario Urbina, Maike Lorenz, and Anke Huckauf. Poster.

  • Low-Latency Combined Eye and Head Tracking System for Teleoperating a Robotic Head in Real-Time
    Stefan Kohlbecher, Klaus Bartl, Stanislavs Bardins, and Erich Schneider. Poster.

Friday, January 8, 2010

Mobile Dias Eye Tracker

Remember the Dias Eyetracker that I wrote about last May? Today Diako Mardanbeigi, from Tehran in Iran, presents a new version of the Dias eye tracker that is low-cost, wireless and fully mobile. I'll let the video demonstration below speak for itself. Rumor has it that Dias has been in contact with the ITU GazeGroup for a potential continuation of his research. Time will tell.



"This is a low cost mobile eye tracker with a wireless and Light weight head mounted hardware. This system gathers eye movements and estimates the point of gaze during the performance of daily tasks. It can let you to assess the visual behavior of the person online and in real-time when he is doing a specific task. A mobile eye tracker has a wide variety of applications in several fields such as human factors, market research, consumer shopping behavior, sports, driving, reading, safety & training. "

Friday, December 11, 2009

PhD Defense: Off-the-Shelf Gaze Interaction

Javier San Agustin will defend his PhD thesis, "Off-the-Shelf Gaze Interaction", at the IT University of Copenhagen on the 8th of January from 13.00 to (at most) 17.00. The program consists of a one-hour presentation followed by a discussion with the committee, formed by Andrew Duchowski, Bjarne Kjær Ersbøll, and Arne John Glenstrup. Afterwards, a traditional reception with snacks and drinks will be held.

Update: The thesis is now available as PDF, 179 pages, 3.6MB.

Abstract of the thesis:


People with severe motor-skill disabilities are often unable to use standard input devices such as a mouse or a keyboard to control a computer and they are, therefore, in strong need of alternative input devices. Gaze tracking offers them the possibility to use the movements of their eyes to interact with a computer, thereby making them more independent. A big effort has been put toward improving the robustness and accuracy of the technology, and many commercial systems are nowadays available on the market.

Despite the great improvements that gaze tracking systems have undergone in the last years, high prices have prevented gaze interaction from becoming mainstream. The use of specialized hardware, such as industrial cameras or infrared light sources, increases the accuracy of the systems, but also the price, which prevents many potential users from having access to the technology. Furthermore, the different components are often required to be placed in specific locations, or are built into the monitor, thus decreasing the flexibility of the setup.

Gaze tracking systems built from low-cost and off-the-shelf components have the potential to facilitate access to the technology and bring the prices down. Such systems are often more flexible, as the components can be placed in different locations, but also less robust, due to the lack of control over the hardware setup and the lower quality of the components compared to commercial systems.

The work developed for this thesis deals with some of the challenges introduced by the use of low-cost and off-the-shelf components for gaze interaction. The main contributions are:
  • Development and performance evaluation of the ITU Gaze Tracker, an off-the-shelf gaze tracker that uses an inexpensive webcam or video camera to track the user's eye. The software is readily available as open source, offering the possibility to try out gaze interaction for a low price and to analyze, improve and extend the software by modifying the source code.
  • A novel gaze estimation method based on homographic mappings between planes. No knowledge about the hardware configuration is required, allowing for a flexible setup where camera and light sources can be placed at any location.
  • A novel algorithm to detect the type of movement that the eye is performing, i.e. fixation, saccade or smooth pursuit. The algorithm is based on eye velocity and movement pattern, and makes it possible to smooth the signal appropriately for each kind of movement, removing jitter due to noise while maximizing responsiveness.
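
The homography-based contribution boils down to mapping points from one plane (the eye-feature plane) to another (the screen). As a rough illustration only, not the thesis' actual implementation, a 3x3 homography can be estimated from four calibration correspondences with the standard DLT algorithm and then used to map new eye positions to screen coordinates. Function names and the simple four-point setup below are my own:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst (both lists of
    (x, y) pairs, at least four) via the standard DLT algorithm."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H is the null vector of A: the last right-singular vector of the SVD.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_point(H, p):
    """Apply H to a 2D point using homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]
```

Fit once from four on-screen calibration targets, `map_point` then converts each tracked eye position to a screen coordinate. Note that no camera or light-source geometry enters the computation, which is the setup flexibility the thesis highlights.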

Tuesday, December 8, 2009

Scandinavian Workshop on Applied Eye-tracking (SWAET) 2010.

The first call for papers for the annual Scandinavian Workshop on Applied Eye-Tracking (SWAET), organized by Kenneth Holmqvist and the team at the Lund University Humanities Laboratory, was just announced. SWAET 2010 will be held in Lund, Sweden, May 5th-7th. The invited speakers are Gerry Altmann (blog) from the Dept. of Psychology at the University of York, UK, and Ignace Hooge (s1, s2) from the Dept. of Psychology at Utrecht University, Holland.

Visit the SWAET website for more information.

Update: Download the abstracts (pdf, 1Mb)

Tuesday, November 24, 2009

Remote tracker and 6DOF using a webcam

The following video clip demonstrates a Master's thesis project from the AGH University of Science and Technology in Cracow, Poland. The method developed provides 6-degrees-of-freedom head tracking and 2D eye tracking using a simple, low-resolution 640x480 webcam. Under the hood it's based on Lucas-Kanade optical flow and POSIT. A great start, as the head tracking seems relatively stable. Imagine it with IR illumination, a camera with slightly higher resolution and a narrow-angle lens. And of course, pupil + glint tracking algorithms for calibrated gaze estimation.
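
For readers curious about the building blocks: Lucas-Kanade estimates the motion of a small image patch between two frames by solving a least-squares system over the patch's intensity gradients, and POSIT then recovers the 6DOF head pose from the tracked 2D feature points. Below is a minimal single-window, single-iteration sketch of the Lucas-Kanade step; it is my own simplification, not the thesis code, and production trackers (e.g. OpenCV's pyramidal implementation) iterate this over several image scales to handle larger motions:

```python
import numpy as np

def lucas_kanade_patch(prev_img, next_img, x, y, win=7):
    """Estimate the (dx, dy) translation of a small window centred on
    (x, y) between two grayscale frames: one Lucas-Kanade least-squares step."""
    r = win // 2
    # Spatial gradients of the first frame (central differences) and the
    # temporal difference between the two frames.
    Iy, Ix = np.gradient(prev_img.astype(float))
    It = next_img.astype(float) - prev_img.astype(float)
    sl = np.s_[y - r:y + r + 1, x - r:x + r + 1]
    # Stack the per-pixel constraints Ix*dx + Iy*dy = -It over the window
    # and solve in the least-squares sense.
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    d, *_ = np.linalg.lstsq(A, -It[sl].ravel(), rcond=None)
    return d  # estimated (dx, dy), valid for small, sub-pixel motions
```

Tracking a handful of stable facial features this way, frame to frame, yields the 2D point trajectories that a pose algorithm like POSIT needs as input.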


Monday, November 23, 2009

ITU GazeTracker in the wild

Came across these two Youtube videos from students out there using the ITU GazeTracker in their HCI projects. By now the software has been downloaded 3000 times and the forum has seen close to three hundred posts. It's been a good start, and better yet, a new version is in the making. It offers a complete network API for third-party applications, improved tracking performance, better camera control and a number of bug fixes (thanks for your feedback). It will be released when it's ready.


Thanks for posting the videos!

Wednesday, October 21, 2009

Nokia near-eye display gaze interaction update

The Nokia near-eye gaze interaction platform that I tried in Finland last year has been further improved. The cap used to support the weight has been replaced with a sturdy frame, and the overall prototype seems lighter and also incorporates headphones. The new gaze-based navigation interface supports photo browsing based on the Image Space application, allowing location-based access to user-generated content. See the concept video at the bottom for their futuristic vision; more information is available on the Nokia Research website. The prototype will be displayed at the International Symposium on Mixed and Augmented Reality (ISMAR) in Orlando, October 19-22.


Medical Image Perception Society 2009 - Day three

Session 10. Displays and Tools. Chair: Kevin Berbaum, PhD
  • Objective methodology to compare clinical value of computed tomography artifact reduction algorithms. G Spalla, C Marchessoux, M Vaz, A Ricker, & T Kimpe
  • LCD Spatial Noise Suppression: Large-field vs. ROI Image Processing. WJ Dallas, H Roehrig, J Fan, EA Krupinski, & J Johnson
Session 11. Displays and Tools. Chair: Miguel Eckstein, PhD
  • Stereoscopic Digital mammography: Improved Accuracy of Lesion Detection in Breast Cancer Screening. DJ Getty, CJ D’Orsi, & RM Pickett
  • Detectability in tomosynthesis projections, slices and volumes: Comparison of human observer performance in a SKE detection task. I Reiser, K Little, & RM Nishikawa
Thanks Craig, Miguel and Elisabeth for a wonderful event, learned so much in just three days. Plenty of inspiration for future research.

Medical Image Perception Society 2009 - Day two

Session 6. Performance Measurement II. Chair: Matthew Freedman, MD, MBA
  • Coding of FDG Intensity as a 3-D Rendered Height Mapping to Improve Fusion Display of Co-Registered PET-CT Images. RM Shah, C Wood, YP Hu, & LS Zuckier
  • Estimation of AUC from Normally Distributed Rating Data with Known Variance Ratio. A Wunderlich & F Noo
  • Using the Mean-to-Variance Ratio as a Diagnostic for Unacceptably Improper Binormal ROC Curves. SL Hillis & KS Berbaum
Session 7. Performance Measurement II. Chair: Stephen Hillis, PhD
  • BI-RADS Data Should Not be Used to Estimate ROC Curves. Y Jiang & CE Metz

  • Estimating the utility of screening mammography in large clinical studies. CK Abbey, JM Boone, & MP Eckstein

  • Issues Related to the Definition of Image Contrast, DL Leong & PC Brennan
Session 8. Models of Perceptual processing. Chair: Yulei Jiang, PhD
  • Channelized Hotelling Observers for Detection Tasks in Multi-Slice Images. L Platiša, B Goossens, E Vansteenkiste, A Badano & W Philips

  • Channelized Hotelling observers adapted to irregular signals in breast tomosynthesis detection tasks. I Diaz, P Timberg, CK Abbey, MP Eckstein, FR Verdun, C Castella, FO Bochud

  • Detecting Compression Artifacts in Virtual Pathology Images Using a Visual Discrimination Model. J Johnson & EA Krupinski

  • Automatic MRI Acquisition Parameters Optimization Using HVS-Based Maps. J Jacobsen, P Irarrázabal, & C Tejos

  • Parametric Assessment of Lesion Detection Using a Pre-whitened Matched Filter on Projected Breast CT Images. N Packard, CK Abbey, & JM Boone

  • Model Observers for Complex Discrimination Tasks: Deployment Assessment of Multiple Coronary Stents. S Zhang, CK Abbey, X Da, JS Whiting, & MP Eckstein
Session 9. Special Invited Session on Neuroscience and Medical Image Perception. Chair: Miguel Eckstein, PhD
  • Decoding Information Processing When Attention Fails: An Electrophysiological Approach. B Giesbrecht
  • Some Neural Bases of Radiological Expertise. SA Engel