Wednesday, December 22, 2010

Santa's been spotted - Introducing the SMI Glasses

What a year it has been in the commercial eye tracking domain. In June, Tobii entered the head-mounted market with the Tobii Glasses, which created some buzz online. This was followed in November by a high-speed remote system, the Tobii TX300. Both products competed directly with the offerings from SMI, which countered with the RED500 remote tracker, surpassing the Tobii system by 200 samples per second. Today it's my pleasure to introduce the SMI Glasses, which bring the competition up a couple of notches. Comparable to the Tobii Glasses in their neat, unobtrusive form factor, they provide binocular tracking with a direct view of both eyes.
Rendered image of the upcoming SMI Glasses.
The small scene camera is located in the center of the glasses, which gives minimal parallax. Although the hard specs have yet to be released, the glasses are rumored to have a high-resolution scene camera, long battery life and an advanced IR AOA marker detection system that enables automatic mapping of gaze data to real-world objects. Furthermore, they can be used not only as a black-box system but may also be integrated with SMI's current head-mounted devices, including live view, an open interface for co-registration, etc. Availability is projected for the first half of 2011.

Thanks for all the hard work, inspiration and feedback throughout 2010, it's been an amazing year. By the looks of it 2011 appears to be a really interesting year for eye tracking. I'd like to wish everyone a Merry Christmas and a Happy New Year.

Tuesday, December 14, 2010

Method for Automatic Mapping of Eye Tracker Data to Hypermedia Content

Came across United States Patent Application 20100295774, filed by Craig Hennessey of Mirametrix. Essentially, the system creates Regions of Interest based on the HTML code (div tags) to perform an automatic mapping between gaze X/Y coordinates and the locations of page elements. This is done by accessing the Microsoft Document Object Model of an Internet Explorer browser page to establish the "content tracker", a piece of software that generates the list of areas, their sizes and locations on-screen, which are then tagged with keywords (e.g. logo, ad, etc.). This software also keeps track of several browser windows, their positions and interaction states.
"A system for automatic mapping of eye-gaze data to hypermedia content utilizes high-level content-of-interest tags to identify regions of content-of-interest in hypermedia pages. User's computers are equipped with eye-gaze tracker equipment that is capable of determining the user's point-of-gaze on a displayed hypermedia page. A content tracker identifies the location of the content using the content-of-interest tags and a point-of-gaze to content-of-interest linker directly maps the user's point-of-gaze to the displayed content-of-interest. A visible-browser-identifier determines which browser window is being displayed and identifies which portions of the page are being displayed. Test data from plural users viewing test pages is collected, analyzed and reported."
To conclude, the idea is to have multiple clients equipped with eye trackers that communicate with a server. The central machine coordinates studies and stores the gaze data from each session (in the cloud?). Overall, a strategy that makes perfect sense if your differentiating factor is low cost.
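The core mapping step can be sketched in a few lines of Python. The `Region` class, the hard-coded boxes and the keyword tags below are my own illustration of the idea, not code or names from the patent:

```python
# Sketch: map a gaze sample (screen coordinates) to a tagged content
# region. In the patented system the region list would be generated
# from the browser DOM; here it is hard-coded for illustration.
from typing import List, Optional

class Region:
    def __init__(self, tag: str, x: int, y: int, w: int, h: int):
        self.tag, self.x, self.y, self.w, self.h = tag, x, y, w, h

    def contains(self, gx: int, gy: int) -> bool:
        return (self.x <= gx < self.x + self.w and
                self.y <= gy < self.y + self.h)

def map_gaze_to_region(gx: int, gy: int,
                       regions: List[Region]) -> Optional[str]:
    """Return the tag of the first region containing the gaze point."""
    for r in regions:
        if r.contains(gx, gy):
            return r.tag
    return None

# Toy "content tracker" output: areas, sizes, locations, keyword tags.
regions = [Region("logo", 0, 0, 200, 80),
           Region("ad", 600, 0, 200, 400),
           Region("article", 0, 100, 580, 800)]

print(map_gaze_to_region(120, 40, regions))   # logo
print(map_gaze_to_region(300, 500, regions))  # article
```

With the regions generated live from the DOM, the same lookup yields per-element dwell statistics without any manual AOI annotation, which is the point of the patent.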

Monday, November 15, 2010


Just days after the Tobii TX300 was launched, SMI counters with the introduction of the world's first 500 Hz remote binocular eye tracker. SMI seriously ramps up the competition in high-speed remote systems, surpassing the Tobii TX by a hefty 200 Hz. The RED500 has an operating distance of 60-80 cm, a 40x40 cm trackbox at 70 cm and a reported accuracy of <0.4 degrees under typical (optimal?) conditions. Real-world performance evaluation by an independent third party remains to be seen. Not resting on their laurels, SMI regains the king-of-the-hill position with an impressive achievement that demonstrates how competitive the field has become. See the technical specs for more information.

Exploring the potential of context-sensitive CADe in screening mammography (Tourassi et al, 2010)

Georgia D. Tourassi, Maciej A. Mazurowski, and Brian P. Harrawood at the Duke University Ravin Advanced Imaging Laboratories, in collaboration with Elizabeth A. Krupinski, present a novel method of combining eye gaze data with Computer-Assisted Detection algorithms to improve detection rates for malignant masses in mammography. This contextualized method holds potential for personalized diagnostic support.

Purpose: Conventional computer-assisted detection (CADe) systems in screening mammography provide the same decision support to all users. The aim of this study was to investigate the potential of a context-sensitive CADe system which provides decision support guided by each user’s focus of attention during visual search and reporting patterns for a specific case.

Methods: An observer study for the detection of malignant masses in screening mammograms was conducted in which six radiologists evaluated 20 mammograms while wearing an eye-tracking device. Eye-position data and diagnostic decisions were collected for each radiologist and case they reviewed. These cases were subsequently analyzed with an in-house knowledge-based CADe system using two different modes: a conventional mode with a globally fixed decision threshold and a context-sensitive mode with a location-variable decision threshold based on the radiologists’ eye dwelling data and reporting information.

Results: The CADe system operating in conventional mode had 85.7% per-image malignant mass sensitivity at 3.15 false positives per image (FPsI). The same system operating in context-sensitive mode provided personalized decision support at 85.7%–100% sensitivity and 0.35–0.40 FPsI for all six radiologists. Furthermore, the context-sensitive CADe system could improve the radiologists’ sensitivity and reduce their performance gap more effectively than conventional CADe.

Conclusions: Context-sensitive CADe support shows promise in delineating and reducing the radiologists’ perceptual and cognitive errors in the diagnostic interpretation of screening mammograms more effectively than conventional CADe.
  • G. D. Tourassi, M. A. Mazurowski, B. P. Harrawood and E. A. Krupinski, "Exploring the potential of context-sensitive CADe in screening mammography," Medical Physics 37, 5728-5736. Online, PDF
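The location-variable threshold idea can be illustrated with a toy decision rule: regions the radiologist dwelled on but did not report get a more sensitive prompt threshold, while reported regions are not prompted again. All numbers below are made up for illustration; the paper's actual model is considerably more sophisticated:

```python
# Hypothetical sketch of a context-sensitive CADe decision threshold.
# Thresholds and dwell-time cutoffs are invented, not from the paper.

def decision_threshold(dwell_ms: float, reported: bool,
                       base: float = 0.8) -> float:
    """Return the CADe suspicion-score threshold for one location."""
    if reported:            # radiologist already flagged this region:
        return 1.1          # a threshold >1 suppresses redundant prompts
    if dwell_ms >= 1000:    # long dwell without a report suggests a
        return base - 0.3   # possible perceptual miss: prompt more readily
    return base             # briefly or never fixated: default behaviour

def cade_prompt(score: float, dwell_ms: float, reported: bool) -> bool:
    """Conventional mode would simply be: score >= fixed threshold."""
    return score >= decision_threshold(dwell_ms, reported)

# A region stared at but not reported is prompted at a score where an
# unexamined region would not be:
print(cade_prompt(0.6, dwell_ms=1500, reported=False))  # True
print(cade_prompt(0.6, dwell_ms=200, reported=False))   # False
```

The per-reader gain reported in the paper comes precisely from this asymmetry: the same underlying CAD scores produce different prompts for different radiologists, depending on where each one looked.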

Monday, November 8, 2010

GazeCom and SMI demonstrate automotive guidance system

"In order to determine the effectiveness of gaze guidance, within the project, SMI developed an experimental driving simulator with integrated eye tracking technology.  A driving safety study in a city was set up and testing in that environment has shown that the number of accidents was significantly lower with gaze guidance than without, while most of the drivers didn’t consciously notice the guiding visual cues."
Christian Villwock, Director for Eye and Gaze Tracking Systems at SMI: “We have shown that visual performance can significantly be improved by gaze contingent gaze guidance. This introduces huge potential in applications where expert knowledge has to be transferred or safety is critical, for example for radiological image analysis.” 
Within the GazeCom project, funded by the EU within the Future and Emerging Technologies (FET) program, the impact of gaze guidance on what is perceived and communicated effectively has been determined in a broad range of tasks of varying complexity. This included basic research in the understanding of visual perception and brain function up to the level where the guidance of gaze becomes feasible. (source)

Wednesday, November 3, 2010

SMI Releases iViewX 2.5

Today SensoMotoric Instruments released a new version of their iViewX software which offers a number of fixes and software improvements. Download here.

Improvements:
- NEW device: MEG 250
- RED5: improved tracking stability
- RED: improved pupil diameter calculation
- RED: improved distance measurement
- RED: improved 2- and 5-point calibration model
- file transfer server is now installed with iView X
- added configurable parallel port address

Fixes:
- RED5 camera dropouts in 60 Hz mode on Clevo laptops
- LPT_IO and PIODIO are now initialized correctly on startup
- RED standalone mode can be used with all calibration methods via remote commands
- fixed lateral offset in RED5 head position visualization
- HED: use timestamp in [ms] as scene video overlay
- improved rejection parameters for NNL devices
- fixed crash when using ET_CAL in standalone mode
- fixed strange behaviour with ET_REM and eT_REM; command list lookup is now case-insensitive
- RED5: default speed is 60 Hz for RED and 250 Hz for RED250
- and many more small fixes and improvements

Tuesday, November 2, 2010

Optimization and Dynamic Simulation of a Parallel Three Degree-of-Freedom Camera Orientation System (T. Villgrattner, 2010)

Moving a camera at 2500 degrees per second is such an awesome accomplishment that I cannot help myself; a shamelessly long quote from IEEE Spectrum follows:

German researchers have developed a robotic camera that mimics the motion of real eyes and even moves at superhuman speeds. The camera system can point in any direction and is also capable of imitating the fastest human eye movements, which can reach speeds of 500 degrees per second. But the system can also move faster than that, achieving more than 2500 degrees per second. It would make for very fast robot eyes. Led by Professor Heinz Ulbrich at the Institute of Applied Mechanics at the Technische Universität München, a team of researchers has been working on superfast camera orientation systems that can reproduce the human gaze.

In many experiments in psychology, human-computer interaction, and other fields, researchers want to monitor precisely what subjects are looking at. Gaze can reveal not only what people are focusing their attention on but it also provides clues about their state of mind and intentions. Mobile systems to monitor gaze include eye-tracking software and head-mounted cameras. But they're not perfect; sometimes they just can't follow a person's fast eye movements, and sometimes they provide ambiguous gaze information.

In collaboration with their project partners at the Chair for Clinical Neuroscience, Ludwig-Maximilians Universität München (Dr. Erich Schneider and Professor Thomas Brand), the Munich team, which is supported in part by the CoTeSys Cluster, is developing a system to overcome those limitations. The system, propped on a person's head, uses a custom-made eye-tracker to monitor the person's eye movements. It then precisely reproduces those movements using a superfast actuator-driven mechanism with yaw, pitch, and roll rotation, like a human eyeball. When the real eyes move, the robot eye follows suit.

The engineers at the Institute of Applied Mechanics have been working on the camera orientation system over the past few years. Their previous designs had 2 degrees of freedom (DOF). Now researcher Thomas Villgrattner is presenting a system that improves on the earlier versions and features not 2 but 3 DOF. He explains that existing camera-orientation systems with 3 DOF  that are fast and lightweight rely on model aircraft servo actuators. The main drawback of such actuators is that they can introduce delays and require gear boxes.

So Villgrattner sought a different approach. Because this is a head-mounted device, it has to be lightweight and inconspicuous -- you don't want it rattling and shaking on the subject's scalp. Which actuators to use? The solution consists of an elegant parallel system that uses ultrasonic piezo actuators. The piezos transmit their movement to a prismatic joint, which in turn drives small push rods attached to the camera frame. The rods have spherical joints on either end, and this kind of mechanism is known as a PSS (prismatic, spherical, spherical) chain. It's a "quite nice mechanism," says Masaaki Kumagai, a mechanical engineering associate professor at Tohoku Gakuin University, in Miyagi, Japan, who was not involved in the project. "I can't believe they made such a high speed/acceleration mechanism using piezo actuators."

The advantage is that it can reach high speeds and accelerations with small actuators, which remain on a stationary base, so they don't add to the inertial mass of the moving parts. And the piezos also provide high forces at low speeds, so no gear box is needed. Villgrattner describes the device's mechanical design and kinematics and dynamics analysis in a paper titled "Optimization and Dynamic Simulation of a Parallel Three Degree-of-Freedom Camera Orientation System," presented at last month's IEEE/RSJ International Conference on Intelligent Robots and Systems.

The current prototype weighs in at just 100 grams. It was able to reproduce the fastest eye movements, known as saccades, and also perform movements much faster than what our eyes can do.  The system, Villgrattner tells me, was mainly designed for a "head-mounted gaze-driven camera system," but he adds that it could also be used "for remote eye trackers, for eye related 'Wizard of Oz' tests, and as artificial eyes for humanoid robots." In particular, this last application -- eyes for humanoid robots -- appears quite promising, and the Munich team is already working on that. Current humanoid eyes are rather simple, typically just static cameras, and that's understandable given all the complexity in these machines. It would be cool to see robots with humanlike -- or super human -- gaze capabilities.

Below is a video of the camera-orientation system (the head-mount device is not shown). First, it moves the camera in all three single axes (vertical, horizontal, and longitudinal) with an amplitude of about 30 degrees. Next it moves simultaneously around all three axes with an amplitude of about 19 degrees. Then it performs fast movements around the vertical axis at 1000 degrees/second and also high dynamic movements around all axes. Finally, the system reproduces natural human eye movements based on data from an eye-tracking system. (source)

Monday, November 1, 2010

ScanMatch: A novel method for comparing fixation sequences (Cristino et al, 2010)

Using algorithms designed to compare DNA sequences for eye movement comparison. Radical, and with MATLAB source code, fantastic! The method appears to be noise tolerant and to outperform the traditional Levenshtein distance.

We present a novel approach to comparing saccadic eye movement sequences based on the Needleman–Wunsch algorithm used in bioinformatics to compare DNA sequences. In the proposed method, the saccade sequence is spatially and temporally binned and then recoded to create a sequence of letters that retains fixation location, time, and order information. The comparison of two letter sequences is made by maximizing the similarity score computed from a substitution matrix that provides the score for all letter pair substitutions and a penalty gap. The substitution matrix provides a meaningful link between each location coded by the individual letters. This link could be distance but could also encode any useful dimension, including perceptual or semantic space. We show, by using synthetic and behavioral data, the benefits of this method over existing methods. The ScanMatch toolbox for MATLAB is freely available online.
  • Filipe Cristino, Sebastiaan Mathôt, Jan Theeuwes, and Iain D. Gilchrist
    ScanMatch: A novel method for comparing fixation sequences
    Behav Res Methods 2010 42:692-700; doi:10.3758/BRM.42.3.692
    Abstract   Full Text (PDF)   References
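For those unfamiliar with the algorithm, here is a minimal Python sketch of Needleman–Wunsch scoring on fixation-letter sequences. The uniform match/mismatch scores and gap penalty are toy values; ScanMatch itself uses a substitution matrix encoding inter-bin distance and is distributed as a MATLAB toolbox:

```python
# Global alignment score (Needleman-Wunsch) for two letter sequences,
# e.g. scanpaths binned into spatial/temporal letters.

def nw_score(a: str, b: str, match=2, mismatch=-1, gap=-2) -> int:
    """Return the optimal global alignment score of a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    for i in range(rows):            # aligning a prefix against nothing
        score[i][0] = i * gap        # costs one gap per letter
    for j in range(cols):
        score[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            score[i][j] = max(score[i - 1][j - 1] + sub,  # (mis)match
                              score[i - 1][j] + gap,      # gap in b
                              score[i][j - 1] + gap)      # gap in a
    return score[-1][-1]

# Identical scanpaths score highest; similarity degrades gracefully
# with substitutions, unlike a plain edit distance.
print(nw_score("ABCD", "ABCD"))  # 8
print(nw_score("ABCD", "ABED"))  # 5
print(nw_score("ABCD", "DCBA"))
```

Replacing the uniform `match`/`mismatch` values with a full substitution matrix over letter pairs is what lets ScanMatch reward fixations that land in nearby (or semantically related) bins rather than only exact repeats.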

An improved algorithm for automatic detection of saccades in eye movement data and for calculating saccade parameters (Behrens et al, 2010)

"This analysis of time series of eye movements is a saccade-detection algorithm that is based on an earlier algorithm. It achieves substantial improvements by using an adaptive-threshold model instead of fixed thresholds and using the eye-movement acceleration signal. This has four advantages: (1) Adaptive thresholds are calculated automatically from the preceding acceleration data for detecting the beginning of a saccade, and thresholds are modified during the saccade. (2) The monotonicity of the position signal during the saccade, together with the acceleration with respect to the thresholds, is used to reliably determine the end of the saccade. (3) This allows differentiation between saccades following the main-sequence and non-main-sequence saccades. (4) Artifacts of various kinds can be detected and eliminated. The algorithm is demonstrated by applying it to human eye movement data (obtained by EOG) recorded during driving a car. A second demonstration of the algorithm detects microsleep episodes in eye movement data."

  • F. Behrens, M. MacKeben, and W. Schröder-Preikschat
    An improved algorithm for automatic detection of saccades in eye movement data and for calculating saccade parameters. Behav Res Methods 2010 42:701-708; doi:10.3758/BRM.42.3.701
Abstract   Full Text (PDF)   References
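The adaptive-threshold idea can be illustrated with a toy Python sketch: the onset threshold is derived from the acceleration noise in a preceding window rather than being fixed. The window size, the multiplier k and the onset-only detection are my simplifications, not the parameters of Behrens et al.'s algorithm:

```python
# Toy adaptive-threshold saccade onset detector on an acceleration
# signal. Real implementations also track the saccade end via the
# monotonicity of the position signal and reject artifacts.
from statistics import mean, stdev

def saccade_onsets(accel, window=20, k=3.0):
    """Return sample indices where acceleration first exceeds a
    threshold computed from the preceding `window` samples."""
    onsets, i = [], window
    while i < len(accel):
        noise = accel[i - window:i]
        thr = mean(noise) + k * stdev(noise)  # adapts to local noise
        if accel[i] > thr:
            onsets.append(i)
            i += window        # skip ahead past this saccade
        else:
            i += 1
    return onsets

# Low-amplitude noise followed by a sharp acceleration burst:
signal = [0.1, -0.1] * 15 + [5.0, 8.0, 6.0] + [0.1, -0.1] * 5
print(saccade_onsets(signal))  # [30]
```

The benefit over a fixed threshold is visible even in this toy: a noisier recording automatically raises the bar for what counts as a saccade, instead of flooding the output with false onsets.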

Thursday, October 28, 2010

Gaze Tracker 2.0 Preview

On my 32nd birthday I'd like to celebrate by sharing this video highlighting some of the features in the latest version of the GT2.0 that I've been working on with Javier San Agustin and the GT forum. Open source eye tracking has never looked better. Enjoy!

HD video available (click 360p and select 720p)

Friday, October 1, 2010

Tuesday, August 17, 2010

How to build low-cost eye tracking glasses for a head-mounted system (M. Kowalik, 2010)

Michał Kowalik of the Faculty of Computer Science and Information Technology at the West Pomeranian University of Technology in Szczecin, Poland, has put together great DIY instructions for a head-mounted system using the ITU Gaze Tracker. The camera of choice is the Microsoft LifeCam VX-1000, which has been modified by removing the casing and IR filter. In addition, three IR LEDs illuminate the eye using power from the USB cable. This is then mounted on a pair of safety glasses, just as Jason Babcock & Jeff Pelz have done previously. Total cost of the hardware: less than 50€. Neat. Thanks, Michał.

Download instructions as PDF (8.1Mb)

    Monday, August 16, 2010

    Call for Papers: ACM Transactions Special Issue on Eye Gaze

    ACM Transactions on Interactive Intelligent Systems
    Special Issue on Eye Gaze in Intelligent Human-Machine Interaction

    Aims and Scope

    Partly because of the increasing availability of nonintrusive and high-performance eye tracking devices, recent years have seen a growing interest in incorporating human eye gaze in intelligent user interfaces. Eye gaze has been used as a pointing mechanism in direct manipulation interfaces, for example, to assist users with “locked-in syndrome”. It has also been used as a reflection of information needs in web search and as a basis for tailoring information presentation. Detection of joint attention as indicated by eye gaze has been used to facilitate computer-supported human-human communication. In conversational interfaces, eye gaze has been used to improve language understanding and intention recognition. On the output side, eye gaze has been incorporated into the multimodal behavior of embodied conversational agents. Recent work on human-robot interaction has explored eye gaze in incremental language processing, visual scene processing, and conversation engagement and grounding.

    This special issue will report on state-of-the-art computational models, systems, and studies that concern eye gaze in intelligent and natural human-machine communication. The nonexhaustive list of topics below indicates the range of appropriate topics; in case of doubt, please contact the guest editors. Papers that focus mainly on eye tracking hardware and software as such will be relevant (only) if they make it clear how the advances reported open up new possibilities for the use of eye gaze in at least one of the ways listed above.


    • Empirical studies of eye gaze in human-human communication that provide new insight into the role of eye gaze and suggest implications for the use of eye gaze in intelligent systems. Examples include new empirical findings concerning eye gaze in human language processing, in human-vision processing, and in conversation management.
    • Algorithms and systems that incorporate eye gaze for human-computer interaction and human-robot interaction. Examples include gaze-based feedback to information systems; gaze-based attention modeling; exploiting gaze in automated language processing; and controlling the gaze behavior of embodied conversational agents or robots to enable grounding, turn-taking, and engagement.
    • Applications that demonstrate the value of incorporating eye gaze in practical systems to enable intelligent human-machine communication.

    Guest Editors

    • Elisabeth André, University of Augsburg, Germany (contact: andre[at]informatik[dot]
    • Joyce Chai, Michigan State University, USA

    Important Dates

    • By December 15th, 2010: Submission of manuscripts
    • By March 23rd, 2011: Notification about decisions on initial submissions
    • By June 23rd, 2011: Submission of revised manuscripts
    • By August 25th, 2011: Notification about decisions on revised manuscripts
    • By September 15th, 2011: Submission of manuscripts with final minor changes
    • Starting October, 2011: Publication of the special issue on the TiiS website and subsequently in the ACM Digital Library and as a printed issue

    Tuesday, August 10, 2010

    Eye control for PTZ cameras in video surveillance

    Bartosz Kunka, a PhD student at the Gdańsk University of Technology, has employed a remote gaze-tracking system called Cyber-Eye to control PTZ cameras in video surveillance and video-conference systems. The movie was prepared for the system presentation at the Research Challenge at SIGGRAPH 2010 in Los Angeles.

    Wednesday, August 4, 2010

    EOG used to play Super Mario

    Came across some fun work by Waterloo Labs that demos how to use a bunch of electrodes and a custom processing board to do signal analysis and estimate eye movement gestures through measuring EOG. It means you'll have to glance at the ceiling or floor to issue commands (no gaze point-of-regard estimation). The good thing is that the technology doesn't suffer from the issues with light, optics and sensors that often make video-based eye tracking and gaze point-of-regard estimation complex. The bad thing is that it requires custom hardware and the mounting of electrodes and wires; besides that, the interaction style appears to involve looking away from what you are really interested in.

    Sunday, July 18, 2010

    Monday, June 28, 2010

    Video-games can be beneficial!

    Appears video games can be beneficial for your eyes, despite what mother said. Came across this article in the British Daily Mail, found it inspiring and believe it could be done even better with an interactive application using real-time gaze tracking input. Direct quote:

    "A six-year-old boy who nearly went blind in one eye can now see again after he was told to play on a Nintendo games console. Ben Michaels suffered from amblyopia, or severe lazy eye syndrome in his right eye from the age of four. His vision had decreased gradually in one eye and without treatment his sight loss could have become permanent. His GP referred him to consultant Ken Nischal who prescribed the unusual daily therapy. Ben, from Billericay, Essex, spends two hours a day playing Mario Kart on a Nintendo DS with his twin Jake. Ben wears a patch over his good eye to make his lazy one work harder. The twins' mother, Maxine, 36, said that from being 'nearly blind' in the eye, Ben's vision had 'improved 250 per cent' in the first week. She said: 'When he started he could not identify our faces with his weak eye.  Now he can read with it although he is still a way off where he ought to be. 'He was very cooperative with the patch, it had phenomenal effect and we’re very pleased.' Mr Nischal of Great Ormond Street Children's Hospital, said the therapy helped children with weak eyesight because computer games encourage repetitive eye movement, which trains the eye to focus correctly. 'A games console is something children can relate to. It allows us to deliver treatment quicker,' he said. 'What we don’t know is whether improvement is solely because of improved compliance, ie the child sticks with the patch more, or whether there is a physiological improvement from perceptual visual learning.' The consultant added that thousands of youngsters and adults could benefit from a similar treatment." (source)

    Tuesday, June 15, 2010

    Speech Dasher: Fast Writing using Speech and Gaze (K. Vertanen & D. MacKay, 2010)

    A new version of the Dasher typing interface that utilizes speech recognition provided by the CMU PocketSphinx software doubles typing performance measured in words per minute: from a previous 20 WPM to 40 WPM, close to what a professional keyboard jockey may produce.

    Speech Dasher allows writing using a combination of speech and a zooming interface. Users first speak what they want to write and then they navigate through the space of recognition hypotheses to correct any errors. Speech Dasher’s model combines information from a speech recognizer, from the user, and from a letter-based language model. This allows fast writing of anything predicted by the recognizer while also providing seamless fallback to letter-by-letter spelling for words not in the recognizer’s predictions. In a formative user study, expert users wrote at 40 (corrected) words per minute. They did this despite a recognition word error rate of 22%. Furthermore, they did this using only speech and the direction of their gaze (obtained via an eye tracker).

    • Speech Dasher: Fast Writing using Speech and Gaze
      Keith Vertanen and David J.C. MacKay. CHI '10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, To appear. [Abstract+videos, PDF, BibTeX]

    Wednesday, May 26, 2010

    Abstracts from SWAET 2010

    The booklet containing the abstracts for the Scandinavian Workshop on Applied Eye Tracking (SWAET 2010) is now available for download (55 pages, about 1 MB). The abstracts span a wide range of topics, from gaze interaction to behavior and perception. The short one-page format makes it easy to venture into a multitude of domains and acts as a nice little starting point for digging deeper. A shame I couldn't attend; maybe next year. Kudos for making this booklet available.

    • Eye movements during mental imagery are not perceptual re-enactments (R. Johansson, J. Holsanova, K. Holmqvist)
    • Practice eliminates "looking at nothing" (A. Scholz, K. Mehlhorn, J.F. Krems)
    • Learning Perceptual Skills for Medical Diagnosis via Eye Movement Modeling Examples on Patient Video Cases (H. Jarodzka, T. Balslev, K. Holmqvist, K. Scheiter, M. Nyström, P. Gerjets, B. Eika)
    • Objective, subjective, and commercial information: The impact of presentation format on the visual inspection and selection of Web search results (Y. Kammerer, P. Gerjets)
    • Eye Movements and levels of attention: A stimulus driven approach (F.B. Mulvey, K. Holmqvist, J.P. Hansen)
    • Player's gaze in a collaborative Tetris game (P. Jermann, M.-A. Nüssli, W. Li)
    • Naming associated objects: Evidence for parallel processing (L. Mortensen, A.S. Meyer)
    • Reading Text Messages - An Eye-Tracking Study on the Influence of Shortening Strategies on Reading Comprehension (V. Heyer, H. Hopp)
    • Eye movement measures to study the online comprehension of long (illustrated) texts (J. Hyönä, J.K. Kaakinen)
    • Self-directed Learning Skills in Air-traffic Control: A Cued Retrospective Reporting Study (L.W. van Meeuwen, S. Brand-Gruwel, J.J.G. van Merriënboer, J.P.R. de Bock, P.A. Kirschner)
    • Drivers' characteristic sequences of eye and head movements in intersections (A. Bjelkemyr, K. Smith)
    • Comparing the value of different cues when using the retrospective think aloud method in web usability testing with eye tracking (A. Olsen)
    • Gaze behavior and instruction sensitivity of Children with Autism Spectrum Disorders when viewing pictures of social scenes (B. Rudsengen, F. Volden)
    • Impact of cognitive workload on gaze-including interaction (S. Trösterer, J. Dzaack)
    • Interaction with mainstream interfaces using gaze alone (H. Skovsgaard, J.P. Hansen, J.C. Mateo)
    • Stereoscopic Eye Movement Tracking: Challenges and Opportunities in 3D (G. Öqvist Seimyr, A. Appelholm, H. Johansson, R. Brautaset)
    • Sampling frequency - what speed do I need? (R. Andersson, M. Nyström, K. Holmqvist)
    • Effect of head-distance on raw gaze velocity (M.-A. Nüssli, P. Jermann)
    • Quantifying and modelling factors that influence calibration and data quality (M. Nyström, R. Andersson, J. van de Weijer)

    Monday, May 24, 2010

    EyePhone - Mobile gaze interaction from Dartmouth College

    From Emiliano Miluzzo and the group at Sensorlab, part of the Computer Science department at Dartmouth College, comes EyePhone, which enables rudimentary gaze-based interaction for mobile devices. Contemporary devices often utilize touch-based interaction, which creates a problem with occlusion, where the hands cover large parts of the display. EyePhone could help to alleviate this issue. The prototype system demonstrated offers enough accuracy for an interface based on a 3x3 grid layout, but with better hardware and algorithms there is little reason why this couldn't be improved. However, a major issue with a mobile system is the mobility of both the user and the hardware; in practice this means that not only individual head movements have to be compensated for but also movements of the camera in essentially all degrees of freedom. Not an easy thing to solve, but it's not a question of "if" but "when". Perhaps something could be done using the angular position sensors many mobile devices already have embedded. This is an excellent first step with thrilling potential. Additional information is available in the M.I.T. Technology Review article.

    As smartphones evolve researchers are studying new techniques to ease the human-mobile interaction. We propose EyePhone, a novel "hands free" interfacing system capable of driving mobile applications/functions using only the user's eyes movement and actions (e.g., wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to: i) track the eye and infer its position on the mobile phone display as a user views a particular application; and ii) detect eye blinks that emulate mouse clicks to activate the target application under view. We present a prototype implementation of EyePhone on a Nokia N810, which is capable of tracking the position of the eye on the display, mapping these positions to a function that is activated by a wink. At no time does the user have to physically touch the phone display.

    Figures. Camera images, eye regions of interest and reported accuracies.

    • Emiliano Miluzzo, Tianyu Wang, Andrew T. Campbell, EyePhone: Activating Mobile Phones With Your Eyes. To appear in Proc. of The Second ACM SIGCOMM Workshop on Networking, Systems, and Applications on Mobile Handhelds (MobiHeld'10), New Delhi, India, August 30, 2010. [pdf] [video]
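A 3x3 grid mapping of the kind EyePhone demonstrates is simple to sketch. The normalized-coordinate convention and cell numbering below are assumptions on my part, not the paper's implementation:

```python
# Map an estimated eye position (normalized display coordinates,
# 0..1 in each axis) to one of nine on-screen targets.

def grid_cell(x: float, y: float, n: int = 3) -> int:
    """Return a cell index 0..n*n-1, numbered left-to-right,
    top-to-bottom; x/y are normalized gaze estimates."""
    col = min(int(x * n), n - 1)   # clamp x == 1.0 to the last column
    row = min(int(y * n), n - 1)
    return row * n + col

print(grid_cell(0.1, 0.1))  # 0  (top-left)
print(grid_cell(0.5, 0.5))  # 4  (center)
print(grid_cell(0.9, 0.9))  # 8  (bottom-right)
```

Coarse quantization like this is what makes the approach workable despite the low accuracy of a front-facing camera: the eye estimate only has to land in the right third of the screen, and a blink or wink then acts as the click.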

    Thursday, May 20, 2010

    Magnetic Eye Tracking Device from Arizona State University

    A group of students at Arizona State University has revisited the scleral search coil to develop a new low-cost Magnetic Eye Tracking Device (METD). The entrepreneurs aim to make the technology available to the public at an affordable $4,000 and are primarily targeting disabled users. More information is available at ASU News.

    If you're new to eye tracking, it should be noted that the reporter's claim that common video-based systems use infrared lasers is simply wrong. They use light sources operating in the IR spectrum (similar to the LED in your remote control).

    Friday, April 30, 2010

    GazePad: Low-cost remote webcam eye tracking

    Came across the GazeLib low-cost remote eye-tracking project today, which uses ordinary webcams without IR illumination. The accuracy is fairly low, but it's really nice to see another low-cost approach to assistive technology.

    "GazeLib is a programming library that makes real-time, low-cost gaze tracking possible. The library provides functions for remote gaze tracking under ambient lighting conditions using a single, low-cost, off-the-shelf webcam. Developers can easily build gaze-tracking applications in only a few lines of code. The GazeLib project focuses on promoting gaze tracking technology to consumer-grade human-computer interfaces by reducing the price, emphasizing ease-of-use, increasing the extendibility, and enhancing the flexibility and mobility."

    Monday, April 26, 2010

    Freie Universität Berlin presents gaze controlled car

    From the Freie Universität Berlin comes a working prototype of a system that allows direct steering of a car by eye movements alone. The prototype was demonstrated in front of a large group of journalists at the former Berlin Tempelhof Airport. Gaze data from a head-mounted SMI eye tracker is fed into the control system of the Spirit of Berlin, a platform for autonomous navigation. Similar to the gaze-controlled robot we presented at CHI09, the platform couples the turning of the wheels to the gaze coordinate space (e.g. look left and the car drives left). Essentially it's a mapping onto a 2D plane where deviations from the center issue steering commands, and the degree of turning is modulated by the distance from the center. Potentially interesting when coupled with other sensors that in combination offer driver support, for example warning about an object in the vehicle's path that the driver has not seen. Not to mention scenarios involving individuals with disabilities and/or machine learning. The work has been carried out under the guidance of professor Raúl Rojas as part of the AutoNOMOS project, which has been running since 2006, inspired by the Stanford autonomous car project.

    More info in the press-release.
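As a minimal sketch of the center-deviation mapping described above: horizontal gaze position is normalized around the screen center and scaled into a wheel angle. The dead zone and maximum angle are illustrative values of my own, not parameters of the Spirit of Berlin control system.

```python
# Minimal sketch, assuming the mapping described in the post: gaze
# deviation from the center of a 2D plane issues a steering command
# whose magnitude grows with the distance from the center.

MAX_WHEEL_ANGLE = 30.0   # degrees; hypothetical steering limit
DEAD_ZONE = 0.1          # ignore small deviations around the center

def gaze_to_steering(gaze_x, screen_width):
    """Map a horizontal gaze coordinate (pixels) to a signed wheel angle."""
    # Normalize to [-1, 1] with 0 at the screen center.
    deviation = (gaze_x - screen_width / 2) / (screen_width / 2)
    if abs(deviation) < DEAD_ZONE:
        return 0.0           # looking near the center: drive straight
    # Degree of turning is modulated by the distance from the center.
    return max(-1.0, min(1.0, deviation)) * MAX_WHEEL_ANGLE
```

A dead zone of this kind is a common trick in gaze interfaces to keep normal fixations near the center from producing jittery commands.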

    Sunday, April 25, 2010

    Wednesday, April 14, 2010

    Open-source gaze tracker awarded Research Pearls of ITU Copenhagen

    The open-source eye tracker ITU Gaze Tracker, primarily developed by Javier San Augustin, Henrik Skovsgaard and myself, has been awarded the Research Pearls of the IT University of Copenhagen. A presentation will be held at ITU on May 6th at 2pm. The software, released one year ago, has seen more than 5,000 downloads by students and hobbyists around the world. It is rapidly approaching a new release which will offer better performance and stability for remote tracking, plus many bug fixes in general. The new version adds support for a whole range of new HD web cameras, which provide vastly improved image quality and finally bring hope for a low-cost, open, flexible and reasonably performing solution. The ambitious goal is to make eye tracking technology available to everyone, regardless of available resources. Follow the developments at the forum. Additional information is available at the ITU Gaze Group.

    "The Open-Source ITU Gaze Tracker"

    Gaze tracking offers people with severe motor impairments the possibility of interacting with a computer by just using eye movements, thereby making users more independent. However, many are excluded from access to gaze interaction due to the high prices of commercial systems (above 10.000€). Gaze tracking systems built from low-cost and off-the-shelf components have the potential to facilitate access to the technology and bring prices down.

    The ITU Gaze Tracker is an off-the-shelf system that uses an inexpensive web cam or a video camera to track the user's eye. It is free and open-source, offering users the possibility of trying out gaze interaction technology for a cost as low as 20€, and of adapting and extending the software to suit specific needs.

    In this talk we will present the open-source ITU Gaze Tracker and show the different scenarios in which the system has been used and evaluated.

    Monday, April 12, 2010

    Digital avatars get human-like eye movements

    William Steptoe of University College London got his research on using eye tracking to give digital avatars human-like eye movements covered in an article by New Scientist. It turns out that "on average, the participants were able to identify 88 per cent of truths correctly when the avatars had eye movement, but only 70 per cent without. Spotting lies was harder, but eye movement helped: 48 per cent accuracy compared with 39 per cent without. Steptoe will present the results at the 2010 Conference on Human Factors in Computing Systems in Atlanta, Georgia, next week."

    Eye tracking in the wild: Consumer decision-making process at the supermarket

    Kerstin Gidlöf from the Lund University Humlab talks about the visual appearance of consumer products in the supermarket and how the graphical layout modulates our attention. Perhaps free will is just an illusion; then again, the number of items in my fridge with faces on them equals zero. Is it me or the store I'm shopping at?

    Monday, March 29, 2010

    Text 2.0 gaze assisted reading

    From the German Research Center for Artificial Intelligence comes a new demonstration of a gaze-based reading system, Text 2.0, which uses eye tracking to make the reading experience more dynamic and interactive. For example, the system can display images relevant to what you're reading about, or filter out less relevant information if you're skimming through the content. The research is funded through the Stiftung Rheinland-Pfalz für Innovation. On the group's website you can also find an interesting project called PEEP, which allows developers to connect eye trackers to Processing, enabling aesthetically stunning visualizations. PEEP is also the core of the Text 2.0 platform. Check out the videos.

    More information: Wenn das Auge die Seite umblättert? (When the eye turns the page?)
    Wired: Eye-Tracking Tablets and the Promise of Text 2.0
    More demos at the group's website

    Low-cost eye tracking and pong gaming from Imperial College London

    A group of students at Imperial College London has developed a low-cost head-mounted tracker which they use to play Pong. The work is carried out under the supervision of Aldo Faisal in his lab.

    We built an eye-tracking system using mass-marketed, off-the-shelf components at 1/1000 of that cost, i.e. for less than 30 GBP. Once we made such a system that cheap, we started thinking of it as a user interface for everyday use by impaired people. The project was enabled by realising that certain mass-marketed web cameras for video game consoles offer impressive performance approaching that of much more expensive research-grade cameras.

    "From this starting point research in our group has focussed on two parts so far:

    1. The TED software, which is composed of two components that can run on two different computers (connected by wireless internet) or on the same computer. The first component is the TED server (Linux-based), which interfaces directly with the cameras, processes the high-speed video feed and makes the data available (over the internet) to the client software. The client forms the second component; it is written in Java (i.e. it runs on any computer: Windows, Mac, Unix, ...) and provides the mouse-control-via-eye-movements, the "Pong" video game, as well as configuration and calibration functions.

    This two-part solution allows the cameras to be connected to a cost-effective netbook (e.g. on a wheelchair) while controlling other computers over the internet (e.g. in the living room, office and kitchen). This software suite, as well as part of the low-level camera driver, was implemented by Ian Beer, Aaron Berk, Oliver Rogers and Timothy Treglown for their undergraduate project in the lab.

    Note: the "Pong" video game has a two-player mode, allowing two people to play against each other using two eye trackers, or eye tracker vs. keyboard. It is very easy to use: just look where you want the pong paddle to move...

    2. The camera-spectacles (visible in most press photos), as well as two-camera software (Windows-based) able to track eye movements in 3D (i.e. direction and distance) for wheelchair control. These have been built and developed by William Abbott (Dept. of Bioengineering)."
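The server/client split described above can be illustrated with a toy example: one thread publishes gaze samples as JSON lines over TCP while the main thread consumes them, as a client moving a cursor would. The protocol, port handling, and field names here are invented for illustration; the actual TED software is Java-based and not specified in this post.

```python
# Toy sketch of a two-part gaze-streaming design: a "server" publishes
# gaze samples over TCP; a "client" (possibly on another machine)
# consumes them. Both run in one process here for demonstration.
import json
import socket
import threading

def serve_samples(server_sock, samples):
    """Accept one client and send each gaze sample as a JSON line."""
    conn, _ = server_sock.accept()
    with conn:
        for x, y in samples:
            conn.sendall((json.dumps({"x": x, "y": y}) + "\n").encode())

def read_samples(host, port):
    """Connect to the server and yield (x, y) gaze coordinates."""
    with socket.create_connection((host, port)) as conn:
        for line in conn.makefile():
            sample = json.loads(line)
            yield sample["x"], sample["y"]

server = socket.socket()
server.bind(("127.0.0.1", 0))          # let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
samples = [(100, 200), (110, 205)]
t = threading.Thread(target=serve_samples, args=(server, samples))
t.start()
received = list(read_samples("127.0.0.1", port))
t.join()
server.close()
print(received)
```

Decoupling capture from consumption like this is what lets the cameras hang off a netbook on a wheelchair while the controlled computer sits elsewhere on the network.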

    Further reading:

    Imperial College London press release: Playing “Pong” with the blink of an eye
    The Engineer: Eye-movement game targets disabled
    Engadget (German): Neurotechnologie: Pong mit Augenblinzeln gespielt in London (Neurotechnology: Pong played with eye blinks in London)

    Friday, March 26, 2010

    ETRA 2010 Proceedings now online

    The proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (ETRA 2010), held in Austin, Texas, March 22-24, 2010, are now online. Some kind soul (MrGaze?) decided to do the world a favor by uploading and keyword-tagging the papers on the Slideshare website, which is indexed by Google and other search engines. The wealth of information ensures days of interesting reading; several short papers and posters would have been interesting to hear a talk on, but as always time is short.

    Paper Acceptance Rate: 18 of 58 submissions, 31%

    Conference chairs

    Carlos Hitoshi Morimoto, University of São Paulo, Brazil
    Howell Istance, De Montfort University, UK

    Program chairs

    Aulikki Hyrskykari, University of Tampere, Finland
    Qiang Ji, Rensselaer Polytechnic Institute, USA

    Table of Contents

    Front matter (cover, title page, table of content, preface)

    Back matter (committees and reviewers, industrial supporters, cover image credits, author index)

    SESSION: Keynote address

    An eye on input: research challenges in using the eye for computer input control
    I. Scott MacKenzie

    Long papers 1 -- Advances in eye tracking technology

    Homography normalization for robust gaze estimation in uncalibrated setups
    Dan Witzner Hansen, Javier San Agustin, Arantxa Villanueva

    Head-mounted eye-tracking of infants' natural interactions: a new method
    John M. Franchak, Kari S. Kretch, Kasey C. Soska, Jason S. Babcock, Karen E. Adolph (awarded best paper)

    User-calibration-free remote gaze estimation system
    Dmitri Model, Moshe Eizenman

    Short papers 1 -- Eye tracking applications and data analysis

    Eye movement as an interaction mechanism for relevance feedback in a content-based image retrieval system
    Yun Zhang, Hong Fu, Zhen Liang, Zheru Chi, Dagan Feng

    Content-based image retrieval using a combination of visual features and eye tracking data
    Zhen Liang, Hong Fu, Yun Zhang, Zheru Chi, Dagan Feng

    Gaze scribing in physics problem solving
    David Rosengrant

    Have you seen any of these men?: looking at whether eyewitnesses use scanpaths to recognize suspects in photo lineups
    Sheree Josephson, Michael E. Holmes

    Estimation of viewer's response for contextual understanding of tasks using features of eye-movements
    Minoru Nakayama, Yuko Hayashi

    Biometric identification via an oculomotor plant mathematical model
    Oleg V. Komogortsev, Sampath Jayarathna, Cecilia R. Aragon, Mechehoul Mahmoud

    Short papers 2 -- Poster presentations

    Saliency-based decision support
    Roxanne L. Canosa

    Qualitative and quantitative scoring and evaluation of the eye movement classification algorithms
    Oleg V. Komogortsev, Sampath Jayarathna, Do Hyong Koh, Sandeep Munikrishne Gowda

    An interactive interface for remote administration of clinical tests based on eye tracking
    A. Faro, D. Giordano, C. Spampinato, D. De Tommaso, S. Ullo

    Visual attention for implicit relevance feedback in a content based image retrieval
    A. Faro, D. Giordano, C. Pino, C. Spampinato

    Evaluation of a low-cost open-source gaze tracker
    Javier San Agustin, Henrik Skovsgaard, Emilie Mollenbach, Maria Barret, Martin Tall, Dan Witzner Hansen, John Paulin Hansen

    An open source eye-gaze interface: expanding the adoption of eye-gaze in everyday applications
    Craig Hennessey, Andrew T. Duchowski

    Using eye tracking to investigate important cues for representative creature motion
    Meredith McLendon, Ann McNamara, Tim McLaughlin, Ravindra Dwivedi

    Eye and pointer coordination in search and selection tasks
    Hans-Joachim Bieg, Lewis L. Chuang, Roland W. Fleming, Harald Reiterer, Heinrich H. Bülthoff

    Pies with EYEs: the limits of hierarchical pie menus in gaze control
    Mario H. Urbina, Maike Lorenz, Anke Huckauf

    Measuring vergence over stereoscopic video with a remote eye tracker
    Brian C. Daugherty, Andrew T. Duchowski, Donald H. House, Celambarasan Ramasamy

    Group-wise similarity and classification of aggregate scanpaths
    Thomas Grindinger, Andrew T. Duchowski, Michael Sawyer

    Inferring object relevance from gaze in dynamic scenes
    Melih Kandemir, Veli-Matti Saarinen, Samuel Kaski

    Advanced gaze visualizations for three-dimensional virtual environments
    Sophie Stellmach, Lennart Nacke, Raimund Dachselt

    The use of eye tracking for PC energy management
    Vasily G. Moshnyaga

    Low-latency combined eye and head tracking system for teleoperating a robotic head in real-time
    Stefan Kohlbecher, Klaus Bartl, Stanislavs Bardins, Erich Schneider

    Visual search in the (un)real world: how head-mounted displays affect eye movements, head movements and target detection
    Tobit Kollenberg, Alexander Neumann, Dorothe Schneider, Tessa-Karina Tews, Thomas Hermann, Helge Ritter, Angelika Dierker, Hendrik Koesling

    Visual span and other parameters for the generation of heatmaps
    Pieter Blignaut

    Robust optical eye detection during head movement
    Jeffrey B. Mulligan, Kevin N. Gabayan

    What you see is where you go: testing a gaze-driven power wheelchair for individuals with severe multiple disabilities
    Erik Wästlund, Kay Sponseller, Ola Pettersson

    A depth compensation method for cross-ratio based eye tracking
    Flavio L. Coutinho, Carlos H. Morimoto

    Estimating cognitive load using remote eye tracking in a driving simulator
    Oskar Palinko, Andrew L. Kun, Alexander Shyrokov, Peter Heeman

    Small-target selection with gaze alone
    Henrik Skovsgaard, Julio C. Mateo, John M. Flach, John Paulin Hansen

    Measuring situation awareness of surgeons in laparoscopic training
    Geoffrey Tien, M. Stella Atkins, Bin Zheng, Colin Swindells

    Quantification of aesthetic viewing using eye-tracking technology: the influence of previous training in apparel design
    Juyeon Park, Emily Woods, Marilyn DeLong

    Estimating 3D point-of-regard and visualizing gaze trajectories under natural head movements
    Kentaro Takemura, Yuji Kohashi, Tsuyoshi Suenaga, Jun Takamatsu, Tsukasa Ogasawara

    Natural scene statistics at stereo fixations
    Yang Liu, Lawrence K. Cormack, Alan C. Bovik

    Development of eye-tracking pen display based on stereo bright pupil technique
    Michiya Yamamoto, Takashi Nagamatsu, Tomio Watanabe

    Pupil center detection in low resolution images
    Detlev Droege, Dietrich Paulus

    Using vision and voice to create a multimodal interface for Microsoft Word 2007
    T. R. Beelders, P. J. Blignaut

    Single gaze gestures
    Emilie Møllenbach, Martin Lillholm, Alastair Gail, John Paulin Hansen

    Learning relevant eye movement feature spaces across users
    Zakria Hussain, Kitsuchart Pasupa, John Shawe-Taylor

    Towards task-independent person authentication using eye movement signals
    Tomi Kinnunen, Filip Sedlak, Roman Bednarik

    Gaze-based web search: the impact of interface design on search result selection
    Yvonne Kammerer, Wolfgang Beinhauer

    Eye tracking with the adaptive optics scanning laser ophthalmoscope
    Scott B. Stevenson, Austin Roorda, Girish Kumar

    Listing's and Donders' laws and the estimation of the point-of-gaze
    Elias D. Guestrin, Moshe Eizenman

    Long papers 2 -- Scanpath representation and comparison methods

    Visual scanpath representation
    Joseph H. Goldberg, Jonathan I. Helfman

    A vector-based, multidimensional scanpath similarity measure
    Halszka Jarodzka, Kenneth Holmqvist, Marcus Nyström

    Scanpath comparison revisited
    Andrew T. Duchowski, Jason Driver, Sheriff Jolaoso, William Tan, Beverly N. Ramey, Ami Robbins

    Long papers 3 -- Analysis and interpretation of eye movements

    Scanpath clustering and aggregation
    Joseph H. Goldberg, Jonathan I. Helfman

    Match-moving for area-based analysis of eye movements in natural tasks
    Wayne J. Ryan, Andrew T. Duchowski, Ellen A. Vincent, Dina Battisto

    Interpretation of geometric shapes: an eye movement study
    Miquel Prats, Steve Garner, Iestyn Jowers, Alison McKay, Nieves Pedreira

    Short papers 3 -- Advances in eye tracking technology

    User-calibration-free gaze tracking with estimation of the horizontal angles between the visual and the optical axes of both eyes
    Takashi Nagamatsu, Ryuichi Sugano, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka

    Gaze estimation method based on an aspherical model of the cornea: surface of revolution about the optical axis of the eye
    Takashi Nagamatsu, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka, Michiya Yamamoto

    The pupillometric precision of a remote video eye tracker
    Jeff Klingner

    Contingency evaluation of gaze-contingent displays for real-time visual field simulations
    Margarita Vinnikov, Robert S. Allison

    SemantiCode: using content similarity and database-driven matching to code wearable eyetracker gaze data
    Daniel F. Pontillo, Thomas B. Kinsman, Jeff B. Pelz

    Context switching for fast key selection in text entry applications
    Carlos H. Morimoto, Arnon Amir

    Long papers 4 -- Analysis and understanding of visual tasks

    Fixation-aligned pupillary response averaging
    Jeff Klingner

    Understanding the benefits of gaze enhanced visual search
    Pernilla Qvarfordt, Jacob T. Biehl, Gene Golovchinsky, Tony Dunningan

    Image ranking with implicit feedback from eye movements
    David R. Hardoon, Kitsuchart Pasupa

    Long papers 5 -- Gaze interfaces and interactions

    How the interface design influences users' spontaneous trustworthiness evaluations of web search results: comparing a list and a grid interface
    Yvonne Kammerer, Peter Gerjets

    Space-variant spatio-temporal filtering of video for gaze visualization and perceptual learning
    Michael Dorr, Halszka Jarodzka, Erhardt Barth

    Alternatives to single character entry and dwell time selection on eye typing
    Mario H. Urbina, Anke Huckauf

    Long papers 6 -- Eye tracking and accessibility

    Designing gaze gestures for gaming: an investigation of performance
    Howell Istance, Aulikki Hyrskykari, Lauri Immonen, Santtu Mansikkamaa, Stephen Vickers

    ceCursor, a contextual eye cursor for general pointing in windows environments
    Marco Porta, Alice Ravarelli, Giovanni Spagnoli

    BlinkWrite2: an improved text entry method using eye blinks
    Behrooz Ashtiani, I. Scott MacKenzie
