Thursday, July 10, 2008

Eye Gaze Interactive Air Traffic Controller workstation (P. Esser & T.J.J. Bos, 2007)

P. Esser and T.J.J. Bos at Maastricht University have developed a prototype aimed at reducing the repetitive strain injuries Air Traffic Controllers sustain while operating their systems. The research was conducted at the National Aerospace Laboratory in the Netherlands. The results indicate a clear advantage over the traditional roller/trackball, especially for large distances. This is expected, since Fitts's law does not apply to eye movement in the same manner as to physical limb or hand movement: a saccade over a long distance does take more time than a short one, but the difference is nothing like that between moving your arm one inch versus one meter. Many other applications could benefit from gaze-assisted interaction; medical imaging in the field of radiology is one, since modalities such as CT and MRI produce very high-resolution images, up to 4096x4096 pixels.
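To make the Fitts's law point concrete, here is a minimal sketch (Python; the regression constants a and b are purely illustrative assumptions, not values from the thesis) of how predicted movement time grows with target distance for a limb-operated pointing device:

    import math

    def fitts_mt(distance, width, a=0.1, b=0.2):
        # Predicted movement time (s) under Fitts's law, Shannon formulation.
        # a and b are device-dependent regression constants; the values here
        # are purely illustrative, not measured for any particular device.
        index_of_difficulty = math.log2(distance / width + 1)  # in bits
        return a + b * index_of_difficulty

    # Doubling the distance adds only one bit of difficulty, so limb movement
    # time grows logarithmically with distance, whereas a saccade covers the
    # same visual angle in near-constant time.
    print(fitts_mt(distance=100, width=20))    # ~0.62 s for a short reach
    print(fitts_mt(distance=1000, width=20))   # ~1.23 s: 10x farther, but nowhere near 10x slower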


Summary of the thesis "Eye Gaze Interactive ATC workstation"
"Ongoing research is devoted to finding ways to improve performance and reduce workload of Air Traffic Controllers (ATCos) because their task is critical to the safe and efficient flow of air traffic. A new intuitive input method, known as eye gaze interaction, was expected to reduce the work- and task load imposed on the controllers by facilitating the interaction between the human and the ATC workstation. In turn, this may improve performance because the freed mental resources can be devoted to more critical aspects of the job, such as strategic planning. The objective of this Master thesis research was to explore how human computer interaction (HCI) in the ATC task can be improved using eye gaze input techniques and whether this will reduce workload for ATCos.


In conclusion, the results of eye gaze interaction are very promising for selection of aircraft on a radar screen. For entering instructions it was less advantageous. This is explained by the fact that in the first task the interaction is more intuitive while the latter is more a conscious selection task. For application in work environments with large displays or multiple displays eye gaze interaction is considered very promising."



Download paper as PDF

Wednesday, July 9, 2008

GazeTalk 5

The GazeTalk system is one of the most comprehensive open solutions for gaze interaction today. It has been developed with disabled users in mind and supports a wide range of everyday tasks, which can dramatically improve quality of life for users suffering from ALS or similar conditions. The following information is quoted from the COGAIN website.

Information about Gazetalk 5 eye communication system

GazeTalk is a predictive text entry system that has a restricted on-screen keyboard with ambiguous layout for severely disabled people. The main reason for using such a keyboard layout is that it enables the use of an eye tracker with a low spatial resolution (e.g., a web-camera based eye tracker).

The goal of the GazeTalk project is to develop an eye-tracking based AAC system that supports several languages, facilitates fast text entry, and is both sufficiently feature-complete to be deployed as the primary AAC tool for users, yet sufficiently flexible and technically advanced to be used for research purposes. The system is designed for several target languages, initially Danish, English, Italian, German and Japanese.

Main features

  • type-to-talk
  • writing
  • email
  • web browser
  • multimedia player
  • PDF reader
  • letter and word prediction, and word completion
  • speech output
  • can be operated by gaze, headtracking, mouse, joystick, or any other pointing device
  • supports step-scanning (new!)
  • supports users with low precision in their movements, or trackers with low accuracy
  • allows the user to use Dasher inside GazeTalk and to transfer the text written in Dasher back to GazeTalk

GazeTalk 5.0 has been designed and developed by the Eye Gaze Interaction Group at the IT University of Copenhagen and the IT-Lab at the Royal School of Library and Information Science, Copenhagen.
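
As a rough illustration of why an ambiguous layout tolerates low-resolution pointing: each large key covers several letters, so the tracker only needs to distinguish a handful of on-screen targets, and a word list resolves the ambiguity. The Python sketch below shows the general idea only; GazeTalk's actual layout and prediction model are more sophisticated, and the letter grouping here is an assumption.

    # Five large keys, each covering several letters (assumed grouping).
    GROUPS = {"abcde": 1, "fghij": 2, "klmno": 3, "pqrst": 4, "uvwxyz": 5}
    LETTER_TO_KEY = {ch: key for letters, key in GROUPS.items() for ch in letters}

    def encode(word):
        # The key sequence a user would produce when "typing" the word.
        return tuple(LETTER_TO_KEY[ch] for ch in word.lower())

    def candidates(key_sequence, vocabulary):
        # All vocabulary words consistent with the ambiguous key sequence.
        return [w for w in vocabulary if encode(w) == tuple(key_sequence)]

    vocab = ["cat", "bar", "dog", "eye"]
    print(candidates([1, 1, 4], vocab))  # ['cat', 'bar'] -- prediction would rank these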


[Screenshots: GazeTalk v5; GazeTalk v5 linked with Dasher]

Read more about GazeTalk or view the GazeTalk manual (PDF)

Short manual on data recording in GazeTalk (PDF)

GazeTalk Videos

Download GazeTalk

Eye tracking using a web camera

From the French ESIEA school of engineering comes a custom-developed eye tracker using a simple web camera. It has head-tracking capabilities and works in low-light situations.



More information:
Hubert Wassner's blog (professor of computer science), in French / English (automatic translation)

Sunday, July 6, 2008

Eye tracking in space

The Eye Tracking Device (ETD) is used to determine the influence of prolonged microgravity and the accompanying vestibular (inner ear) adaptation on the orientation of Listing's Plane (a coordinate framework used to define the movement of the eyes in the head).

"The working hypothesis is that in microgravity the orientation of Listings Plane is altered, probably to a small and individually variable degree. Further, with the loss of the otolith-mediated gravitational reference, it is expected that changes in the orientation of the coordinate framework of the vestibular system occur, and thus a divergence between Listing?s Plane and the vestibular coordinate frame should be observed. While earlier ground-based experiments indicate that Listing?s Plane itself is to a small degree dependent on the pitch orientation to gravity, there is more compelling evidence of an alteration of the orientation of the vestibulo-ocular reflex (VOR), reflex eye movement that stabilizes images on the retina during head movement by producing an eye movement in the direction opposite to head movement, thus preserving the image on the center of the visual field, in microgravity.

Furthermore, changes in bodily function with relation to eye movement and spatial orientation that occur during prolonged disturbance of the vestibular system most likely play a major role in the problems with balance that astronauts experience following re-entry from space.

In view of the much larger living and working space in the ISS, and the extended program of spacewalks (EVAs) being planned, particular care must be given to assessing the reliability of functions related to eye movement and spatial orientation.

The performance of the experiments in space is therefore of interest for the expected contribution to basic research knowledge and to the improvement and assurance of human performance under weightless conditions."


NASA Image: ISS011E13710 - Cosmonaut Sergei K. Krikalev, Expedition 11 Commander representing Russia's Federal Space Agency, uses the Eye Tracking Device (ETD), a European Space Agency (ESA) payload in the Zvezda Service Module of the International Space Station. The ETD measures eye and head movements in space with great accuracy and precision.

"The ETD consists of a headset that includes two digital camera modules for binocular recording of horizontal, vertical and rotational eye movements and sensors to measure head movement. The second ETD component is a laptop PC, which permits digital storage of all image sequences and data for subsequent laboratory analysis. Listing's Plane can be examined fairly simply, provided accurate three-dimensional eye-in-head measurements can be made. Identical experimental protocols will be performed during the pre-flight, in-flight and post-flight periods of the mission. Accurate three-dimensional eye-in-head measurements are essential to the success of this experiment. The required measurement specifications (less than 0.1 degrees spatial resolution, 200 Hz sampling frequency) are fulfilled by the Eye Tracking Device (ETD)."

More information:
http://www.nasa.gov/mission_pages/station/science/experiments/ETD.html
http://www.spaceflight.esa.int/delta/documents/factsheet-delta-hp-etd.pdf
http://www.energia.ru/eng/iss/researches/medic-65.html

Realtime computer interaction via eye tracking (Dubey, 2004)

Premnath Dubey conducted research on eye tracking and gaze interaction for his master's thesis in 2004 at the Department of Computing at Curtin University in Australia.

Abstract
"This thesis presents a computer vision-based eye tracking system for human computer interaction. The eye tracking system allows the user to indicate a region of interest in a large data space and to magnify that area, without using traditional pointer devices. Presented is an iris tracking algorithm adapted from Camshift; an algorithm originally designed for face or hand tracking. Although the iris is much smaller and highly dynamic. the modified Camshift algorithm efficiently tracks the iris in real-time. Also presented is a method to map the iris centroid, in video coordinates to screen coordinates; and two novel calibration techniques, four point and one-point calibration. Results presented show that the accuracy for the proposed one-point calibration technique exceeds the accuracy obtained from calibrating with four points. The innovation behind the one-point calibration comes from using observed eye scanning behaviour to constrain the calibration process. Lastly, the thesis proposes a non-linear visualisation as an eye-tracking application, along with an implementation."

Download paper as PDF.

Thursday, July 3, 2008

Low cost eye tracking

Marcelo from Argentina has developed a low-cost solution using the Logitech QuickCam Express webcam. The video it produces has a resolution of 352x288 pixels. The camera is mounted close to the eye, with extra illumination from two lamps. Marcelo's crude eye tracker relies on elliptic fitting of the pupil in the visible light spectrum, which differs from most commercial alternatives; those use infrared light to create reflections on the eyeball, typically as a second step to increase accuracy.
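A minimal Python/OpenCV sketch of this kind of visible-light pupil tracking, in the spirit of the approach described (Marcelo's actual code is not available here, so the thresholding steps and parameter values are assumptions):

    import cv2

    def find_pupil(frame_bgr, threshold=40):
        # The pupil is the darkest large blob in a close-up eye image, so a
        # fixed threshold (assumed value) followed by contour extraction and
        # ellipse fitting recovers its center and shape.
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (5, 5), 0)
        _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        largest = max(contours, key=cv2.contourArea)
        if len(largest) < 5:            # cv2.fitEllipse needs at least 5 points
            return None
        return cv2.fitEllipse(largest)  # ((cx, cy), (major, minor), angle)

    cap = cv2.VideoCapture(0)           # e.g. a 352x288 webcam mounted near the eye
    ok, frame = cap.read()
    if ok:
        print(find_pupil(frame))
    cap.release()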

Considering the low resolution of the camera and the simplicity of the setup, the results are noteworthy. I hope to see more development of this!






Wednesday, July 2, 2008

Hot Zone prototype (Wakaruru)

The following video demonstrates a prototype of the Hot Zone system for controlling Windows applications. The Hot Zone is called out by pressing a single hotkey and blinking. The menu can be closed by looking outside the zone and blinking.


Submitted to YouTube by "Wakaruru"

The second video demonstrates how Hot Zone could be used with real-world applications (PowerPoint). The center of the zone is located at the current gaze position when it is called out (like a context menu), and the commands in the five zones depend on the currently selected object (based on what you are looking at). No mouse is needed: the cursor is set to the gaze position via an API. A blink without the hotkey pressed is interpreted as a single mouse click; thus, blinking on the text area starts text editing. The keyboard is used only for typing and for the hotkey that calls out the zone. The eye tracking device used is an ASL EH6000. A sketch of how such a gaze menu might do its hit testing follows after the video.


Submitted to YouTube by "Wakaruru"
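
A hypothetical Python sketch of the hit testing a Hot Zone-style gaze menu might use (the sector geometry, radius, and names are assumptions; the actual Hot Zone implementation is not published): five pie sectors around the call-out point, with anything outside the radius treated as a look-away dismissal.

    import math

    N_ZONES = 5
    RADIUS = 150  # px; assumed size of the call-out menu

    def zone_at(gaze, center):
        # Returns the index (0..4) of the pie sector the gaze falls in, or
        # None when the gaze is outside the menu (a blink there dismisses it).
        dx, dy = gaze[0] - center[0], gaze[1] - center[1]
        if math.hypot(dx, dy) > RADIUS:
            return None
        angle = math.atan2(dy, dx) % (2 * math.pi)
        return int(angle / (2 * math.pi / N_ZONES))

    center = (640, 400)                  # menu opens at the current gaze position
    print(zone_at((700, 380), center))   # sector being looked at -> 4
    print(zone_at((900, 900), center))   # None -> a blink here would close the menu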

Sunday, June 22, 2008

Lund Eye Tracking Academy

From the Lund Eye Tracking Academy (LETA) comes an excellent text on the more practical aspects of eye tracking in research settings.

"This text is about how to record good quality eye-tracking data from commercially available video-oculographic eye-tracking system, and how to derive and use the measures that an eye-tracker can give. The ambition is to cover as many as possible of the measures used in science and applied research. The need for a guide on how to use eye-tracking data has grown during the past years, as an effect of increasing interest in eye-tracking research. Due to the versatility of new measurement techniques and the important role of human vision in all walks of life, eye-tracking is now used in reading research, neurology, advertisement studies, psycholinguistics, human factors, usability studies, scene perception, and many other fields "

Download the document (PDF, 93 pages, 19Mb)

Tuesday, June 17, 2008

Open Source Eye Tracking

Zafer Savas is an electronics engineer living in Ankara, Turkey, who has released an eye tracker implemented in C++ using the OpenCV library. The project is well documented and enables crude DIY eye tracking within minutes, which makes it a good starting point. It contains a sample database of eye images used to train the algorithm, and the program has a function to capture images so that users can build their own database. Great!

Download the executable as well as the source code. See the extensive documentation.

You will also need to download the OpenCV library.

"The purpose of the project is to implement a real-time eye-feature tracker with the following capabilities:

  • RealTime face tracking with scale and rotation invariance
  • Tracking the eye areas individually
  • Tracking eye features
  • Eye gaze direction finding
  • Remote controlling using eye movements

The implemented project consists of three components:

  1. Face detection: Performs scale invariant face detection
  2. Eye detection: Both eyes are detected as a result of this step
  3. Eye feature extraction: Features of eyes are extracted at the end of this step

Two different methods for face detection were implemented in the project:

  1. Continuously Adaptive Mean-Shift (Camshift) Algorithm
  2. Haar Face Detection method

Two different methods of eye tracking were implemented in the project:

  1. Template-Matching
  2. Adaptive EigenEye Method"

[Schematic of the detection and tracking procedure/algorithms]
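
The original project is written in C++; as a minimal Python/OpenCV sketch of its Haar branch (using OpenCV's bundled cascade files, with the parameter values being assumptions), the face is detected first and the eyes are then searched only within the upper face region:

    import cv2

    # OpenCV ships these cascade files with its Python package.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    def detect_eyes(frame_bgr):
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        eyes_found = []
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
            # Eyes sit in the upper half of the face, so restrict the search.
            roi = gray[y:y + h // 2, x:x + w]
            for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
                eyes_found.append((x + ex, y + ey, ew, eh))  # frame coordinates
        return eyes_found

    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        print(detect_eyes(frame))
    cap.release()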

Wednesday, June 11, 2008

Response to demonstration video

The response to my video demonstration has been overwhelming, with several articles and over 7,000 views during the first three days (google "neovisus"). There has been interest from the gaming community, human-computer interaction researchers, users and caretakers in the assistive technology sector, and from medical and scientific fields such as radiology.

I wish to thank everyone for their feedback. At times, during the long days and nights in the lab, I wondered whether this would be just another paper in the pile. The response has proven the interest and motivates continued work. Right now I feel that this is just the beginning, and I am determined to take it to the next level.

Please do not hesitate to contact me with ideas and suggestions.

Thank you.