Sunday, June 22, 2008

Lund Eye Tracking Academy

From the Lund Eye Tracking Academy (LETA) comes an excellent text on the more practical aspects of eye tracking in research settings.

"This text is about how to record good quality eye-tracking data from commercially available video-oculographic eye-tracking system, and how to derive and use the measures that an eye-tracker can give. The ambition is to cover as many as possible of the measures used in science and applied research. The need for a guide on how to use eye-tracking data has grown during the past years, as an effect of increasing interest in eye-tracking research. Due to the versatility of new measurement techniques and the important role of human vision in all walks of life, eye-tracking is now used in reading research, neurology, advertisement studies, psycholinguistics, human factors, usability studies, scene perception, and many other fields "

Download the document (PDF, 93 pages, 19 MB)

Tuesday, June 17, 2008

Open Source Eye Tracking

Zafer Savas is an electronics engineer living in Ankara, Turkey, who has released an eye tracker implemented in C++ using the OpenCV library. The project is well documented and enables crude DIY eye tracking within minutes, which makes it a good starting point. It contains a sample database of eye images used to train the algorithm, and the program has a function to capture images so that users can build their own database. Great!

Download the executable as well as the source code. See the extensive documentation.

You will also need to download the OpenCV library.

"The purpose of the project is to implement a real-time eye-feature tracker with the following capabilities:

  • Real-time face tracking with scale and rotation invariance
  • Tracking the eye areas individually
  • Tracking eye features
  • Eye gaze direction finding
  • Remote controlling using eye movements

The implemented project consists of three components:

  1. Face detection: Performs scale invariant face detection
  2. Eye detection: Both eyes are detected as a result of this step
  3. Eye feature extraction: Features of eyes are extracted at the end of this step

Two different methods for face detection were implemented in the project:

  1. Continuously Adaptive Mean-Shift Algorithm
  2. Haar Face Detection method

Two different methods of eye tracking were implemented in the project:

  1. Template-Matching
  2. Adaptive EigenEye Method
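Of the two tracking methods, template matching is the simpler to illustrate. Below is a minimal Python sketch (not the project's actual C++/OpenCV implementation, which would use OpenCV's built-in template-matching routines): slide the eye template over the frame and keep the position with the lowest sum of squared differences.

```python
# Minimal template-matching sketch, assuming grayscale frames as 2D
# lists of intensities. Illustrative only; a real tracker would use
# OpenCV's optimized template matching instead.

def ssd(patch, template):
    """Sum of squared differences between two equal-sized patches."""
    return sum((p - t) ** 2
               for prow, trow in zip(patch, template)
               for p, t in zip(prow, trow))

def match_template(frame, template):
    """Return (row, col) of the best-matching position of template in frame."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            patch = [row[c:c + tw] for row in frame[r:r + th]]
            score = ssd(patch, template)
            if score < best:
                best, best_pos = score, (r, c)
    return best_pos
```

The same idea scales to the eye region: the template is a small crop of the eye from a previous frame, and the search window is restricted to the neighborhood of the last known position.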

Schematics for the procedure/algorithms

Wednesday, June 11, 2008

Response to demonstration video

The response to my video demonstration has been overwhelming, with several articles and over 7,000 views during the first three days (google neovisus). There has been interest from the gaming community, Human–Computer Interaction researchers, users and caretakers in the assistive technology sector, and medical and scientific applications such as radiology.

I wish to thank everyone for their feedback. At times during the long days and nights in the lab I've been wondering whether this would be just another paper in the pile. The response has proven the interest and gives motivation for continued work. Right now I feel that this is just the beginning, and I am determined to take it to the next level.

Please do not hesitate to contact me with ideas and suggestions.

Thank you.

Monday, June 9, 2008

Video demonstration online

The associated video demonstration for my master's thesis on gaze interaction is now online!

Click on the image to view or download the video.

Thesis release

My master's thesis on gaze interaction, the NeoVisus prototype, is now out! Download it by clicking on the image below.

Thursday, June 5, 2008

MedioVis at University of Konstanz

German student Simon Fäh uses speech recognition and eye tracking to achieve more fluid control of the visual information-seeking system MedioVis. (Video in German.)

Tuesday, June 3, 2008

Eye typing at the Bauhaus University of Weimar

The Psychophysiology and Perception group, part of the Faculty of Media at the Bauhaus University of Weimar, is conducting research on gaze-based text entry. Their past research projects include the Qwerty on-screen dwell-based keyboard, IWrite, pEYEWrite, and StarWrite. Thanks to Mario Urbina for the notification.

"Qwerty is based on dwell time selection. Here the user has to stare for 500 ms a determinate character to select it. QWERTY served us, as comparison base line for the new eye typing systems. It was implemented in C++ using QT libraries."

"A simple way to perform a selection based on saccadic movement is to select an item by looking at it and confirm its selection by gazing towards a defined place or item. Iwrite is based on screen buttons. We implemented an outer frame as screen button. That is to say, characters are selected by gazing towards the outer frame of the application. This lets the text window in the middle of the screen for comfortable and safe text review. The order of the characters, parallel to the display borders, should reduce errors like the unintentional selection of items situated in the path as one moves across to the screen button.The strength of this interface lies on its simplicity of use. Additionally, it takes full advantage of the velocity of short saccade selection. Number and symbol entry mode was implemented for this editor in the lower frame. Iwrite was implemented in C++ using QT libraries."
"Pie menus have already been shown to be powerful menus for mouse or stylus control. They are two-dimensional, circular menus, containing menu items displayed as pie-formed slices. Finding a trade-off between user interfaces for novice and expert users is one of the main challenges in the design of an interface, especially in gaze control, as it is less conventional and utilized than input controlled by hand. One of the main advantages of pie menus is that interaction is very easy to learn. A pie menu presents items always in the same position, so users can match predetermined gestures with their corresponding actions. We therefore decided to transfer pie menus to gaze control and try it out for an eye typing approach. We designed the Pie menu for six items and two depth layers. With this configuration we can present (6 x 6) 36 items. The first layer contains groups of five letters ordered in pie slices.."

"In StarWrite, selection is also based on saccadic movements to avoid dwell times. The idea of StarWrite is to combine eye-typing movements with feedback. Users, mostly novices, tend to look at the text field after each selection to check what has been written. Here, letters are typed by dragging them into the text field. This provides instantaneous visual feedback and should spare the checking saccades towards the text field. When a character is fixated, both it and its neighbours are highlighted and enlarged in order to facilitate character selection. In order to use x- and y-coordinates for target selection, letters were arranged alphabetically on a half-circle in the upper part of the monitor. The text window appears in the lower field. StarWrite provides lower-case, upper-case, and numerical entry modes, which can be switched by fixating the corresponding buttons, situated in the lower part of the application, for 500 milliseconds. The space, delete, and enter keys are also placed there and are driven by a 500 ms dwell time too. StarWrite was implemented in C++ using the OpenGL libraries for the visualization."
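The half-circle arrangement is straightforward to compute: the 26 letters are spread alphabetically over the upper arc, leaving the lower field for the text window. A sketch, where the centre, radius, and screen coordinates are illustrative assumptions rather than StarWrite's actual values:

```python
# Sketch of a StarWrite-style layout: letters A..Z placed alphabetically
# on the upper half-circle. Screen y grows downwards, so the arc spans
# angles 180..0 degrees. cx, cy, radius are assumed parameters.

import math
import string

def star_layout(cx, cy, radius):
    """Return {letter: (x, y)} positions on the upper half-circle."""
    letters = string.ascii_uppercase
    positions = {}
    for i, ch in enumerate(letters):
        angle = math.pi * (1 - i / (len(letters) - 1))  # pi .. 0
        x = cx + radius * math.cos(angle)               # left to right
        y = cy - radius * math.sin(angle)               # above the centre
        positions[ch] = (x, y)
    return positions
```

With this geometry, 'A' sits at the left end of the arc, 'Z' at the right, and the x/y position of a fixation alone identifies the target letter.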

Associated publications
  • Huckauf, A. and Urbina, M. H. 2008. Gazing with pEYEs: towards a universal input for various applications. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (Savannah, Georgia, March 26 - 28, 2008). ETRA '08. ACM, New York, NY, 51-54. [URL] [PDF] [BIB]
  • Urbina, M. H. and Huckauf, A. 2007. Dwell time free eye typing approaches. In Proceedings of the 3rd Conference on Communication by Gaze Interaction - COGAIN 2007, September 2007, Leicester, UK, 65-70. Available online at [PDF] [BIB]
  • Huckauf, A. and Urbina, M. 2007. Gazing with pEYE: new concepts in eye typing. In Proceedings of the 4th Symposium on Applied Perception in Graphics and Visualization (Tubingen, Germany, July 25 - 27, 2007). APGV '07, vol. 253. ACM, New York, NY, 141-141. [URL] [PDF] [BIB]
  • Urbina, M. H. and Huckauf, A. 2007. pEYEdit: Gaze-based text entry via pie menus. In Conference Abstracts. 14th European Conference on Eye Movements ECEM2007. Kliegl, R. & Brenstein, R. (Eds.) (2007), 165-165.

Sunday, June 1, 2008

Project finalization and thesis defense

Last Thursday I held the defense of my master's thesis at the Cognitive Science department at Lund University. I wish to thank Dr. Björn Samuelsson for his excellent, and well beyond expected, review of the thesis. Björn comes from a background in theoretical physics / complex systems, with international recognition for his work on Random Boolean Networks. I greatly appreciate his attention to detail, which raises the overall outcome of my project, hopefully to the level of international scientific publication.

I've put together a video demonstration that will be released this week together with the thesis. It demonstrates the UI components and the prototype interface and will be posted to the major video-sharing networks (YouTube etc.).