Monday, April 28, 2008

SWAET Conference

The first day of the SWAET 2008 conference at Lund University was filled with interesting presentations and inspiring conversations. Finally I had a chance to meet some of the researchers I've known mainly by their publications and last names.

Three talks from the day-one schedule stand out in the field of gaze interaction. These were recorded on video with the intention of future online distribution. For now, enjoy these papers.
The conference was attended by three manufacturers, SMI, SmartEye, and Tobii, all of which had systems on display.

The SMI system demonstrated was brought up from the dark dungeons of the lab to host the prototype I have been working on for the last couple of months. In general, it was well received and served its purpose as an eye-catching demonstration of my "next-gen" gaze interface. To sum up, it was great to get out there and gather feedback confirming that I'm on the right track (doubts sometimes arise when working solo). The day ended with a lovely dinner at the university's finest dining hall.

A big thank you goes out to Kenneth Holmqvist, Jana Holsanova, Philip Diderichsen, Nils Holmberg, Richard Andersson, and Janna Spanne for hosting this event.

Monday, April 21, 2008

Open invitation to participate in the evaluation

The invitation to participate in the evaluation of the prototype is out. If you are able to participate, I would be most thankful for your time. The entire test takes roughly 30 minutes.


The invitation can be downloaded as a PDF.

If you wish to participate, send me a message. Thank you.

More information on the procedure and structure of the evaluation, as well as the experience gained, will be posted once the testing is completed. The final results will be published in my master's thesis by the end of May. Stay tuned.

Wednesday, April 16, 2008

Alea Technologies IntelliGaze demo

A demonstration of the IG30Pro remote eye tracker used to play a first-person shooter game. Notice the algorithms used to stabilize the pointer and filter out the jitter usually associated with eye trackers.
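I don't know what Alea's filter actually looks like, but the general recipe is well known: smooth the gaze samples while the eye is fixating, and reset the filter when a saccade is detected, so the pointer sits still without lagging behind real eye movements. A minimal sketch of that idea (the threshold and smoothing factor are my own made-up values, not Alea's):

```python
# A minimal sketch of saccade-aware pointer smoothing (my assumption of the
# general approach, not Alea's actual algorithm). Gaze samples within a
# fixation are averaged to suppress jitter; a large jump (saccade) resets
# the filter so the pointer still responds instantly to real eye movements.
import math

class GazeSmoother:
    def __init__(self, saccade_threshold_px=80.0, alpha=0.1):
        self.threshold = saccade_threshold_px  # jump size treated as a saccade (assumed value)
        self.alpha = alpha                     # smoothing factor within a fixation
        self.x = None
        self.y = None

    def update(self, gx, gy):
        """Feed one raw gaze sample, get back a stabilized pointer position."""
        if self.x is None or math.hypot(gx - self.x, gy - self.y) > self.threshold:
            # Saccade (or first sample): jump directly to the new position.
            self.x, self.y = gx, gy
        else:
            # Fixation: exponential moving average suppresses the jitter.
            self.x += self.alpha * (gx - self.x)
            self.y += self.alpha * (gy - self.y)
        return self.x, self.y

smoother = GazeSmoother()
for raw in [(500, 400), (505, 397), (498, 403), (900, 150), (903, 149)]:
    print(smoother.update(*raw))
```

The reset on saccades is the important part; a plain moving average would make the pointer feel sluggish exactly when the eye moves with intent.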


Tuesday, April 15, 2008

Alea Technologies

The German firm Alea Technologies offers a solution called IntelliGaze, consisting of a remote eye tracking system as well as a software suite. The system is designed to be a flexible solution in terms of both hardware and software. The eye tracker is stand-alone and can be used to create a customizable setup when combined with various displays. Additionally, they provide APIs for application development.

A clear use case for their technology is users with disabilities such as ALS. The software contains a "desktop" system which acts as a launcher for other applications (native Windows applications, Grid, COGAIN). In general, they seem to be targeting Tobii Technology, which has been very successful with its MyTobii application running on the P10 eye tracker. The game is on.


Quote:
"The ease of setup and intuitive operation bring a completely new degree of freedom to patients who had to rely on older technologies like manual scanning or cumbersome pointing devices. By using the latest camera technology and very sophisticated image processing and calibration methods, the IG-30 system is far superior to alternative gaze input systems at a similar price point. The modular structure of the system allows the patients with a degenerative disease the continued use of their touch-screen or scanning software package. Using existing computers and monitors, the IG-30 system can also be easily integrated into an existing PC setup. The economic pricing of the IntelliGazeTM systems opens this high-tech solution to a much wider range of users."

Tracking technology: hybrid infrared video eye & head tracking, binocular & monocular
Working volume: 300 x 200 x 200 mm (W x H x D), centered at 600 mm distance
Accuracy (static): 0.5°, typical
Accuracy (over full working volume): 1°, typical
Sampling rate: 50 Hz
Max. head-movement velocity: 15 cm/s
Recovery time after tracking loss (head moved too fast or out of range): 40 ms
System dimensions: ca. 300 x 45 x 80 mm (W x H x D)
Mounting options: on monitor via VESA adapter, on Tablet PC via customized interfaces
System weight: ca. 1.2 kg

Gaze Interaction Demo (Powerwall@Konstanz Uni.)

During the last few years quite a few wall-sized displays have been used for novel interaction methods. Often these have been used with multi-touch, such as Jeff Han's FTIR technology. This is the first demonstration I have seen where eye tracking is used for a similar purpose. A German Ph.D. candidate, Jo Bieg, is working on this at the HCI department of the University of Konstanz. The Powerwall is 5.20 x 2.15 m and has a resolution of 4640 x 1920 pixels.
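Out of curiosity, here is a back-of-the-envelope calculation of what a typical 1° tracker accuracy means on a display of this size (the 2 m viewing distance is my own assumption):

```python
# Back-of-the-envelope: how large is a 1 degree gaze error on the Konstanz
# Powerwall? The 2 m viewing distance is an assumed figure for illustration.
import math

wall_width_m, wall_px = 5.20, 4640       # physical width and horizontal resolution
viewing_distance_m = 2.0                 # assumption
error_deg = 1.0                          # typical remote-tracker accuracy

error_m = 2 * viewing_distance_m * math.tan(math.radians(error_deg) / 2)
error_px = error_m * (wall_px / wall_width_m)
print(f"{error_deg} deg ~ {error_m * 100:.1f} cm ~ {error_px:.0f} px on the wall")
# ~3.5 cm, roughly 31 px: gaze can point at regions, not individual pixels.
```

In other words, gaze is well suited for selecting areas and objects on a wall display, while pixel-precise pointing still needs another modality.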



The demonstration can be viewed in better quality (10 MB).

Also make sure to check out the 360° Globorama display demonstration. It does not use eye tracking for interaction but a laser pointer. Nevertheless, it is a really cool immersive experience, especially the Google Earth zoom into 360° panoramas.

Tuesday, April 8, 2008

Inspiration: EyeMusic & EyeFollower by Interactive Minds

The German-based lab Interactive Minds provides the EyeMusic interface, which lets you play songs. Not much information is available except the screenshot below.


However, their eye tracker, the EyeFollower, seems more impressive: 120 Hz sampling and 0.45° accuracy. Furthermore, it allows for larger and faster head movements than most other remote-based systems. This is really important for making the tracking flexible, which is what you want from a remote-based system in the first place. People move around; we all change posture over time. Imagine sitting at a desk, then turning to the side to take notes or talk on the phone. When you return to the computer it should instantaneously continue to track your gaze, without any noticeable delay or having to re-calibrate. These are not easy problems to solve, but they are necessary for the advancement of eye tracking and gaze interaction. Interactive Minds provides a demonstration video of their EyeFollower. Looks great.

Sunday, April 6, 2008

Inspiration: Looking my Way through the Menu: The Impact of Menu Design and Multimodal Input on Gaze-based Menu Selection

As discussed earlier on my blog, the differences between gaze-based and mouse-based interaction call for interfaces that are specifically designed for the purpose. A group of German researchers presents a novel approach based on a radial menu layout. The work has been carried out by Yvonne Kammerer and Katharina Scheiter, both at the Knowledge Media Research Center, University of Tuebingen, in conjunction with Wolfgang Beinhauer at the Fraunhofer Institute for Industrial Engineering, Stuttgart.

My own prototype contains UI elements that are based on the same style of interaction. However, this work goes further towards a multi-level menu system, while my component aims more for a quick one-level saccade selection. The advantages of the radial layout in a gaze-based menu component are discussed in this paper. Interesting concept; looking forward to the presentation at the SWAET 2008 conference.
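For those curious about the mechanics, the core of such a radial gaze menu is just mapping the angle of the gaze point around the menu center to a sector and confirming the selection with a dwell timeout. A minimal sketch of the idea (the menu items, layout, and dwell time are my own illustrative assumptions, not taken from the paper):

```python
# Minimal sketch of gaze-driven radial menu selection: the angle of the gaze
# point relative to the menu center picks a sector, and a dwell timer confirms
# it. Layout and dwell time are illustrative assumptions, not the paper's.
import math

ITEMS = ["Open", "Save", "Copy", "Paste", "Close"]  # hypothetical menu items
CENTER = (512, 384)          # menu center in screen pixels
DWELL_S = 0.4                # dwell time before a sector is selected (assumed)
SPAN = math.pi               # semi-circle above the center, as the paper favors

def sector_at(gx, gy):
    """Return the index of the menu item under the gaze point, or None."""
    dx, dy = gx - CENTER[0], CENTER[1] - gy   # screen y grows downward
    angle = math.atan2(dy, dx)                # 0 = right, pi = left
    if not (0.0 <= angle <= SPAN):
        return None                           # gaze is below the semi-circle
    return min(int(angle / (SPAN / len(ITEMS))), len(ITEMS) - 1)

def select(samples):
    """samples: (timestamp, x, y) gaze stream; returns the first dwelled item."""
    current, since = None, None
    for t, x, y in samples:
        s = sector_at(x, y)
        if s != current:
            current, since = s, t             # gaze moved to another sector
        elif s is not None and t - since >= DWELL_S:
            return ITEMS[s]                   # dwell completed
    return None

stream = [(0.0, 700, 300), (0.2, 705, 295), (0.5, 702, 298)]
print(select(stream))  # gaze up-and-right of center -> "Open"
```

The appeal of the semi-circle layout is visible even in this toy: every item sits at the same angular distance from the center, so a single saccade reaches any of them.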

Abstract
"In this paper a study is reported, which investigates the effectiveness of two approaches to improving gaze-based interaction for realistic and complex menu selection tasks. The first approach focuses on identifying menu designs for hierarchical menus that are particularly suitable for gaze-based interaction, whereas the second approach is based on the idea of combining gaze-based interaction with speech as a second input modality. In an experiment with 40 participants the impact of menu design, input device, and navigation complexity on accuracy and completion time in a menu selection task as well as on user satisfaction were investigated. The results concerning both objective task performance and subjective ratings confirmed our expectations in that a semi-circle menu was better suited for gaze-based menu selection than either a linear or a full-circle menu. Contrary to our expectations, an input device solely based on eye gazes turned out to be superior to the combined gaze- and speech-based device. Moreover, the drawbacks of a less suitable menu design (i.e., of a linear menu or a full-circle menu) as well as of the multimodal input device particularly obstructed performance in the case of more complex navigational tasks." Download paper as pdf.




Tuesday, April 1, 2008

Eye Tracking, Algorithms and Mathematical Modelling (Leimberg & Vester-Christensen, 2005)

An extensive and technical publication on eye tracking algorithms by D. Leimberg and M. Vester-Christensen at the Department of Informatics and Mathematical Modelling at the Technical University of Denmark. The thesis contains a range of approaches and a discussion of their implementation. It is rich with illustrations and examples of the results of the various methods.

Abstract
"This thesis presents a complete system for eye tracking avoiding restrictions on head movements. A learning-based deformable model - Active Appearance Model (AAM) - is utilized for detection and tracking of the face. Several methods are proposed, described and tested for eye tracking, leading to determination of gaze. The AAM is used for a segmentation of the eye region, as well as providing an estimate of the pose of the head.

Among several, we propose a deformable template based eye tracker, combining high speed and accuracy, independently of the resolution. We compare with a state of the art active contour approach, showing that our method is more accurate. We conclude, that eye tracking using standard consumer cameras is feasible providing an accuracy within the measurable range." Download paper as pdf.
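As a side note, the flavor of template matching behind such trackers is easy to illustrate: score a dark circular template against the image and keep the best center and radius. The toy below is my own simplified sketch of that general idea; the deformable templates in the thesis are considerably more refined:

```python
# Toy illustration of the template idea behind pupil localization (my own
# simplified sketch; the thesis' deformable template is far more refined).
# A dark circular template is scored at every candidate position, and the
# best-scoring center/radius is taken as the pupil. Exhaustive and slow,
# but fine for a toy.
import numpy as np

def find_pupil(img, radii=range(8, 14)):
    """Exhaustive dark-circle search; img is a 2D grayscale array."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    best = (None, None, None, -np.inf)
    for r in radii:
        for cy in range(r, h - r):
            for cx in range(r, w - r):
                d2 = (ys - cy) ** 2 + (xs - cx) ** 2
                inside = img[d2 <= r * r]
                ring = img[(d2 > r * r) & (d2 <= (r + 3) ** 2)]
                score = ring.mean() - inside.mean()   # dark disc, bright surround
                if score > best[3]:
                    best = (cx, cy, r, score)
    return best[:3]

# Synthetic test: a dark pupil at (40, 25) with radius 10 on a bright iris.
img = np.full((60, 80), 200.0)
yy, xx = np.mgrid[0:60, 0:80]
img[(yy - 25) ** 2 + (xx - 40) ** 2 <= 100] = 30.0
print(find_pupil(img))  # -> (40, 25, 10)
```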


Following up are two papers (also found at the end of the thesis):
  • M. Vester-Christensen, D. Leimberg, B. K. Ersbøll, L. K. Hansen, Deformable Models for Eye Tracking, Den 14. Danske Konference i Mønstergenkendelse og Billedanalyse, 2005 [full] [bibtex] [pdf]

  • D. Leimberg, M. Vester-Christensen, B. K. Ersbøll, L. K. Hansen, Heuristics for speeding up gaze estimation, Proc. Svenska Symposium i Bildanalys, SSBA 2005, Malmø, Sweden, SSBA, 2005 [full] [bibtex] [pdf]

Novel Eye Gaze Tracking Techniques Under Natural Head Movement (Zhu & Ji, 2007)

This work comes from the Intelligent Systems Lab at Rensselaer Polytechnic Institute, US. Using a stereo setup, Zhiwei Zhu and Qiang Ji are able to produce a head-movement-tolerant eye tracker that calibrates in less than five seconds and provides around 1.6° accuracy at 25 frames per second.
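The stereo setup is the interesting part: seeing the eye from two calibrated cameras gives its 3D position, which is what buys the head-movement tolerance. As a generic illustration of the underlying geometry, here is standard linear triangulation (a textbook building block, not Zhu and Ji's specific pipeline):

```python
# Generic linear triangulation: given the same point seen by two calibrated
# cameras, recover its 3D position. This is the textbook building block that
# stereo eye trackers rely on, not Zhu & Ji's specific algorithm.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) image points."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)      # null vector of A is the homogeneous point
    X = vt[-1]
    return X[:3] / X[3]              # dehomogenize

# Two toy cameras: identical intrinsics, second one shifted 0.1 m along x.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0], [0]])])

eye_pos = np.array([0.02, -0.01, 0.6])   # hypothetical eye center, 60 cm away
project = lambda P, X: (P @ np.append(X, 1))[:2] / (P @ np.append(X, 1))[2]
print(triangulate(P1, P2, project(P1, eye_pos), project(P2, eye_pos)))
# -> [ 0.02 -0.01  0.6 ]
```

Once the eye center is known in 3D, the gaze direction can be compensated for head translation, which is exactly what single-camera systems struggle with.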

Accuracy illustration. Targets in blue. Tracked fixations in red.



Zhiwei Zhu and Qiang Ji, Novel Eye Gaze Tracking Techniques Under Natural Head Movement, IEEE Transactions on Biomedical Engineering, 54(12), pp. 2246-2260, 2007. Download the paper.

A neural network based real-time gaze tracker

Continuing on the neural-network-based eye trackers from the earlier post, this work was done by Nischal Piratla back in 2001, during his time at Colorado State University. It uses a low-resolution CCD camera in combination with a three-layer backpropagation neural network. Although the frame rate is low, five per second, it's not too bad for running on a Pentium II 266 MHz. However, the horizontal bar on the forehead would not do today =)
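The core idea is simple enough to sketch: flatten the low-resolution eye-region pixels and let a small multilayer perceptron regress the screen coordinates. Below is a minimal numpy version of a three-layer backpropagation network in that spirit; the layer sizes, learning rate, and dummy training data are my assumptions, not Piratla's:

```python
# Minimal three-layer backpropagation network regressing gaze (x, y) from a
# flattened low-resolution eye image, in the spirit of Piratla's tracker.
# Layer sizes, learning rate, and the random training data are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 20 * 10, 16, 2      # 20x10 eye patch -> (x, y) on screen

W1 = rng.normal(0, 0.1, (n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_out))
b2 = np.zeros(n_out)

def forward(X):
    h = np.tanh(X @ W1 + b1)                # hidden layer
    return h, h @ W2 + b2                   # linear output = gaze estimate

# Dummy training set: random "eye images" with a planted linear gaze mapping.
X = rng.normal(size=(500, n_in))
true_map = rng.normal(size=(n_in, n_out)) * 0.05
Y = X @ true_map

lr = 0.01
for epoch in range(200):                    # plain batch gradient descent
    h, pred = forward(X)
    err = pred - Y                          # dLoss/dpred for 0.5 * MSE
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)        # backpropagate through tanh
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("final MSE:", float(((forward(X)[1] - Y) ** 2).mean()))
```

The per-user training the abstract mentions maps naturally onto this: calibrate by collecting eye images while the user looks at known screen points, then train on those pairs.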

Abstract
"A real-time gaze-tracking system that estimates the user's eye gaze and computes the window of focused view on a computer monitor has been developed. This artificial neural network based system can be trained and customized for an individual. Unlike existing systems in which skin color features and/or other mountable equipment are needed, this system is based on a simple non-intrusive camera mounted on the monitor. Gaze point is accurately estimated within a 1 in. on a 19-in. monitor with a CCD camera having a 640 × 480 image resolution. The system performance is independent of user's forward and backward as well as upward and downward movements. The gaze-tracking system implementation and the factors affecting its performance are discussed and analyzed in detail. The features and implementation methods that make this system real-time are also explained."

Download paper as pdf