Wednesday, January 21, 2009

Wearable EOG Goggles: Eye-Based Interaction in Everyday Environments

Andreas Bulling in the Wearable Computing Group at the Swiss Federal Institute of Technology (ETH) is working on a new Electrooculography (EOG) based eye tracking system. The technology relies on the small but measurable standing electrical potential between the cornea and the retina: as the eyes rotate, the field picked up by a set of electrodes attached to the skin around the eyes changes, and after signal processing this data can be used for controlling computer interfaces or other devices. The obvious advantage of this method of eye tracking compared to the more traditional corneal-reflection, video-based methods is that it is not sensitive to sunlight and may therefore be used outdoors. However, to my knowledge it provides lower accuracy, which is why most EOG interfaces rely on eye gestures rather than gaze fixations.
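For a rough idea of how such signals become interface commands, here is a minimal sketch (my own illustration, not Bulling's pipeline) that detects horizontal saccades by thresholding the derivative of a single EOG channel; sequences of detected left/right saccades could then be matched against a small gesture alphabet. The sampling rate and threshold are assumed values that would need per-user calibration.

```python
# Minimal sketch of saccade detection from one horizontal EOG channel.
# My own illustration, not the ETH system; SAMPLE_RATE and THRESHOLD
# are assumed values needing calibration per user and electrode setup.
import numpy as np

SAMPLE_RATE = 250  # Hz (assumed)
THRESHOLD = 40.0   # signal units per sample step (assumed)

def detect_saccades(eog_h: np.ndarray) -> list[tuple[int, str]]:
    """Return (sample_index, direction) pairs for horizontal saccades."""
    # Short moving average to suppress electrode and mains noise.
    smoothed = np.convolve(eog_h, np.ones(5) / 5, mode="same")
    # A saccade appears as a step change, i.e. a spike in the derivative.
    velocity = np.diff(smoothed)
    events, i = [], 0
    while i < len(velocity):
        if abs(velocity[i]) > THRESHOLD:
            events.append((i, "right" if velocity[i] > 0 else "left"))
            i += SAMPLE_RATE // 10  # skip ~100 ms so one saccade fires once
        else:
            i += 1
    return events
```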

"We want to introduce the paradigm of visual perception and investigations on eye movements as new methods to implement novel and complement current context-aware systems. Therefore, we will investigate the potential but also possible limitations of using eye movements to perform context and activity recognition in wearable settings. Besides recognizing individual activities another focus will be put on long-term eye movement analysis." More information.

Andreas recently had a paper accepted for the CHI 2009 conference in Boston (April 4-9), where the system will be demonstrated during the Interactivity session. Andreas and the team at ETH are planning to investigate attentive user interfaces (AUIs) in mobile settings using wearable systems, such as the prototype demonstrated in the video below.

View on YouTube

SMI gets the International Forum Design Award

Congratulations to the guys at SensoMotoric Instruments (SMI) for winning the International Forum 2009 Product Design Award with their iView X™ RED eye tracker.

"The unobtrusive yet elegant design for the stand-alone as well as for the monitor-attached configuration of the eye tracking system convinced the jury. "

The award will be presented on the first day of CeBIT (March 3rd) in Hanover, where the system will also be on display for those of you who are attending. More information on the International Forum Award.

Saturday, January 3, 2009

The Argentinian Eye Mouse software released (Amaro & Ponieman)

Nicolás Amaro and Nicolás Ponieman at ORT Argentina recently received the Argentine-German Chamber of Industry and Trade's Award for Innovation 2008 for their work on a low-cost, webcam-based, head-mounted corneal reflection solution. Best of all, the software can be downloaded, which will directly benefit those who are in need but cannot afford the state-of-the-art systems currently on the market. As demonstrated by the video below, it is capable of running grid-based interfaces, so it should be adequate for GazeTalk and similar applications.
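Grid-based layouts are forgiving of modest accuracy because a selection only has to land in the right cell, not on an exact pixel. As a hypothetical illustration (not the Eye Mouse code, and with the 4x3 layout an assumption), mapping a raw gaze estimate to a cell might look like this:

```python
# Illustrative sketch (not the Eye Mouse code): grid interfaces tolerate
# low accuracy because the gaze estimate only has to fall in the right
# cell. The 4x3 layout is an assumption in the spirit of GazeTalk.
def gaze_to_cell(x: float, y: float, screen_w: int, screen_h: int,
                 cols: int = 4, rows: int = 3) -> tuple[int, int]:
    """Map a raw gaze point in pixels to a (col, row) grid cell."""
    col = min(max(int(x / screen_w * cols), 0), cols - 1)
    row = min(max(int(y / screen_h * rows), 0), rows - 1)
    return col, row
```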

View on YouTube

Friday, January 2, 2009

An Unobtrusive Method for Gaze Tracking (N. Chitrik & Y. Schwartzburg)

Nava Chitrik and Yuliy Schwartzburg have, in partial fulfillment of their senior design project requirements, constructed a low-cost approach to remote eye tracking at the Cooper Union for the Advancement of Science and Art, Electrical Engineering Department.

"The line of a person's gaze is known to have many important applications in artificial intelligence (AI) and video conferencing but determining where a user is looking is still a very challenging problem. Traditionally, gaze trackers have been implemented with devices worn around the user's head, but more recent advances in the field use unobtrusive methods, i.e. an external video camera, to obtain information about where a person is looking. We have developed a simplified gaze tracking system using a single camera and a single point source mounted compactly in the view of the user, a large simplification over previous methods which have used a plurality of each. Furthermore, our algorithms are robust enough to allow head motion and our image processing functions are designed to extract data even from low-resolution or noisy video streams. Our system also has the computational advantage of working with very small image sizes, reducing the amount of resources needed for gaze tracking, freeing them up for applications that might utilize this information.

To reiterate: The main differences between this implementation and similar implementations are that this system uses a histogram method as opposed to edge detection to work with very low resolution video extremely quickly. However, it requires an infrared camera and infrared LEDs (which can be purchased for less than 25 dollars online)."
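To make the histogram idea concrete, here is an illustrative sketch (mine, not the authors' code): under IR illumination the pupil is the darkest region of the frame, so a threshold can be read straight off the intensity histogram and the pupil located as the centroid of the pixels below it, which stays cheap even on tiny, noisy frames. The pupil_fraction parameter is an assumed tuning value.

```python
# Illustrative histogram-based pupil localization; my sketch, not the
# authors' implementation. Instead of edge detection, a threshold is
# derived from the intensity histogram, which stays fast at very low
# resolutions and degrades gracefully on noisy frames.
import numpy as np

def find_pupil_center(gray: np.ndarray, pupil_fraction: float = 0.02):
    """Estimate the pupil center as the centroid of the darkest pixels.

    gray: 2-D uint8 grayscale frame from the IR camera.
    pupil_fraction: assumed fraction of the frame the pupil covers.
    """
    hist = np.bincount(gray.ravel(), minlength=256)
    # Walk up the histogram until the cumulative count covers the pupil.
    threshold = int(np.searchsorted(np.cumsum(hist),
                                    pupil_fraction * gray.size))
    ys, xs = np.nonzero(gray <= threshold)
    if len(xs) == 0:
        return None  # e.g. a blink: no sufficiently dark pixels
    return float(xs.mean()), float(ys.mean())
```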

View on YouTube

Monday, December 8, 2008

Journal of Eye Movement Research: Special issue on eye tracking now online

The special issue on "Eye Tracking and Usability Research" in the Journal of Eye Movement Research is now online. It features the following articles:
  • Helmert, J. R., Pannasch, S. & Velichkovsky, B. M. (2008). Eye tracking and Usability Research: an introduction to the special issue (editorial). Download as PDF.

  • Castellini, C. (2008). Gaze Tracking in Semi-Autonomous Grasping. Download as PDF.

  • Helmert, J. R., Pannasch, S. & Velichkovsky, B. M. (2008). Influences of dwell time and cursor control on the performance in gaze driven typing. Download as PDF.

  • Huckauf, A. & Urbina, M. H. (2008). On object selection in gaze controlled environments. Download as PDF.

  • Hyrskykari, A., Ovaska, S., Majaranta, P., Räihä, K.-J. & Lehtinen, M. (2008). Gaze Path Stimulation in Retrospective Think-Aloud. Download as PDF.

  • Pannasch, S., Helmert, J.R., Malischke, S., Storch, A. & Velichkovsky, B.M. (2008). Eye typing in application: A comparison of two systems with ALS patients. Download as PDF.

  • Zambarbieri, D., Carniglia, E. & Robino, C. (2008). Eye Tracking Analysis in Reading Online Newspapers. Download as PDF.

Monday, November 24, 2008

Our gaze controlled robot on the DR News

The Danish National Television "TV-Avisen" episode on our gaze controlled robot was broadcast Friday, November 21st, on the nine o'clock news. Alternative versions (resolutions) of the video clip can be found at the DR site.

View video

Friday, November 21, 2008

Eye movement control of remote robot

Yesterday we demonstrated our gaze navigated robot at the Microsoft Robotics event here at ITU Copenhagen. The "robot" transmits a video stream which is displayed on a client computer. By using an eye tracker we can direct the robot towards where the user is looking. The concept allows for human-machine interaction with a direct mapping of the user's intention. The Danish National TV (DR) came by today and recorded a demonstration, which will be shown tonight on the nine o'clock news. Below is a video that John Paulin Hansen recorded yesterday which demonstrates the system. Note that the frame rate of the video stream was well below average at the time of recording; it worked better today, and in the coming week we'll look into alternative solutions (suggestions appreciated). The project has been carried out in collaboration with Alexandre Alapetite from DTU. His low-cost, LEGO-based rapid mobile robot prototype opens up interesting possibilities for testing human-computer and human-robot interaction.
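As a rough illustration of how gaze on the video feed could be mapped to drive commands, here is a hypothetical sketch (not our actual implementation; the dead zone and the look-up-to-drive-forward mapping are my assumptions): looking above the center of the frame drives the robot forward, and horizontal offset steers it.

```python
# Hypothetical mapping from gaze on the video feed to differential
# drive commands; not the actual ITU/DTU implementation. Gaze is
# normalized so (0, 0) is the center of the video frame.
def gaze_to_drive(gaze_x: float, gaze_y: float,
                  max_speed: float = 1.0) -> tuple[float, float]:
    """Return (left_motor, right_motor) power in [-1, 1].

    gaze_x: horizontal offset from frame center, in [-1, 1].
    gaze_y: vertical offset from frame center, in [-1, 1];
            positive means the user looks above the center.
    """
    DEAD_ZONE = 0.1  # ignore fixation jitter near the center (assumed)
    if abs(gaze_x) < DEAD_ZONE and abs(gaze_y) < DEAD_ZONE:
        return 0.0, 0.0
    forward = max(gaze_y, 0.0) * max_speed  # look up = drive forward
    turn = gaze_x * max_speed               # look sideways = turn
    left = max(-1.0, min(1.0, forward + turn))
    right = max(-1.0, min(1.0, forward - turn))
    return left, right
```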



The virgin tour around the ITU office corridor (on YouTube)



Available on YouTube

Tuesday, November 18, 2008

A framework for gaze selection techniques (van Tonder et al., 2008)

Martin van Tonder, Charmain Cilliers and Jean Greyling at the Nelson Mandela Metropolitan University, South Africa presented a framework for gaze selection techniques in the proceedings of the 2008 annual research conference of the South African Institute of Computer Scientists. The framework is platform independent (relying on Java) and supports multiple interaction methods, such as Kumar's EyePoint and popups, as well as data logging and visualization.

Abstract
Experimental gaze interaction techniques are typically prototyped from scratch using proprietary libraries provided by the manufacturers of eye tracking equipment. These libraries provide gaze data interfaces, but not any of the additional infrastructure that is common to the implementation of such techniques. This results in an unnecessary duplication of effort. In this paper, a framework for implementing gaze selection techniques is presented. It consists of two components: a gaze library to interface with the tracker and a set of classes which can be extended to implement different gaze selection techniques. The framework is tracker and operating system independent, ensuring compatibility with a wide range of systems. Support for user testing is also built into the system, enabling researchers to automate the presentation of test targets to users and record relevant test data. These features greatly simplify the process of implementing and evaluating new interaction techniques. The practicality and flexibility of the framework are demonstrated by the successful implementation of a number of gaze selection techniques.
  • van Tonder, M., Cilliers, C., and Greyling, J. 2008. A framework for gaze selection techniques. In Proceedings of the 2008 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries: Riding the Wave of Technology (Wilderness, South Africa, October 06 - 08, 2008). SAICSIT '08, vol. 338. ACM, New York, NY, 267-275. DOI= http://doi.acm.org/10.1145/1456659.1456690
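The two-component architecture the abstract describes, a tracker-facing gaze library plus a set of classes that concrete selection techniques extend, could look roughly like this. The actual framework is written in Java; the Python sketch below and all of its names are my own invention.

```python
# Rough sketch of the architecture described in the abstract: a
# tracker-independent gaze source plus an extensible selection-technique
# base class with built-in logging. The real framework is Java; every
# name here is invented for illustration.
from abc import ABC, abstractmethod

class GazeSource(ABC):
    """Tracker-facing component: one adapter subclass per eye tracker."""
    @abstractmethod
    def next_sample(self) -> tuple[float, float]:
        """Return the latest (x, y) gaze position in screen coordinates."""

class SelectionTechnique(ABC):
    """Base class that concrete gaze selection techniques extend."""
    def __init__(self, source: GazeSource):
        self.source = source
        self.samples: list[tuple[float, float]] = []  # user-test logging

    @abstractmethod
    def on_sample(self, x: float, y: float) -> None:
        """Process one gaze sample; fire a selection when criteria are met."""

    def tick(self) -> None:
        """Pull one sample from the tracker, log it, and dispatch it."""
        x, y = self.source.next_sample()
        self.samples.append((x, y))
        self.on_sample(x, y)
```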

Monday, November 17, 2008

Wearable Augmented Reality System using Gaze Interaction (Park et al., 2008)

Hyung Min Park, Seok Han Lee and Jong Soo Choi from the Graduate School of Advanced Imaging Science, Multimedia & Film at Chung-Ang University, Korea presented a paper on their Wearable Augmented Reality System (WARS) at the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality. They use a half-blink, detected by their custom eye tracking algorithms, for selection (part of a technique they call "Aging"). See the end of the video.

Abstract
Undisturbed interaction is essential to provide immersive AR environments. There have been a lot of approaches to interact with VEs (virtual environments) so far, especially in hand metaphor. When the user's hands are being used for hand-based work such as maintenance and repair, necessity of alternative interaction technique has arisen. In recent research, hands-free gaze information is adopted to AR to perform original actions in concurrence with interaction [3, 4]. There has been little progress on that research, still at a pilot study in a laboratory setting. In this paper, we introduce such a simple WARS (wearable augmented reality system) equipped with an HMD, scene camera, eye tracker. We propose 'Aging' technique improving traditional dwell-time selection, demonstrate AR gallery – dynamic exhibition space with wearable system.
Download paper as PDF.
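The paper's "Aging" technique is the authors' own refinement of dwell-time selection, detailed in the PDF. Purely as a generic illustration of half-blink-confirmed selection (not their algorithm; the class, its names, and the dwell timing are invented), a short dwell could arm a candidate target and a detected half-blink confirm it:

```python
# Generic illustration of half-blink-confirmed gaze selection. This is
# NOT the paper's "Aging" algorithm; all names and timings are invented.
import time

class BlinkConfirmedSelector:
    """Arm a target after a short dwell; a half-blink event confirms it."""
    ARM_DWELL = 0.3  # seconds of fixation before arming (assumed value)

    def __init__(self, hit_test):
        self.hit_test = hit_test  # maps (x, y) to a target id, or None
        self.current = None       # target currently under the gaze
        self.since = 0.0          # time the gaze entered that target
        self.armed = None         # target eligible for confirmation

    def on_gaze(self, x: float, y: float) -> None:
        """Feed one gaze sample from the eye tracker."""
        target = self.hit_test(x, y)
        now = time.monotonic()
        if target != self.current:
            self.current, self.since, self.armed = target, now, None
        elif target is not None and now - self.since >= self.ARM_DWELL:
            self.armed = target

    def on_half_blink(self):
        """Called when the tracker reports a half-blink; returns the pick."""
        selected, self.armed = self.armed, None
        return selected  # None if nothing was armed
```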

Tuesday, November 11, 2008

Gaze vs. Mouse in Games: The Effects on User Experience (Gowases, Bednarik & Tukiainen)

Tersia Gowases, Roman Bednarik (blog) and Markku Tukiainen at the Department of Computer Science and Statistics, University of Joensuu, Finland had a paper published in the proceedings of the 16th International Conference on Computers in Education (ICCE).

"We did a simple questionnaire-based analysis. The results of the analysis show some promises for implementing gaze-augmented problem-solving interfaces. Users of gaze-augmented interaction felt more immersed than the users of other two modes - dwell-time based and computer mouse. Immersion, engagement, and user-experience in general are important aspects in educational interfaces; learners engage in completing the tasks and, for example, when facing a difficult task they do not give up that easily. We also did analysis of the strategies, and we will report on those soon. We could not attend the conference, but didn’t want to disappoint eventual audience. We thus decided to send a video instead of us. " (from Romans blog)

Abstract
"The possibilities of eye-tracking technologies in educational gaming are seemingly endless. The question we need to ask is what the effects of gaze-based interaction on user experience, strategy during learning and problem solving are. In this paper we evaluate the effects of two gaze based input techniques and mouse based interaction on user experience and immersion. In a between-subject study we found that although mouse interaction is the easiest and most natural way to interact during problemsolving, gaze-based interaction brings more subjective immersion. The findings provide a support for gaze interaction methods into computer-based educational environments." Download paper as PDF.


Some of this research has also been presented within the COGAIN association, see:
  • Gowases, T. (2007). Gaze vs. Mouse: An evaluation of user experience and planning in problem solving games. Master’s thesis, May 2, 2007. Department of Computer Science, University of Joensuu, Finland. Download as PDF.