Marie Barret, a master's student at the IT University of Copenhagen, has now finished her thesis. It evaluates eye typing performance using the ITU Gaze Tracker (a low-cost webcam eye tracker) in the StarGazer and GazeTalk interfaces. The thesis is written in Danish (113 pages), but I have taken the liberty of translating two charts from it, found below. The results will be presented in English at the COGAIN 2009 conference on May 26th (session three, track one, at 1:50PM). For now I quote the abstract:
"Innovation has facilitated sufficient mainstream technology to build eyetrackers from off-the-shelf-components. Prices for standard eyetrackers start at around € 4000. This thesis describes an experimental usabilty study of gazetyping with a new input device built from a standard web camera without hardware modifications. Cost: € 20. Mainstreaming of assistive technologies holds potential for faster innovation, better service, lower prices and increased accessibility. Off-the-shelf-eyetrackers must be usability competitive to standard eyetrackers in order to be adopted, as eyetracking - even with expensive hardware - presents usability issues. Usability is defined as effectiveness, efficiency and user satisfaction (ISO 9242-11, 1998).
Results from the 2 × 2 factor experiment indicate that the new input device can reach the usability standards of expensive eye trackers. This study demonstrates that the off-the-shelf eye tracker can achieve efficiency similar to an expensive eye tracker, with no significant effect from any of the tested factors. All four factors have a significant impact on effectiveness. A factor that can eliminate the effectiveness difference between the standard hardware and an expensive eye tracker is identified. Another factor can additionally improve effectiveness.
Two gaze typing systems specifically designed for noisy conditions, e.g. due to bad calibration and jolting, are tested. StarGazer uses a zooming interface and GazeTalk uses large buttons in a static graphical user interface. GazeTalk is significantly more effective than StarGazer. The large on-screen buttons and static interface of GazeTalk with dwell-time activation absorb the noise from the input device, and the typing speeds obtained are comparable to prior research with a regular eye tracker. Click activation has long been shown (Ware & Mikaelian, 1987) to improve the efficiency of gaze-based interaction. This experiment demonstrates that this result applies to off-the-shelf eye trackers as well. The input device relies on the user compensating for offset with head movements. The keyboards should support this task with a static graphical user interface." Download the thesis as pdf (in Danish).
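To make the dwell-time idea concrete, here is a minimal sketch of dwell activation over a coarse button grid, assuming a noisy gaze estimate as input. The grid size, dwell threshold and function names are illustrative assumptions, not values or code from the thesis:

```python
import time

DWELL_TIME = 0.8              # assumed dwell threshold in seconds, not from the thesis
GRID_COLS, GRID_ROWS = 3, 3   # assumed large-button layout for illustration

def hit_button(x, y, screen_w, screen_h):
    """Map a (noisy) gaze coordinate to one of the large grid buttons."""
    col = min(int(x / screen_w * GRID_COLS), GRID_COLS - 1)
    row = min(int(y / screen_h * GRID_ROWS), GRID_ROWS - 1)
    return row * GRID_COLS + col

class DwellSelector:
    """Activate a button only after the gaze has rested on it for a full dwell period."""
    def __init__(self, dwell=DWELL_TIME):
        self.dwell = dwell
        self.current = None       # button the gaze is currently on
        self.enter_time = None    # when the gaze entered that button

    def update(self, button, now=None):
        """Feed one gaze sample; return the button index when the dwell completes."""
        now = time.monotonic() if now is None else now
        if button != self.current:            # gaze moved to another button
            self.current, self.enter_time = button, now
            return None
        if now - self.enter_time >= self.dwell:
            self.enter_time = now             # re-arm so a held gaze repeats slowly
            return button
        return None
```

Because the buttons are large, small jitter in the gaze estimate rarely crosses a button boundary, which is the noise-absorbing effect the abstract describes.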
Tuesday, May 12, 2009
BBC News: The future of gadget interaction
Dan Simmons at the BBC reports on future technologies from the Science Beyond Fiction 2009 conference in Prague. The report includes a section on the GazeCom project, which won the 2nd prize for its exhibit "Gaze-contingent displays and interaction". The project website hosts additional demonstrations.
"Gaze tracking is well-established and has been used before now by online advertisers who use it to decide the best place to put an advert. A novel use of the system tracks someone's gaze and brings into focus the area of a video being watched by blurring their peripheral vision.In the future, the whole image could also be panned left or right as the gaze approaches the edge of the screen. Film producers are interested in using the system to direct viewers to particular parts within a movie. However, interacting with software through simply looking will require accurate but unobtrusive eye tracking systems that, so far, remain on the drawing board... The European Commission (EC) is planning to put more cash into such projects. In April it said it would increase its investment in this field from 100m to 170m euros (£89m-£152m) by 2013. " (BBC source ) More information about the EC CORDIS : ICT program.
In the BBC report, Dan Simmons tests a system designed to use a driver's peripheral vision to flag up potential dangers on the road. It was recorded at the Science Beyond Fiction conference in Prague.
The GazeCom project involves the following partners:
"Gaze tracking is well-established and has been used before now by online advertisers who use it to decide the best place to put an advert. A novel use of the system tracks someone's gaze and brings into focus the area of a video being watched by blurring their peripheral vision.In the future, the whole image could also be panned left or right as the gaze approaches the edge of the screen. Film producers are interested in using the system to direct viewers to particular parts within a movie. However, interacting with software through simply looking will require accurate but unobtrusive eye tracking systems that, so far, remain on the drawing board... The European Commission (EC) is planning to put more cash into such projects. In April it said it would increase its investment in this field from 100m to 170m euros (£89m-£152m) by 2013. " (BBC source ) More information about the EC CORDIS : ICT program.
External link. The BBC reported Dan Simmons tests a system designed to use a driver's peripheral vision to flag up potential dangers on the road. It was recorded at the Science Beyond Fiction conference in Prague.
The GazeCom project involves the following partners:
Labels: 3D, conference, inspiration, navigation, prototype
ETRA 2010 Call for papers
ETRA 2010 will be the sixth biennial symposium in a series that focuses on all aspects of eye movement research across a wide range of disciplines. The goal of ETRA is to bring together computer scientists, engineers and behavioral scientists in support of a common vision of enhancing eye tracking research and applications. ETRA 2010 is being organized in conjunction with the European Communication by Gaze Interaction (COGAIN) research network that specializes in gaze-based interaction for the benefit of people with physical disabilities.
Update: List of accepted and presented papers.
Symposium Themes
- Advances in Eye Tracking Technology and Data Analysis
Eye tracking systems, calibration algorithms, data analysis techniques, noise reduction, predictive models, 3D POR measurement, low cost and natural light systems.
- Visual Attention and Eye Movement Control
Studies of eye movements in response to natural stimuli, driving studies, web use and usability studies.
- Eye Tracking Applications
Gaze-contingent displays, attentive user interfaces, gaze-based interaction techniques, security systems, multimodal interfaces, augmented and mixed reality systems, ubiquitous computing.
- Special Theme: Eye Tracking and Accessibility
Eye tracking has proved to be an effective means of making computers more accessible when the use of keyboards and mice is hindered by the task itself (such as driving), or by physical disabilities. We invite submissions that explore new methodological strategies, applications, and results that use eye tracking in assistive technologies for access to desktop applications, for environment and mobility control, and for gaze control of games and entertainment.
Full papers must be submitted electronically through the ETRA 2010 website and conform to the ACM SIGGRAPH proceedings category 2 format. Full paper submissions can have a maximum length of eight pages and should be made in double-blind format, hiding the authors' names and affiliations and all references to the authors' previous work. Those wishing to submit a full paper must submit an abstract in advance to facilitate the reviewing process. Accepted papers will be published in the ETRA 2010 proceedings, and the authors will give a 20-minute oral presentation of the paper at the conference.
Short papers may present work that has a smaller scope than a full paper or may present late-breaking results. These must be submitted electronically through the ETRA 2010 submission website and conform to the ACM SIGGRAPH proceedings category 3 format. Short paper submissions have a maximum length of four pages (but can be as short as a one-page abstract). Given the time constraints of this type of paper, submissions must be made in camera-ready format, including authors' names and affiliations. Accepted submissions will be published in the ETRA 2010 proceedings. Authors will present a poster at the conference, and authors of the most highly rated submissions will give a 10-minute presentation of the paper in a Short Papers session. All submissions will be peer-reviewed by members of an international review panel and members of the program committee. Best Paper Awards will be given to the most highly ranked Full Papers and Short Papers.
Full Papers Deadlines
- Sep. 30th, 2009 Full Papers abstract submission deadline
- Oct. 7th, 2009 Full Papers submission deadline
- Nov. 13th, 2009 Acceptance notification
- Dec. 2nd, 2009 Short Papers submission deadline
- Jan. 8th, 2010 Short Papers acceptance notification
- Jan. 15th, 2010 All camera ready papers due
More information on the ETRA website.
Labels: conference
Thursday, May 7, 2009
Interactive Yarbus at MU, Netherlands
An interactive art exhibition by Christien Meindertsma in the Netherlands opens up for real-time generation of scanpaths, drawing images similar to the ones presented in Yarbus' classic paper. The main purpose is to illustrate individual differences in the way we look at objects (such as faces, umbrellas, cups, etc.). These images are then printed directly and become part of the exhibition. The exhibition runs until June 14th (location: Eindhoven).
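For readers who want to try something similar with their own gaze data, here is a small sketch that draws a Yarbus-style scanpath (fixation circles scaled by duration, connected by saccade lines) using matplotlib. The data format and scaling factor are assumptions for illustration, not part of the exhibition:

```python
import matplotlib.pyplot as plt

def draw_scanpath(fixations, image=None, ax=None):
    """Draw a scanpath from a list of (x, y, duration_seconds) fixations."""
    ax = ax if ax is not None else plt.gca()
    if image is not None:
        ax.imshow(image)                       # the stimulus, e.g. a face photograph
    xs, ys, durations = zip(*fixations)
    ax.plot(xs, ys, "-", linewidth=1, alpha=0.6)                     # saccade path
    ax.scatter(xs, ys, s=[d * 400 for d in durations], alpha=0.4)    # fixations, area ~ duration
    ax.set_axis_off()
    return ax

# Example with made-up fixations:
# draw_scanpath([(120, 80, 0.3), (300, 150, 0.6), (180, 320, 0.2)]); plt.show()
```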
Labels: inspiration
Wednesday, May 6, 2009
COGAIN 2009 Program announced
This year's Communication by Gaze Interaction conference is held on May 26th in Lyngby, Denmark, in connection with the VisionDay (a four-day event on computer vision). Registration for attending should be made on or before May 14th. Download the program as pdf.
Update: the proceedings can be downloaded as pdf.
The program for May 26th
- 08.00 Registration, exhibition, demonstrations, coffee, and rolls
- 09.00 Welcome and introduction (Lars Pallesen, Rector @ DTU)
- 09.10 Eye guidance in natural behaviour (B. W. Tatler)
- 09.50 Achievements and experiences in the course of COGAIN (K. Raiha)
- 10.30 Coffee, exhibition, demonstrations
- 11.00 Joys and sorrows in communicating with gaze (A. Lykke-Larsen)
- 11.30 An introduction to the 17 papers presented in the afternoon
- 12.00 Lunch, exhibition, demonstrations, posters
SESSION III Track 1
- 13.30 Eye gaze assessment with a person having complex needs (M. Buchholz & E. Holmqvist)
- 13.50 Performance Evaluation of a Low-Cost Gaze Tracker for Eye Typing (M. Barrett, H. Skovsgaard & J. S. Agustin)
- 14.10 Text Editing by Gaze: Static vs. Dynamic Menus (P. Majaranta, N. Majaranta, G. Daunys & O. Spakov)
- 14.30 Selecting with pie menus (M. H. Urbina, M. Lorenz & A. Huckauf)
SESSION III Track 2
- 13.30 Environmental Control Application compliant with Cogain Guidelines (R. Faisal, E. Castellina & F. Corno)
- 13.50 Home and environment control (P. Novák, P. Moc, O. Štepánková & L. Nováková)
- 14.10 A gaze- and cortex-contingent mouse cursor (M. Dorr, C. Rasche & E. Barth)
- 14.30 Optimizing the interoperability between a VOG and a EMG system (M. Ariz, J. Navallas, A. Villanueva, J. S. Agustin, R. Cabeza & M. Tall)
SESSION IV Track 1
- 15.30 Gameplay experience in a gaze interaction game (L. Nacke, S. Stellmach, D. Sasse & C. A. Lindley)
- 15.50 Select commands in 3D game environments by gaze gestures (S. Vickers, H. Istance & A. Hyrskykari)
- 16.10 GazeTrain: A case study of an action oriented gaze-controlled game (L. F. Laursen & B. Ersbøll)
- 16.30 Detecting Search and Rescue Targets in Moving Aerial Images using Eye-gaze (J. Mardell, M. Witkowski & R. Spence)
- 16.50 Feasibility Study for the use of Eye-Movements in Estimation of Answer Correctness (M. Nakayama & Y. Hayashi)
SESSION IV Track 2
- 15.30 Eye Tracker Connectivity (G. Daunys & V. Vysniauskas)
- 15.50 SW tool supporting customization of eye tracking algorithms (P. Novák & O. Štepánková)
- 16.10 Multimodal Gaze-Based Interaction (S. Trösterer & J. Dzaack)
- 16.30 Gaze Visualization Trends and Techniques (S. Stellmach, L. Nacke, R. Dachselt & C. A. Lindley)
Labels: cogain, conference
The Dias Eye Tracker (Mardanbeigi, 2009)
Diako Mardanbeigi at the Iran University of Science & Technology introduces the Dias Eye Tracking suite. It is a low-cost solution employing a head-mounted setup and comes with a rather extensive suite of applications. The software offers gaze control for playing games and music, viewing images, and text-to-speech using a dwell keyboard. It also offers basic eye movement recording and visualization, such as scanpaths. The software is built in Visual Basic 6 and implements various algorithms for eye tracking, including a rectangular method and RANSAC or least-squares (LSQ) ellipse/circle fitting. Additionally, there is support for tracking one or two glints. The following video demonstrates the hardware and software. Congratulations Diako on this great work!
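The Dias software itself is written in Visual Basic 6, but for readers wondering what LSQ ellipse fitting of the pupil looks like in practice, here is a generic sketch using OpenCV (assuming the OpenCV 4.x API); the threshold value and the assumption that the largest dark blob is the pupil are illustrative simplifications, not the Dias implementation:

```python
import cv2

def find_pupil_ellipse(gray_eye_image, threshold=40):
    """Least-squares ellipse fit to the pupil contour in a grayscale eye image."""
    _, binary = cv2.threshold(gray_eye_image, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)   # assume the largest dark blob is the pupil
    if len(pupil) < 5:                           # cv2.fitEllipse needs at least 5 points
        return None
    (cx, cy), (major, minor), angle = cv2.fitEllipse(pupil)
    return (cx, cy), (major, minor), angle       # centre, axes, rotation in degrees
```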
Labels: assistive technology, eye tracker, game, low cost
Tuesday, May 5, 2009
Gaze-Augmented Manual Interaction (Bieg, 2009)
Hans-Joachim Bieg of the HCI Group at the University of Konstanz has investigated gaze-augmented interaction on very large display areas. The prototype runs on the 221" Powerwall, uses a head-mounted setup, and allows users to select and zoom into an item of interest based on gaze position. An earlier video demonstration of the setup can be found here.
"This project will demonstrate a new approach to employing users’ gaze in the context of human-computer interaction. This new approach uses gaze passively in order to improve the speed and precision of manually controlled pointing techniques. Designing such gaze augmented manual techniques requires an understanding of the principles that govern the coordination of hand and eye. This coordination is influenced by situational parameters (task complexity, input device used, etc.), which this project will explore in controlled experiments."
Gaze-augmented interaction on the 221" PowerWall
- Bieg, H. 2009. Gaze-augmented manual interaction. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 3121-3124. DOI= http://doi.acm.org/10.1145/1520340.1520442
Labels: hci, inspiration, navigation, prototype
Sunday, May 3, 2009
Laval VRchive @ Tokyo Metropolitan University
Hidenori Watanave at Tokyo Metropolitan University has released a brief video demonstrating gaze interaction for the Laval VRchive. The VRchive is a virtual reality environment for navigating media content. The setup uses a standalone Tobii 1750 tracker and a projector. The simple interface allows gaze control by looking at the top, bottom, left, or right areas of the display, as well as winking to perform clicks. Although this is an early version, the initial experiments were successful; the software is still unstable, however, and needs further improvement.
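Judging from the video, the mapping is essentially region based: gaze near a screen edge pans the view and a wink clicks. A minimal sketch of that idea is given below; the 20% edge zone and the command names are assumptions, not taken from the VRchive software:

```python
EDGE = 0.2   # assumed: outer 20% of the screen acts as a navigation zone

def gaze_to_command(x, y, screen_w, screen_h, wink=False):
    """Translate a gaze point (and wink state) into a navigation command."""
    if wink:
        return "click"
    nx, ny = x / screen_w, y / screen_h
    if ny < EDGE:
        return "pan_up"
    if ny > 1.0 - EDGE:
        return "pan_down"
    if nx < EDGE:
        return "pan_left"
    if nx > 1.0 - EDGE:
        return "pan_right"
    return None   # gaze in the centre: no navigation
```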
Labels: 3D, inspiration, navigation, prototype
Friday, May 1, 2009
Low-Cost Gaze Pointing and EMG Clicking
"Some severely disabled people are excluded from using gaze interaction because gaze trackers are usually expensive (above $10.000). In this paper we present a low-cost gaze pointer, which we have tested in combination with a desktop monitor and a wearable display. It is not as accurate as commercial gaze trackers, and walking while pointing with gaze on a wearable display turned out to be particularly difficult. However, in front of a desktop monitor it is precise enough to support communication. Supplemented with a commercial EMG switch it offers a complete hands-free, gaze-and-click control for less than $200."
- San Agustin, J., Hansen, J. P., Hansen, D. W., and Skovsgaard, H. 2009. Low-cost gaze pointing and EMG clicking. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 3247-3252. DOI= http://doi.acm.org/10.1145/1520340.1520466
Gaze Controlled Driving
This is the paper on using eye trackers for remote robot navigation that I had accepted for the CHI 2009 conference. It has now appeared on the ACM website. Note that the webcam tracker referred to in the paper is an earlier incarnation of the ITU Gaze Tracker. The main issue while using it is that head movements affect the gaze position and create an offset. This is easier to correct and counterbalance on a static background than on a moving image (while driving!).
Abstract
"We investigate if the gaze (point of regard) can control a remote vehicle driving on a racing track. Five different input devices (on-screen buttons, mouse-pointing low-cost webcam eye tracker and two commercial eye tracking systems) provide heading and speed control on the scene view transmitted from the moving robot. Gaze control was found to be similar to mouse control. This suggests that robots and wheelchairs may be controlled ―hands-free‖ through gaze. Low precision gaze tracking and image transmission delays had noticeable effect on performance."
- Tall, M., Alapetite, A., San Agustin, J., Skovsgaard, H. H., Hansen, J. P., Hansen, D. W., and Møllenbach, E. 2009. Gaze-controlled driving. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4387-4392. DOI= http://doi.acm.org/10.1145/1520340.1520671
Labels: gazetracker, ITU, navigation, robot