
Thursday, February 3, 2011

EyeTech Digital Systems

Arizona-based EyeTech Digital Systems offers several interesting eye trackers, and the new V1 caught my attention with its extended track box of 25 x 18 x 50 cm. The unusually large depth range is provided by a custom autofocus mechanism developed in cooperation with the Brigham Young University Department of Mechanical Engineering. This makes the device particularly suitable for larger displays such as public displays and digital signage, although I'd imagine the calibration procedure still remains; ideally you'd want to walk up and interact, with data collected automatically, without any wizards or intervention. In any case, a larger track box is always welcome and certainly opens up new opportunities; EyeTech's V1 offers about 20 cm more depth than most.





Thursday, January 13, 2011

Taiwanese Utechzone launches the Spring gaze interaction system

Utechzone, a Taiwanese company, has launched the Spring gaze interaction system for individuals with ALS or similar conditions. It provides the basic functionality, including text entry, email, web browsing, and media playback, in a format reminiscent of the MyTobii software. With accessories, the tracker can be mounted in various ways, including on wheelchairs and desks. A nice feature is the built-in TV tuner, which is accessible through the gaze interface. The performance of the actual tracking system and the accuracy of its gaze estimation are unknown; it is only specified to a 7 x 4 grid. The track box is specified as 17 x 10 x 15 cm with a working range of 55-70 cm.

The system runs Windows XP on a computer equipped with an Intel dual-core CPU, 2 GB RAM, and a 500 GB hard drive, combined with a 17" monitor.
Supported languages are Traditional Chinese, Simplified Chinese, English and Japanese, all with pretty big markets. Price unknown, but probably less than a Tobii. Get the product brochure (pdf).



Tuesday, June 15, 2010

Speech Dasher: Fast Writing using Speech and Gaze (K. Vertanen & D. MacKay, 2010)

A new version of the Dasher typing interface utilizes speech recognition provided by the CMU PocketSphinx software and doubles typing performance measured in words per minute, from a previous 20 WPM to 40 WPM, close to what a professional keyboard jockey may produce.



Abstract
Speech Dasher allows writing using a combination of speech and a zooming interface. Users first speak what they want to write and then they navigate through the space of recognition hypotheses to correct any errors. Speech Dasher's model combines information from a speech recognizer, from the user, and from a letter-based language model. This allows fast writing of anything predicted by the recognizer while also providing seamless fallback to letter-by-letter spelling for words not in the recognizer's predictions. In a formative user study, expert users wrote at 40 (corrected) words per minute. They did this despite a recognition word error rate of 22%. Furthermore, they did this using only speech and the direction of their gaze (obtained via an eye tracker).

  
  • Speech Dasher: Fast Writing using Speech and Gaze
    Keith Vertanen and David J.C. MacKay. CHI '10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, To appear. [Abstract+videos, PDF, BibTeX]
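The model combination the abstract describes, trusting the recognizer's hypotheses while keeping a letter-by-letter escape hatch, can be roughly sketched as follows. This is only an illustrative toy, not the authors' model: the hypothesis list, the interpolation weight and the uniform letter fallback are all assumptions.

```python
# Minimal sketch of combining recognizer hypotheses with a letter-level
# fallback, loosely inspired by the Speech Dasher idea. The hypothesis
# list, weights and uniform fallback are illustrative assumptions only.
from collections import defaultdict
import string

def next_letter_probs(prefix, hypotheses, fallback_weight=0.1):
    """Estimate P(next letter | prefix) from weighted recognizer hypotheses,
    falling back to a uniform letter model so any word stays reachable."""
    scores = defaultdict(float)
    total = 0.0
    for text, prob in hypotheses:                 # hypotheses: [(string, prob), ...]
        if text.startswith(prefix) and len(text) > len(prefix):
            scores[text[len(prefix)]] += prob     # credit the letter each hypothesis predicts next
            total += prob

    alphabet = string.ascii_lowercase + " "
    uniform = 1.0 / len(alphabet)
    combined = {}
    for ch in alphabet:
        hyp_part = (scores[ch] / total) if total > 0 else 0.0
        # Interpolate: mostly trust the recognizer, keep a floor for spelling fallback.
        combined[ch] = (1 - fallback_weight) * hyp_part + fallback_weight * uniform
    return combined

if __name__ == "__main__":
    hyps = [("speech dasher", 0.6), ("speech washer", 0.3), ("peach dasher", 0.1)]
    probs = next_letter_probs("speech ", hyps)
    print(sorted(probs.items(), key=lambda kv: -kv[1])[:3])  # 'd' and 'w' dominate
```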

Friday, September 18, 2009

The EyeWriter project

For some time I've been following the EyeWriter project, which aims at enabling Tony, who has ALS, to draw graffiti using eye gaze alone. The open-source eye tracker is available on Google Code and is based on C++, openFrameworks and OpenCV. The current version supports basic pupil tracking based on image thresholding and blob detection, but they are aiming for remote tracking using IR glints. Keep up the great work guys!
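For readers wondering what "image thresholding and blob detection" amounts to in practice, here's a rough Python/OpenCV sketch of that style of pupil tracking. It is not the EyeWriter code (which is C++/openFrameworks), and the threshold and blob-size limits are made-up starting values you would tune per camera.

```python
# Rough sketch of pupil tracking via thresholding + blob detection,
# in the spirit of the EyeWriter approach (the real project is C++ /
# openFrameworks). Threshold and area limits are made-up starting points.
import cv2

def find_pupil(gray_eye_image, threshold=40, min_area=100, max_area=5000):
    """Return the (x, y) centre of the largest dark blob of plausible size, or None."""
    blurred = cv2.GaussianBlur(gray_eye_image, (7, 7), 0)
    # The pupil is the darkest region: keep pixels below the threshold.
    _, mask = cv2.threshold(blurred, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    best, best_area = None, 0.0
    for c in contours:
        area = cv2.contourArea(c)
        if min_area <= area <= max_area and area > best_area:
            best, best_area = c, area
    if best is None:
        return None
    m = cv2.moments(best)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # blob centroid
```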

The Eyewriter from Evan Roth on Vimeo.

eyewriter tracking software walkthrough from thesystemis on Vimeo.

More information is found at http://fffff.at/eyewriter/

Tuesday, August 18, 2009

COGAIN Student Competition Results

Lasse Farnung Laursen, a Ph.D. student with the Department of Informatics and Mathematical Modeling at the Technical University of Denmark, won this year's COGAIN student competition with his leisure application GazeTrain.

"GazeTrain (illustrated in the screenshot below) is an action oriented puzzle game, that can be controlled by eye movements. In GazeTrain you must guide a train by placing track tiles in front of it. As you guide the train, you must collect various cargo and drop them off at the nearest city thereby earning money. For further details regarding how to play the game, we encourage you to read the tutorial accessible from the main menu. The game is quite customizable as the dwell time and several other parameters can be adjusted to best suit your play-style." (Source)

The GazeTrain game.

The runners-up, sharing second place, were:

Music Editor, developed by Ainhoa Yera Gil, Public University of Navarre, Spain. Music Editor is a gaze-operated application that allows the user to compose, edit and play music by eye movements. The reviewers appreciated that "a user can not only play but can actually create something" and that "Music Editor is well suited for gaze control".

Gaze Based Sudoku, developed by Juha Hjelm and Mari Pesonen, University of Tampere, Finland. The game can be operated by eye movements and it has three difficulty levels. Reviewers especially appreciated how "the separation between viewing and controlling and between sudoku grid and number selection panel is solved" and that the game "has no time constraints" so it is "relaxing" to play.

Tuesday, August 11, 2009

ALS Society of British Columbia announces Engineering Design Awards (Canadian students only)

"The ALS Society of British Columbia has established three Awards to encourage and recognize innovation in technology to substantially improve the quality of life of people living with ALS (Amyotrophic Lateral Sclerosis, also known as Lou Gehrig’s Disease). Students at the undergraduate or graduate level in engineering or a related discipline at a post-secondary institution in British Columbia or elsewhere in Canada are eligible for the Awards. Students may be considered individually or as a team. Mentor Awards may also be given to faculty supervising students who win awards" (see Announcement)


Project ideas:
  • Low-cost eye tracker
    • Issue: Current commercial eye-gaze tracking systems cost thousands to tens of thousands of dollars. The high cost of eye-gaze trackers prevents potential users from accessing eye-gaze tracking tools. The hardware components required for eye-gaze tracking do not justify the price, and a lower-cost alternative is desirable. Webcams may be used for low-cost imaging, along with simple infrared diodes for system lighting. Alternatively, visible-light systems may also be investigated. Open-source eye-gaze tracking software is also available. (ed: ITU GazeTracker, openEyes, TrackEye, OpenGazer and MyEye (free, no source))
    • Goal: The goal of this design project is to develop a low-cost and usable eye-gaze tracking system based on simple commercial off-the-shelf hardware.
    • Deliverables: A working prototype of a functional, low-cost (< $200), eye-gaze tracking system.
  • Eye-glasses compensation
    • Issue: The use of eye-glasses can cause considerable problems in eye-gaze tracking. The issue stems from reflections off the eye-glasses due to the use of controlled infrared lighting (on and off axis light sources) used to highlight features of the face. The key features of interest are the pupils and glints (or reflections of the surface of the cornea). Incorrectly identifying the pupils and glints then results in invalid estimation of the point-of-gaze.
    • Goal: The goal of this design project is to develop techniques for either: 1) avoiding image corruption with eye-glasses on a commercial eye-gaze tracker, or 2) developing a controlled lighting scheme to ensure valid identification of the pupil and glints in the presence of eye-glasses.
    • Deliverables: Two forms of deliverables are possible: 1) A working prototype illustrating functional eye-gaze tracking in the presence of eye-glasses with a commercial eye-gaze tracker, or 2) A working prototype illustrating accurate real-time identification of the pupil and glints using controlled infrared lighting (on and off axis light sources) in the presence of eye-glasses.
  • Innovative selection with ALS and eye gaze
    • Issue: As mobility steadily decreases in the more advanced stages of ALS, alternative techniques for selection are required. Current solutions include head switches, sip-and-puff switches and dwell-time activation, to name a few, depending on the degree of mobility loss. The use of dwell time requires no mobility other than eye motion; however, this technique suffers from 'lag', in that the user must wait the dwell-time duration for each selection, as well as from the 'Midas touch' problem, in which unintended selections occur if the gaze point is stationary for too long. (ed: a minimal dwell-time sketch follows at the end of this post)
    • Goal: The goal of this design project is to develop a technique for improved selection with eye-gaze for individuals with only eye-motion available. Possible solutions may involve novel HCI designs for interaction, including various adaptive and predictive technologies, the consideration of contextual cues, and the introduction of ancillary inputs, such as EMG, EEG.
    • Deliverables: A working prototype illustrating eye-motion only selection with a commercial eye-gaze tracking system.
  • Novel and valuable eye-gaze tracking applications and application enhancements
    • Issue: To date, relatively few gaze-tracking applications have been developed. These include relatively simplistic applications such as the tedious typing of words, and even in such systems, little is done to ease the effort required, e.g., systems typically do not allow for the saving and reuse of words and sentences.
    • Goal: The goal of this design project is to develop one or more novel applications or application enhancements that take gaze as input, and that provide new efficiencies or capabilities that could significantly improve the quality of life of those living with ALS.
    • Deliverables: A working prototype illustrating one or more novel applications that take eye-motion as an input. The prototype must be developed and implemented to the extent that an evaluation of the potential efficiencies and/or reductions in effort can be evaluated by persons living with ALS and others on an evaluation panel.

    See the Project Ideas for more information. For contact information see page two of the announcement.
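As flagged in the selection-technique idea above, dwell activation trades 'lag' against the 'Midas touch'. A minimal dwell selector sketch makes that trade-off concrete; the 500 ms threshold and 40-pixel fixation radius are arbitrary illustrative values, not taken from the announcement.

```python
# Minimal dwell-time selection sketch: a selection only fires after the
# gaze stays within a small radius for the full dwell duration (the "lag"),
# and any sufficiently long fixation triggers it (the "Midas touch").
# Threshold and radius are illustrative values only.
import math
import time

class DwellSelector:
    def __init__(self, dwell_seconds=0.5, radius_px=40):
        self.dwell_seconds = dwell_seconds
        self.radius_px = radius_px
        self._anchor = None        # (x, y) where the current fixation started
        self._start = None         # timestamp when it started

    def update(self, x, y, now=None):
        """Feed gaze samples; returns the fixation point when the dwell completes."""
        now = time.monotonic() if now is None else now
        if self._anchor is None or math.dist((x, y), self._anchor) > self.radius_px:
            self._anchor, self._start = (x, y), now   # gaze moved: restart the timer
            return None
        if now - self._start >= self.dwell_seconds:
            selected, self._anchor, self._start = self._anchor, None, None
            return selected                            # dwell complete: select
        return None
```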

Thursday, August 6, 2009

Päivi Majaranta PhD Thesis on Text Entry by Eye Gaze

The most complete publication on gaze typing is now available, as Päivi Majaranta at the University of Tampere has successfully defended her PhD thesis. It summarizes previous work and discusses and exemplifies important topics such as word prediction, layout, feedback and user aspects. The material is presented in a straightforward manner with a clear structure and excellent illustrations. It will without doubt be useful for anyone who is about to design and develop a gaze-based text entry interface. Congratulations, Päivi, on such a well-written thesis.



Monday, June 29, 2009

Video from COGAIN2009

John Paulin Hansen has posted a video showing some highlights from the annual COGAIN conference. It demonstrates three available gaze interaction solutions: the COGAIN GazeTalk interface, Tobii Technology's MyTobii and Alea Technologies' IG-30. These interfaces rely on dwell-activated on-screen keyboards (i.e. same procedure as last year).


Monday, June 1, 2009

COGAIN 2009 Proceedings now online

There is little reason to doubt the vitality of the COGAIN network. This year's proceedings present an impressive 18 papers spread out over one hundred pages, covering a wide range of areas from low-cost eye tracking, text entry, gaze input for gaming and multimodal interaction to environmental control, clinical assessments and case studies. Unfortunately I was unable to attend the event this year (recently relocated), but with the hefty proceedings available online there is plenty of material to read through (program and links to authors here). Thanks to Arantxa Villanueva, John Paulin Hansen and Bjarne Kjaer Ersboll for the editorial effort.

Wednesday, May 13, 2009

Hi-fi eyetracking with a lo-fi eyetracker: An experimental usability study of an eyetracker built from a standard web camera (Barret, M., 2009)

Marie Barret, a master's student at the IT University of Copenhagen, has now finished her thesis. It evaluates eye typing performance using the ITU Gaze Tracker (a low-cost webcam eye tracker) in the StarGazer and GazeTalk interfaces. The thesis is written in Danish (113 pages), but I took the liberty of translating two charts from it, found below. The results will be presented in English at the COGAIN 2009 conference on May 26th (session three, track one, at 1:50 PM). For now I quote the abstract:

"Innovation has facilitated sufficient mainstream technology to build eyetrackers from off-the-shelf-components. Prices for standard eyetrackers start at around € 4000. This thesis describes an experimental usabilty study of gazetyping with a new input device built from a standard web camera without hardware modifications. Cost: € 20. Mainstreaming of assistive technologies holds potential for faster innovation, better service, lower prices and increased accessibility. Off-the-shelf-eyetrackers must be usability competitive to standard eyetrackers in order to be adopted, as eyetracking - even with expensive hardware - presents usability issues. Usability is defined as effectiveness, efficiency and user satisfaction (ISO 9242-11, 1998).

Results from the 2 x 2 factor experiment significantly indicate how the new input device can reach the usability standards of expensive eyetrackers. This study demonstrates that the off-the-shelf eyetracker can achieve efficiency similar to an expensive eyetracker, with no significant effect from any of the tested factors. All four factors have a significant impact on effectiveness. A factor that can eliminate the effectiveness difference between the standard hardware and an expensive eyetracker is identified. Another factor can additionally improve effectiveness.

Two gazetyping systems specifically designed for noisy conditions, e.g. due to bad calibration and jolting, are tested. StarGazer uses a zooming interface and GazeTalk uses large buttons in a static graphical user interface. GazeTalk is significantly more effective than StarGazer. The large on-screen buttons and static interface of GazeTalk with dwell-time activation absorb the noise from the input device, and the typing speeds obtained are comparable to prior research with a regular eyetracker. Click activation has for years (Ware & Mikaelian, 1987) been shown to improve the efficiency of gaze-based interaction. This experiment demonstrates that this result significantly applies to off-the-shelf eyetrackers as well. The input device relies on the user to compensate for offset with head movements. The keyboards should support this task with a static graphical user interface." Download the thesis as pdf (in Danish)

Wednesday, May 6, 2009

The Dias Eye Tracker (Mardanbeigi, 2009)

Diako Mardanbeigi at the Iran University of Science & Technology introduces the Dias eye tracking suite. It is a low-cost solution employing a head-mounted setup and comes with a rather extensive suite of applications. The software offers gaze control for playing games and music, viewing images, and text-to-speech using a dwell keyboard. It also offers basic eye movement recording and visualization, such as scanpaths. The software is built using Visual Basic 6 and implements various algorithms for eye tracking, including a rectangular method and RANSAC or least-squares ellipse/circle fitting. Additionally, there is support for tracking one or two glints. The following video demonstrates the hardware and software. Congratulations, Diako, on this great work!
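To illustrate the kind of robust fitting step mentioned above, here's a small RANSAC circle fit over 2-D contour points. It is not Diako's implementation (the Dias suite is written in Visual Basic 6); the iteration count and inlier tolerance are assumptions.

```python
# Illustrative RANSAC circle fit over 2-D edge points, sketching the kind
# of robust pupil-contour fitting mentioned above. Iteration count and
# tolerance are assumptions, not values from the Dias tracker.
import numpy as np

def circle_from_3_points(p1, p2, p3):
    """Centre and radius of the circle through three points (None if collinear)."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-9:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay) + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx) + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), float(np.hypot(ax - ux, ay - uy))

def ransac_circle(points, iters=200, tol=2.0, rng=None):
    """Fit a circle robustly to noisy edge points; returns ((cx, cy), radius)."""
    rng = np.random.default_rng() if rng is None else rng
    pts = np.asarray(points, dtype=float)
    best, best_inliers = None, -1
    for _ in range(iters):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        fit = circle_from_3_points(*sample)
        if fit is None:
            continue                                  # degenerate (collinear) sample
        (cx, cy), r = fit
        dists = np.hypot(pts[:, 0] - cx, pts[:, 1] - cy)
        inliers = int(np.sum(np.abs(dists - r) < tol))  # points close to the circle
        if inliers > best_inliers:
            best, best_inliers = fit, inliers
    return best
```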


Monday, November 24, 2008

Our gaze controlled robot on the DR News

The Danish national television news program "TV-Avisen" featured our gaze-controlled robot; the episode was broadcast on Friday, November 22nd, in the nine o'clock news. Alternative versions (resolutions) of the video clip can be found at the DR site.








View video

Tuesday, June 3, 2008

Eye typing at the Bauhaus University of Weimar

The Psychophysiology and Perception group, part of the Faculty of Media at the Bauhaus University of Weimar, is conducting research on gaze-based text entry. Their past research projects include the QWERTY on-screen dwell-based keyboard, IWrite, pEYEWrite and StarWrite. Thanks to Mario Urbina for the notification.

QWERTY
"Qwerty is based on dwell time selection. Here the user has to stare for 500 ms a determinate character to select it. QWERTY served us, as comparison base line for the new eye typing systems. It was implemented in C++ using QT libraries."







IWrite
"A simple way to perform a selection based on saccadic movement is to select an item by looking at it and confirm its selection by gazing towards a defined place or item. Iwrite is based on screen buttons. We implemented an outer frame as screen button. That is to say, characters are selected by gazing towards the outer frame of the application. This lets the text window in the middle of the screen for comfortable and safe text review. The order of the characters, parallel to the display borders, should reduce errors like the unintentional selection of items situated in the path as one moves across to the screen button.The strength of this interface lies on its simplicity of use. Additionally, it takes full advantage of the velocity of short saccade selection. Number and symbol entry mode was implemented for this editor in the lower frame. Iwrite was implemented in C++ using QT libraries."
pEYEWrite
"Pie menus have already been shown to be powerful menus for mouse or stylus control. They are two-dimensional, circular menus, containing menu items displayed as pie-formed slices. Finding a trade-off between user interfaces for novice and expert users is one of the main challenges in the design of an interface, especially in gaze control, as it is less conventional and utilized than input controlled by hand. One of the main advantages of pie menus is that interaction is very easy to learn. A pie menu presents items always in the same position, so users can match predetermined gestures with their corresponding actions. We therefore decided to transfer pie menus to gaze control and try it out for an eye typing approach. We designed the Pie menu for six items and two depth layers. With this configuration we can present (6 x 6) 36 items. The first layer contains groups of five letters ordered in pie slices.."

StarWrite
"In StarWrite, selection is also based on saccadic movements to avoid dwell times. The idea of StarWrite is to combine eye typing movements with feedback. Users, mostly novices, tend to look at the text field after each selection to check what has been written. Here letters are typed by dragging them into the text field. This provides instantaneous visual feedback and should spare checking saccades towards the text field. When a character is fixated, both it and its neighbors are highlighted and enlarged in order to facilitate character selection. In order to use x- and y-coordinates for target selection, letters are arranged alphabetically on a half-circle in the upper part of the monitor. The text window appears in the lower field. StarWrite provides lower-case, upper-case, and numerical entry modes, which can be switched by fixating for 500 milliseconds on the corresponding buttons situated in the lower part of the application. The space, delete and enter keys are also placed there and are likewise driven by a 500 ms dwell time. StarWrite was implemented in C++ using OpenGL libraries for the visualization."

Associated publications
  • Huckauf, A. and Urbina, M. H. 2008. Gazing with pEYEs: towards a universal input for various applications. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (Savannah, Georgia, March 26 - 28, 2008). ETRA '08. ACM, New York, NY, 51-54. [URL] [PDF] [BIB]
  • Urbina, M. H. and Huckauf, A. 2007. Dwell time free eye typing approaches. In Proceedings of the 3rd Conference on Communication by Gaze Interaction - COGAIN 2007, September 2007, Leicester, UK, 65--70. Available online at http://www.cogain.org/cogain2007/COGAIN2007Proceedings.pdf [PDF] [BIB]
  • Huckauf, A. and Urbina, M. 2007. Gazing with pEYE: new concepts in eye typing. In Proceedings of the 4th Symposium on Applied Perception in Graphics and Visualization (Tubingen, Germany, July 25 - 27, 2007). APGV '07, vol. 253. ACM, New York, NY, 141-141. [URL] [PDF] [BIB]
  • Urbina, M. H. and Huckauf, A. 2007. pEYEdit: Gaze-based text entry via pie menus. In Conference Abstracts. 14th European Conference on Eye Movements ECEM2007. Kliegl, R. & Brenstein, R. (Eds.) (2007), 165-165.

Tuesday, May 6, 2008

Gaze interaction hits mainstream news

The New Scientist technology section posted an article on Stephen Vickers' work at De Montfort University on the eye-controlled version of World of Warcraft, which I wrote about two months ago (see post).
Update: The New Scientist post caused rather extensive discussion on Slashdot, with more than 140 entries.

Great to see mainstream interest in gaze-driven interaction. Gaming is truly one area with huge potential, but it also depends on more accessible eye trackers. There is a movement towards open-source eye tracking, but the robustness needed for everyday usage still remains at large. The system Stephen Vickers has developed uses the Tobii X120 eye tracker, which is clearly out of reach for all but the small group of users who are granted financial support for their much-needed assistive technology.

Have faith
In general, all new technology initially comes at a high cost due to intensive research and development, but over time it becomes accessible to the larger population. As an example, a decade or two ago not many could imagine that satellite GPS navigation would become commonplace and really cheap. Today, mass collaboration on the net is really happening, making the rate of technology development exponential. Make sure to watch the Google TechTalk by Don Tapscott on Wikinomics.

Tuesday, April 15, 2008

Alea Technologies

The German firm Alea Technologies offers a solution called IntelliGaze, consisting of a remote eye tracking system and a software suite. The system is designed to be flexible in terms of both hardware and software. The eye tracker is stand-alone and can be combined with various displays to create a customizable setup. Additionally, they provide APIs for application development.

A clear use case for their technology is users with disabilities such as ALS. The software contains a "desktop" system which acts as a launcher for other applications (native Windows applications, Grid, COGAIN tools). In general, they seem to target Tobii Technology, which has been very successful with its MyTobii application running on the P10 eye tracker. The game is on.


Quote:
"The ease of setup and intuitive operation bring a completely new degree of freedom to patients who had to rely on older technologies like manual scanning or cumbersome pointing devices. By using the latest camera technology and very sophisticated image processing and calibration methods, the IG-30 system is far superior to alternative gaze input systems at a similar price point. The modular structure of the system allows the patients with a degenerative disease the continued use of their touch-screen or scanning software package. Using existing computers and monitors, the IG-30 system can also be easily integrated into an existing PC setup. The economic pricing of the IntelliGazeTM systems opens this high-tech solution to a much wider range of users."

Specifications (IG-30):
  • Tracking technology: hybrid infrared video eye and head tracking; binocular and monocular tracking
  • Working volume: 300 x 200 x 200 mm (W x H x D), centered at 600 mm distance
  • Accuracy, static: 0.5° typical
  • Accuracy, over full working volume: 1° typical
  • Sampling rate: 50 Hz
  • Max. head-movement velocity: 15 cm/s
  • Recovery time after tracking loss (head moved too fast or out of range): 40 ms
  • System dimensions: ca. 300 x 45 x 80 mm (W x H x D)
  • Mounting options: on monitor via VESA adapter; on Tablet PC via customized interfaces
  • System weight: ca. 1.2 kg

Thursday, March 27, 2008

RApid GAze-Based Interaction Techniques (RAGABITS)


Stephen Vickers at the Computer Human Interaction Research Group at De Montfort University, UK, has developed interaction techniques that allow gaze-based control of several popular online virtual worlds such as World of Warcraft and Second Life. This research will be presented at ETRA 2008 (US) under the title RAGABITS (RApid GAze-Based Interaction Techniques) and is especially intended for users with severe motor impairments.

The selection method seems stable; none of the usual jitter can be seen. Nice!




Quote from http://www.ioct.dmu.ac.uk/projects/eyegaze.html

"Online virtual worlds and games (MMORPG's) have much to offer users with severe motor disabilities. It gives this user group the opportunity as entirely able-bodied to others in the virtual world. if they so wish. The extent to which a user has to reveal their disability becomes a privacy issue. Many of the avatars in Second Life appear as stylized versions of the users that control them and that stylization is the choice of the user. This choice is equally appropriate for disabled users. While the appearance of the user's avatar may not reveal the disability of the person that controls it, the behavior and speed or interaction in the world may do.

Many users with severe motor impairments may not be able to operate a keyboard or hand mouse and may also struggle with speech and head movement. Eye gaze is one method of interaction that has been used successfully in enabling access to desktop environments. However, simply emulating a mouse using eye gaze is not sufficient for interaction in online virtual worlds, and the user's privacy can be exposed unless efficient gaze-based interaction techniques, appropriate to activities in online worlds and games, can be provided.

This genre of gaming (MMORPGs) is constantly evolving, and regardless of the aim of the game they all involve common tasks such as avatar creation, social interaction (chatting, IM), interaction with in-world objects (pick up, open, shoot, etc.), and navigating and walking around the environment. Our research involves analyzing these common tasks so that suitable gaze-based interaction techniques to support them can be used in place of a mouse and keyboard. These will have different performance/effort trade-offs, and will include extended mouse/joystick emulation, gaze gestures, toolglasses and gaze-aware in-world objects. These techniques need to be integrated into a coherent and efficient user interface suited to the needs of an individual user with a particular disability. The research aims to model tasks inherent in using these worlds so that predictions can be made about the most appropriate gaze-based interaction techniques to use. When these have been identified, they can be assembled into a front end or user interface. One possible outcome could be a software device for automatic configuration of a gaze-control interface for new games, which could use knowledge of a specific user's disability and the eye tracking equipment that they have."
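As a side note on the "extended mouse emulation" mentioned above: even basic gaze-to-cursor mapping needs the raw samples smoothed to tame jitter. Below is a minimal exponential-smoothing sketch; it is not the RAGABITS implementation, and the smoothing factor is an arbitrary choice.

```python
# Minimal sketch of smoothing raw gaze samples before using them as a
# cursor position, the usual first step when emulating a mouse with gaze.
# Not the RAGABITS implementation; the smoothing factor is an assumption.
class GazeSmoother:
    def __init__(self, alpha=0.2):
        self.alpha = alpha          # 0 < alpha <= 1; smaller = smoother but laggier
        self._x = None
        self._y = None

    def update(self, raw_x, raw_y):
        """Exponential moving average of the gaze position."""
        if self._x is None:
            self._x, self._y = raw_x, raw_y          # first sample: no history yet
        else:
            self._x += self.alpha * (raw_x - self._x)
            self._y += self.alpha * (raw_y - self._y)
        return self._x, self._y
```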

Wednesday, February 20, 2008

Inspiration: ZoomNavigator (Skovsgaard, 2008)

Following up on the StarGazer text entry interface presented in my previous post, another approach to using zooming interfaces is employed in ZoomNavigator (Skovsgaard, 2008). It addresses the well-known issues of using gaze as input on traditional desktop systems, namely inaccuracy and jitter. It is an interesting solution that uses zooming for activation rather than dwell time, in contrast to the EyePoint system (Kumar & Winograd, 2007) described in the next post.

Abstract
The goal of this research is to estimate the maximum amount of noise of a pointing device that still makes interaction with a Windows interface possible. This work proposes zoom as an alternative activation method to the more well-known interaction methods (dwell and two-step-dwell activation). We present a magnifier called ZoomNavigator that uses the zoom principle to interact with an interface. Selection by zooming was tested with white noise in a range of 0 to 160 pixels in radius on an eye tracker and a standard mouse. The mouse was found to be more accurate than the eye tracker. The zoom principle applied allowed successful interaction with the smallest targets found in the Windows environment even with noise up to about 80 pixels in radius. The work suggests that the zoom interaction gives the user a possibility to make corrective movement during activation time eliminating the waiting time found in all types of dwell activations. Furthermore zooming can be a promising way to compensate for inaccuracies on low-resolution eye trackers or for instance if people have problems controlling the mouse due to hand tremors.


The sequence of images shows screenshots from ZoomNavigator zooming towards a Windows file called ZoomNavigator.exe.

The principles of ZoomNavigator are shown in the figure above. Zooming is used to focus on the attended object and eventually make a selection (an unambiguous action); ZoomNavigator thereby allows actions similar to those of a conventional mouse (Skovsgaard, 2008). The system is described in a conference paper titled "Estimating acceptable noise-levels on gaze and mouse selection by zooming". Download paper (pdf)
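The zoom principle itself can be sketched in a few lines: keep magnifying the region around the current gaze point, so small corrective eye movements refine the target until it is unambiguous. The zoom rate and the stopping size below are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch of the zoom-to-select principle described above:
# each update magnifies the region around the gaze point, so the user's
# corrective movements progressively disambiguate the target. The zoom
# rate and stopping size are made-up values, not taken from the paper.
class ZoomSelector:
    def __init__(self, screen_w, screen_h, zoom_per_update=1.05, select_width=16):
        self.screen_w, self.screen_h = screen_w, screen_h
        self.region = [0.0, 0.0, float(screen_w), float(screen_h)]  # x, y, w, h (screen coords)
        self.zoom = zoom_per_update
        self.select_width = select_width

    def update(self, gaze_x, gaze_y):
        """Shrink the magnified region around the gaze point; return the target when small enough."""
        x, y, w, h = self.region
        # Where on the original screen is the user looking? (the region is
        # currently magnified to fill the whole screen)
        px = x + (gaze_x / self.screen_w) * w
        py = y + (gaze_y / self.screen_h) * h
        # Shrink the region around that point: small corrective eye movements
        # keep re-centring it, which is what makes zooming noise tolerant.
        w, h = w / self.zoom, h / self.zoom
        self.region = [px - w / 2, py - h / 2, w, h]
        if w <= self.select_width:
            return (px, py)          # region is small enough: treat as a selection
        return None
```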

Two-step zoom
The two-step zoom activation is demonstrated in the video below by IT University of Copenhagen (ITU) research director, professor John Paulin Hansen. Notice how the error rate is reduced by the zooming style of interaction, making it suitable for applications that need detailed discrimination. It might be slower, but error rates drop significantly.



"Dwell is the traditional way of making selections by gaze. In the video we compare dwell to magnification and zoom. While the hit-rate is 10 % with dwell on a 12 x 12 pixels target, it is 100 % for both magnification and zoom. Magnification is a two-step process though, while zoom only takes on selection. In the experiment, the initiation of a selection is done by pressing the spacebar. Normally, the gaze tracking system will do this automatically when the gaze remains within a limited area for more than approx. 100 ms"

For more information see the publications of the ITU.

Inspiration: COGAIN

Much of the development seen in the field of gaze interaction stems from the assistive technology field, where users who are unable to use regular computer interfaces are provided with tools to empower their everyday life in a wide range of activities such as communication, entertainment and home control. For example, they can use their eyes to type words and sentences, which are then synthetically translated into spoken language by software, enabling communication beyond blinking. A major improvement in quality of life.

"COGAIN (Communication by Gaze Interaction) integrates cutting-edge expertise on interface technologies for the benefit of users with disabilities. COGAIN belongs to the eInclusion strategic objective of IST. COGAIN focuses on improving the quality of life for those whose life is impaired by motor-control disorders, such as ALS or CP. COGAIN assistive technologies will empower the target group to communicate by using the capabilities they have and by offering compensation for capabilities that are deteriorating. The users will be able to use applications that help them to be in control of the environment, or achieve a completely new level of convenience and speed in gaze-based communication. Using the technology developed in the network, text can be created quickly by eye typing, and it can be rendered with the user's own voice. In addition to this, the network will provide entertainment applications for making the life of the users more enjoyable and more equal. COGAIN believes that assistive technologies serve best by providing applications that are both empowering and fun to use."

A short introduction by Dr. Richard Bates, a research fellow at the School of Computing Sciences at De Montfort University in Leicester, can be downloaded either as presentation slides or as a paper.

The COGAIN network is a rich source of information on gaze interaction. A set of software tools developed within the network has been made publicly available for download. Make sure to check out the video demonstrations of various gaze interaction tools.

Participating organizations within the COGAIN network.