Thursday, February 3, 2011
EyeTech Digital Systems
Thursday, January 13, 2011
The Spring gaze interaction system from Taiwanese company Utechzone
The system runs Windows XP on a computer equipped with an Intel dual-core CPU, 2 GB RAM and a 500 GB hard drive, combined with a 17" monitor.
Supported languages are Traditional Chinese, Simplified Chinese, English and Japanese, all of them sizable markets. The price is unknown, but probably lower than a Tobii. Get the product brochure (pdf).
Tuesday, June 15, 2010
Speech Dasher: Fast Writing using Speech and Gaze (K. Vertanen & D. MacKay, 2010)
Abstract
Speech Dasher allows writing using a combination of speech and a zooming interface. Users first speak what they want to write and then they navigate through the space of recognition hypotheses to correct any errors. Speech Dasher's model combines information from a speech recognizer, from the user, and from a letter-based language model. This allows fast writing of anything predicted by the recognizer while also providing seamless fallback to letter-by-letter spelling for words not in the recognizer's predictions. In a formative user study, expert users wrote at 40 (corrected) words per minute. They did this despite a recognition word error rate of 22%. Furthermore, they did this using only speech and the direction of their gaze (obtained via an eye tracker).
- Speech Dasher: Fast Writing using Speech and Gaze
Keith Vertanen and David J.C. MacKay. CHI '10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, To appear. [Abstract+videos, PDF, BibTeX]
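The combination described in the abstract can be illustrated with a small sketch. The following is a minimal illustration, not the authors' code: next-letter probabilities are taken mostly from recognizer hypotheses that match what has been written so far, with a small uniform share reserved so that any word can still be spelled letter by letter. The hypothesis list, weights and function names are assumptions.

```python
# Hypothetical sketch of the Speech Dasher idea (not the published implementation):
# blend next-letter probabilities from matching recognizer hypotheses with a
# uniform letter-by-letter fallback. All values here are illustrative.
from collections import defaultdict

def next_letter_probs(prefix, hypotheses, alphabet, fallback_weight=0.05):
    """Estimate P(next letter | text typed so far).

    hypotheses: list of (sentence, probability) pairs from a speech recognizer.
    """
    scores = defaultdict(float)
    matched = 0.0
    for sentence, p in hypotheses:
        if sentence.startswith(prefix) and len(sentence) > len(prefix):
            scores[sentence[len(prefix)]] += p   # letter this hypothesis predicts next
            matched += p
    uniform = 1.0 / len(alphabet)
    probs = {}
    for ch in alphabet:
        hyp_part = scores[ch] / matched if matched > 0 else uniform
        probs[ch] = (1 - fallback_weight) * hyp_part + fallback_weight * uniform
    return probs

hyps = [("the cat sat", 0.6), ("the cap sat", 0.3), ("a cat sat", 0.1)]
probs = next_letter_probs("the ca", hyps, "abcdefghijklmnopqrstuvwxyz ")
print(max(probs, key=probs.get))  # 't' dominates, but every letter stays reachable
```

In a Dasher-style interface these probabilities would determine how much screen space each letter gets in the zooming display.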
Friday, September 18, 2009
The EyeWriter project
The Eyewriter from Evan Roth on Vimeo.
eyewriter tracking software walkthrough from thesystemis on Vimeo.
More information is found at http://fffff.at/eyewriter/
Tuesday, August 18, 2009
COGAIN Student Competition Results
"GazeTrain (illustrated in the screenshot below) is an action oriented puzzle game, that can be controlled by eye movements. In GazeTrain you must guide a train by placing track tiles in front of it. As you guide the train, you must collect various cargo and drop them off at the nearest city thereby earning money. For further details regarding how to play the game, we encourage you to read the tutorial accessible from the main menu. The game is quite customizable as the dwell time and several other parameters can be adjusted to best suit your play-style." (Source)
Runners-up, sharing second place, were
Music Editor, developed by Ainhoa Yera Gil, Public University of Navarre, Spain. Music Editor is a gaze-operated application that allows the user to compose, edit and play music by eye movements. The reviewers appreciated that "a user can not only play but can actually create something" and that "Music Editor is well suited for gaze control".
Gaze Based Sudoku, developed by Juha Hjelm and Mari Pesonen, University of Tampere, Finland. The game can be operated by eye movements and it has three difficulty levels. Reviewers especially appreciated how "the separation between viewing and controlling and between sudoku grid and number selection panel is solved" and that the game "has no time constraints" so it is "relaxing" to play.
Tuesday, August 11, 2009
ALS Society of British Columbia announces Engineering Design Awards (Canadian students only)
Project ideas:
- Low-cost eye tracker
- Issue: Current commercial eye-gaze tracking systems cost thousands to tens of thousands of dollars. The high cost of eye-gaze trackers prevents potential users from accessing eye-gaze tracking tools. The hardware components required for eye-gaze tracking do not justify the price, and a lower-cost alternative is desirable. Webcams may be used for low-cost imaging, along with simple infrared diodes for system lighting. Alternatively, visible-light systems may also be investigated. Open-source eye-gaze tracking software is also available. (ed: ITU GazeTracker, OpenEyes, Track Eye, OpenGazer and MyEye (free, no source); a rough webcam pupil-detection sketch follows this list)
- Goal: The goal of this design project is to develop a low-cost and usable eye-gaze tracking system based on simple commercial off-the-shelf hardware.
- Deliverables: A working prototype of a functional, low-cost (< $200), eye-gaze tracking system.
- Eye-glasses compensation
- Issue: The use of eye-glasses can cause considerable problems in eye-gaze tracking. The issue stems from reflections off the eye-glasses caused by the controlled infrared lighting (on- and off-axis light sources) used to highlight features of the face. The key features of interest are the pupils and glints (reflections off the surface of the cornea). Incorrectly identifying the pupils and glints then results in invalid estimation of the point of gaze.
- Goal: The goal of this design project is to develop techniques for either: 1) avoiding image corruption with eye-glasses on a commercial eye-gaze tracker, or 2) developing a controlled lighting scheme that ensures valid identification of the pupils and glints in the presence of eye-glasses.
- Deliverables: Two forms of deliverables are possible: 1) A working prototype illustrating functional eye-gaze tracking in the presence of eye-glasses with a commercial eye-gaze tracker, or 2) A working prototype illustrating accurate real-time identification of the pupil and glints using controlled infrared lighting (on and off axis light sources) in the presence of eye-glasses.
- Innovative selection with ALS and eye gaze
- Issue: As mobility steadily decreases in the more advanced stages of ALS, alternative techniques for selection are required. Current solutions include head switches, sip-and-puff switches and dwell-time activation, depending on the degree of mobility loss, to name a few. Dwell-time activation requires no mobility other than eye motion; however, it suffers from 'lag', since the user must wait the full dwell duration for each selection, and from the 'Midas touch' problem, in which unintended selections occur if the gaze point is stationary for too long. (ed: a minimal dwell-time sketch follows at the end of this section)
- Goal: The goal of this design project is to develop a technique for improved selection with eye gaze for individuals with only eye motion available. Possible solutions may involve novel HCI designs for interaction, including various adaptive and predictive technologies, the consideration of contextual cues, and the introduction of ancillary inputs such as EMG or EEG.
- Deliverables: A working prototype illustrating eye-motion only selection with a commercial eye-gaze tracking system.
- Novel and valuable eye-gaze tracking applications and application enhancements
- Issue: To date, relatively few gaze-tracking applications have been developed. These include relatively simplistic applications such as the tedious typing of words, and even in such systems, little is done to ease the effort required, e.g., systems typically do not allow for the saving and reuse of words and sentences.
- Goal: The goal of this design project is to develop one or more novel applications or application enhancements that take gaze as input, and that provide new efficiencies or capabilities that could significantly improve the quality of life of those living with ALS.
- Deliverables: A working prototype illustrating one or more novel applications that take eye-motion as an input. The prototype must be developed and implemented to the extent that an evaluation of the potential efficiencies and/or reductions in effort can be evaluated by persons living with ALS and others on an evaluation panel.
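As referenced in the low-cost tracker idea above, here is a hedged sketch of one possible starting point: detecting the dark pupil in a single webcam frame with OpenCV. It is an illustration under assumed threshold and blob-size values, not a working tracker, and it ignores calibration, glint detection and gaze estimation entirely.

```python
# Hedged sketch (assumed parameters, far from a complete tracker): find the
# darkest compact blob in a webcam frame as a rough pupil candidate.
import cv2

def find_pupil(frame_bgr, dark_threshold=40, min_area=50):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # the pupil is usually the darkest compact region in the eye image
    _, mask = cv2.threshold(gray, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]  # OpenCV 3 and 4
    candidates = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not candidates:
        return None
    (x, y), radius = cv2.minEnclosingCircle(max(candidates, key=cv2.contourArea))
    return x, y, radius

cap = cv2.VideoCapture(0)   # first attached webcam
ok, frame = cap.read()
if ok:
    print(find_pupil(frame))
cap.release()
```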
See the Project Ideas for more information. For contact information see page two of the announcement.
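And, as referenced in the selection-technique idea, a minimal sketch of plain dwell-time activation. The class name, dwell duration and sample rate are assumptions, not part of the announcement; it simply shows where the 'lag' and the Midas-touch trade-off come from.

```python
# Illustrative sketch of dwell-time selection (assumed values, not from the
# announcement). A target activates only after the gaze stays on it for a full
# dwell period; leaving the target resets the timer.

DWELL_TIME = 0.5  # seconds of continuous fixation required for a selection

class DwellSelector:
    def __init__(self, dwell_time=DWELL_TIME):
        self.dwell_time = dwell_time
        self.current = None      # target currently under the gaze
        self.enter_time = None   # when the gaze entered that target

    def update(self, target, timestamp):
        """Feed one gaze sample; return the target if it was just selected."""
        if target != self.current:
            self.current = target
            self.enter_time = timestamp
            return None
        if target is not None and timestamp - self.enter_time >= self.dwell_time:
            self.enter_time = timestamp + 1e9  # block repeats until the gaze moves away
            return target
        return None

# Example: gaze samples arriving at 50 Hz over the "Save" button
selector = DwellSelector()
for i in range(30):
    hit = selector.update("Save", i * 0.02)
    if hit:
        print(f"selected {hit} after {i * 0.02:.2f} s")
```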
Thursday, August 6, 2009
Päivi Majaranta PhD Thesis on Text Entry by Eye Gaze
- Majaranta, P. (2009) Text Entry by Eye Gaze. Dissertations in Interactive Technology, number 11, University of Tampere (ISBN 978-951-44-7786-7). Also available in Acta Electronica Universitatis Tamperensis; 869 (978-951-44-7787-4).
Monday, June 29, 2009
Video from COGAIN2009
Monday, June 1, 2009
COGAIN 2009 Proceedings now online
Wednesday, May 13, 2009
Hi-fi eyetracking with a lo-fi eyetracker: An experimental usability study of an eyetracker built from a standard web camera (Barret, M., 2009)
"Innovation has facilitated sufficient mainstream technology to build eyetrackers from off-the-shelf-components. Prices for standard eyetrackers start at around € 4000. This thesis describes an experimental usabilty study of gazetyping with a new input device built from a standard web camera without hardware modifications. Cost: € 20. Mainstreaming of assistive technologies holds potential for faster innovation, better service, lower prices and increased accessibility. Off-the-shelf-eyetrackers must be usability competitive to standard eyetrackers in order to be adopted, as eyetracking - even with expensive hardware - presents usability issues. Usability is defined as effectiveness, efficiency and user satisfaction (ISO 9242-11, 1998).
Results from the 2 × 2 factor experiment indicate that the new input device can reach the usability standards of expensive eyetrackers. This study demonstrates that the off-the-shelf eyetracker can achieve efficiency similar to an expensive eyetracker, with no significant effect from any of the tested factors. All four factors have a significant impact on effectiveness. A factor that can eliminate the effectiveness difference between the standard hardware and an expensive eyetracker is identified. Another factor can additionally improve effectiveness.
Two gazetyping systems specifically designed for noisy conditions, e.g. due to bad calibration and jolting, are tested. StarGazer uses a zooming interface and GazeTalk uses large buttons in a static graphical user interface. GazeTalk is significantly more effective than StarGazer. The large on-screen buttons and static interface of GazeTalk with dwell-time activation absorb the noise from the input device, and the typing speeds obtained are comparable to prior research with a regular eyetracker. Click activation has for years (Ware & Mikaelian, 1987) been shown to improve the efficiency of gaze-based interaction. This experiment demonstrates that this result applies to off-the-shelf eyetrackers as well. The input device relies on the user to compensate for offset with head movements. The keyboards should support this task with a static graphical user interface." Download thesis as pdf (in Danish)
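The point about large static buttons absorbing tracker noise can be made concrete with a small sketch. This is not from the thesis: a moving-average filter smooths the raw webcam gaze samples, and the smoothed point is then mapped onto a coarse GazeTalk-like button grid. Window size, screen resolution and grid dimensions are assumed values.

```python
# Hedged sketch (not from the thesis): smooth noisy web-camera gaze samples with
# a moving average before mapping them onto large on-screen buttons.
from collections import deque

class GazeSmoother:
    def __init__(self, window=8):
        self.samples = deque(maxlen=window)

    def add(self, x, y):
        """Add a raw gaze sample and return the smoothed point."""
        self.samples.append((x, y))
        n = len(self.samples)
        return (sum(s[0] for s in self.samples) / n,
                sum(s[1] for s in self.samples) / n)

def button_under(x, y, screen_w=1280, screen_h=1024, cols=3, rows=4):
    """Map a smoothed gaze point to a cell in a coarse GazeTalk-like grid."""
    col = min(int(x / screen_w * cols), cols - 1)
    row = min(int(y / screen_h * rows), rows - 1)
    return row, col

smoother = GazeSmoother()
sx, sy = smoother.add(642.0, 515.0)
print(button_under(sx, sy))
```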
Wednesday, May 6, 2009
The Dias Eye Tracker (Mardanbeigi, 2009)
Monday, November 24, 2008
Our gaze-controlled robot on DR News
View video
Tuesday, June 3, 2008
Eye typing at the Bauhaus University of Weimar
"Qwerty is based on dwell time selection. Here the user has to stare for 500 ms a determinate character to select it. QWERTY served us, as comparison base line for the new eye typing systems. It was implemented in C++ using QT libraries."
IWrite
"A simple way to perform a selection based on saccadic movement is to select an item by looking at it and confirm its selection by gazing towards a defined place or item. Iwrite is based on screen buttons. We implemented an outer frame as screen button. That is to say, characters are selected by gazing towards the outer frame of the application. This lets the text window in the middle of the screen for comfortable and safe text review. The order of the characters, parallel to the display borders, should reduce errors like the unintentional selection of items situated in the path as one moves across to the screen button.The strength of this interface lies on its simplicity of use. Additionally, it takes full advantage of the velocity of short saccade selection. Number and symbol entry mode was implemented for this editor in the lower frame. Iwrite was implemented in C++ using QT libraries."
"Pie menus have already been shown to be powerful menus for mouse or stylus control. They are two-dimensional, circular menus, containing menu items displayed as pie-formed slices. Finding a trade-off between user interfaces for novice and expert users is one of the main challenges in the design of an interface, especially in gaze control, as it is less conventional and utilized than input controlled by hand. One of the main advantages of pie menus is that interaction is very easy to learn. A pie menu presents items always in the same position, so users can match predetermined gestures with their corresponding actions. We therefore decided to transfer pie menus to gaze control and try it out for an eye typing approach. We designed the Pie menu for six items and two depth layers. With this configuration we can present (6 x 6) 36 items. The first layer contains groups of five letters ordered in pie slices.."
StarWrite
"In StarWrite, selection is also based on saccadic movements to avoid dwell times. The idea of StarWrite is to combine eye typing movements with feedback. Users, mostly novices, tend to look at the text field after each selection to check what has been written. Here letters are typed by dragging them into the text field. This provides instantaneous visual feedback and should spare checking saccades towards the text field. When a character is fixated, both it and its neighbors are highlighted and enlarged in order to facilitate the character selection. In order to use both x- and y-coordinates for target selection, letters were arranged alphabetically on a half-circle in the upper part of the monitor. The text window appears in the lower field. StarWrite provides lower-case, upper-case and numerical entry modes, which can be switched by fixating the corresponding buttons, situated on the lower part of the application, for 500 milliseconds. The space, delete and enter keys are also placed there and are likewise driven by a 500 ms dwell time. StarWrite was implemented in C++ using OpenGL libraries for the visualization."
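The half-circle arrangement can be sketched as follows; the screen width, margin and radius are assumed values, and this is not the original StarWrite layout code.

```python
# Hedged sketch of a StarWrite-like layout (assumed sizes): place the 26 letters
# alphabetically on a half-circle in the upper part of the screen, so both x-
# and y-coordinates help discriminate targets.
import math
import string

def half_circle_layout(width=1024, top_margin=60, radius=420):
    centre_x, centre_y = width / 2, top_margin + radius
    letters = string.ascii_uppercase
    positions = {}
    for i, ch in enumerate(letters):
        # spread A..Z from 180 degrees (left) to 0 degrees (right) along the arc
        angle = math.pi * (1 - i / (len(letters) - 1))
        positions[ch] = (centre_x + radius * math.cos(angle),
                         centre_y - radius * math.sin(angle))
    return positions

layout = half_circle_layout()
print(layout["A"], layout["M"], layout["Z"])  # left end, near the top, right end
```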
Associated publications
- Huckauf, A. and Urbina, M. H. 2008. Gazing with pEYEs: towards a universal input for various applications. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (Savannah, Georgia, March 26 - 28, 2008). ETRA '08. ACM, New York, NY, 51-54. [URL] [PDF] [BIB]
- Urbina, M. H. and Huckauf, A. 2007. Dwell time free eye typing approaches. In Proceedings of the 3rd Conference on Communication by Gaze Interaction - COGAIN 2007, September 2007, Leicester, UK, 65--70. Available online at http://www.cogain.org/cogain2007/COGAIN2007Proceedings.pdf [PDF] [BIB]
- Huckauf, A. and Urbina, M. 2007. Gazing with pEYE: new concepts in eye typing. In Proceedings of the 4th Symposium on Applied Perception in Graphics and Visualization (Tubingen, Germany, July 25 - 27, 2007). APGV '07, vol. 253. ACM, New York, NY, 141-141. [URL] [PDF] [BIB]
- Urbina, M. H. and Huckauf, A. 2007. pEYEdit: Gaze-based text entry via pie menus. In Conference Abstracts. 14th European Conference on Eye Movements ECEM2007. Kliegl, R. & Brenstein, R. (Eds.) (2007), 165-165.
Tuesday, May 6, 2008
Gaze interaction hits mainstream news
Great to see mainstream interest in gaze-driven interaction. Gaming is truly one area with huge potential, but it also depends on more accessible eye trackers. There is a movement towards open-source eye tracking, but robustness for everyday usage still remains elusive. The system Stephen Vickers has developed uses the Tobii X120 eye tracker, which is clearly out of reach for all but the small group of users who are granted financial support for their much-needed assistive technology.
Have faith
In general, all new technology initially comes at a high cost due to intensive research and development but over time becomes accessible to the larger population. As an example, a decade or two ago not many could imagine that satellite GPS navigation would become commonplace and really cheap. Today, mass collaboration on the net is really happening, making the rate of technological development exponential. Make sure to watch the Google Tech Talk by Don Tapscott on Wikinomics.
Tuesday, April 15, 2008
Alea Technologies
A clear use case for their technology is users with disabilities such as ALS. The software contains a "desktop" system which acts as a launcher for other applications (native Windows applications, Grid, COGAIN tools). In general, they seem to target Tobii Technology, which has been very successful with its MyTobii application running on the P10 eye tracker. The game is on.
Quote:
Tracking technology | Hybrid infrared video eye- and head-tracking; binocular and monocular tracking
Working volume, centered at 600 mm distance | 300 x 200 x 200 mm³ [W x H x D]
Accuracy, static | 0.5°, typical
Accuracy, over full working volume | 1°, typical
Sampling rate | 50 Hz
Max. head-movement velocity | 15 cm/s
Recovery time after tracking loss (head was too fast or moved out of range) | 40 ms
System dimensions | ca. 300 x 45 x 80 mm³ [W x H x D]
Mounting options | On monitor via VESA adapter; on Tablet PC via customized interfaces
System weight | ca. 1.2 kg
Thursday, March 27, 2008
RApid GAze-Based Interaction Techniques (RAGABITS)
Stephen Vickers at the Computer Human Interaction Research Group at De Montfort University, UK, has developed interaction techniques that allow gaze-based control of several popular online virtual worlds such as World of Warcraft or Second Life. This research will be presented at ETRA 2008, US, under the title RAGABITS (RApid GAze-Based Interaction Techniques) and is especially intended for users with severe motor impairments.
The selection method seems stable. None of the usual jitter can be seen. Nice!
Quote from http://www.ioct.dmu.ac.uk/projects/eyegaze.html
"Online virtual worlds and games (MMORPG's) have much to offer users with severe motor disabilities. It gives this user group the opportunity as entirely able-bodied to others in the virtual world. if they so wish. The extent to which a user has to reveal their disability becomes a privacy issue. Many of the avatars in Second Life appear as stylized versions of the users that control them and that stylization is the choice of the user. This choice is equally appropriate for disabled users. While the appearance of the user's avatar may not reveal the disability of the person that controls it, the behavior and speed or interaction in the world may do.
Many users with severe motor impairments may not be able to operate a keyboard or hand mouse and may also struggle with speech and head movement. Eye gaze is one method of interaction that has been used successfully in enabling access to desktop environments. However, simply emulating a mouse using eye gaze is not sufficient for interaction in online virtual worlds, and the user's privacy can be exposed unless efficient gaze-based interaction techniques, appropriate to activities in online worlds and games, can be provided."
Wednesday, February 20, 2008
Inspiration: ZoomNavigator (Skovsgaard, 2008)
Abstract
The goal of this research is to estimate the maximum amount of noise of a pointing device that still makes interaction with a Windows interface possible. This work proposes zoom as an alternative activation method to the more well-known interaction methods (dwell and two-step dwell activation). We present a magnifier called ZoomNavigator that uses the zoom principle to interact with an interface. Selection by zooming was tested with white noise in a range of 0 to 160 pixels in radius on an eye tracker and a standard mouse. The mouse was found to be more accurate than the eye tracker. The zoom principle applied allowed successful interaction with the smallest targets found in the Windows environment, even with noise up to about 80 pixels in radius. The work suggests that zoom interaction gives the user the possibility to make corrective movements during activation time, eliminating the waiting time found in all types of dwell activation. Furthermore, zooming can be a promising way to compensate for inaccuracies of low-resolution eye trackers, or for instance when people have problems controlling the mouse due to hand tremor.
The sequence of images consists of screenshots from ZoomNavigator showing a zoom towards a Windows file called ZoomNavigator.exe.
Two-step zoom
The two-step zoom activation is demonstrated in the video below by IT University of Copenhagen (ITU) research director prof. John Paulin Hansen. Notice how the error rate is reduced by the zooming style of interaction, making it suitable for applications that need detailed discrimination. It might be slower, but error rates drop significantly.
"Dwell is the traditional way of making selections by gaze. In the video we compare dwell to magnification and zoom. While the hit-rate is 10 % with dwell on a 12 x 12 pixels target, it is 100 % for both magnification and zoom. Magnification is a two-step process though, while zoom only takes on selection. In the experiment, the initiation of a selection is done by pressing the spacebar. Normally, the gaze tracking system will do this automatically when the gaze remains within a limited area for more than approx. 100 ms"
For more information see the publications of the ITU.
Inspiration: COGAIN
"COGAIN (Communication by Gaze Interaction) integrates cutting-edge expertise on interface technologies for the benefit of users with disabilities. COGAIN belongs to the eInclusion strategic objective of IST. COGAIN focuses on improving the quality of life for those whose life is impaired by motor-control disorders, such as ALS or CP. COGAIN assistive technologies will empower the target group to communicate by using the capabilities they have and by offering compensation for capabilities that are deteriorating. The users will be able to use applications that help them to be in control of the environment, or achieve a completely new level of convenience and speed in gaze-based communication. Using the technology developed in the network, text can be created quickly by eye typing, and it can be rendered with the user's own voice. In addition to this, the network will provide entertainment applications for making the life of the users more enjoyable and more equal. COGAIN believes that assistive technologies serve best by providing applications that are both empowering and fun to use."
A short introduction by Dr Richard Bates, a research fellow at the School of Computing Sciences at the De Montfort University in Leicester, can be downloaded either as presentation slides or paper.
The COGAIN network is a rich source of information on gaze interaction. A set of tools developed within the network has been made publicly available for download. Make sure to check out the video demonstrations of various gaze interaction tools.
Participating organizations within the COGAIN network.
Finland
University of Tampere (UTA)
Coordinator
Denmark
IT University of Copenhagen (ITU)
Vice Coordinator
Bispebjerg Hospital (BH)
Danish Centre for Assistive Technology (DC)
Risø National Laboratory (RISOE)
Danmarks Tekniske Universitet (DTU)
Germany
Technische Universität Dresden (TU Dresden)
Universität Koblenz-Landau (UNI KO-LD)
Universität zu Lübeck (UzL)
Italy
Hewlett Packard Italiana SRL (HP)
Politecnico di Torino (POLITO)
Lithuania
Sweden
Tobii Technology (Tobii)
Västra Götalands Läns Landsting (Sahlgrenska University Hospital) (DART)
United Kingdom
ACE Centre Advisory Trust Ltd (ACE)
The Chancellor, Masters and Scholars of the University of Cambridge (UCAM)
De Montfort University (DMU)
Loughborough University (LBORO)
Japan
Tokyo Institute of Technology (TIT)
Switzerland
Universität Zürich (UNIZH)
Spain
Universidad Publica de Navarra (UPNA)
Czech Republic
Czech Technical University (CTU)
France
Metrovision (MV)
United States
LC Technologies (LC)
EyeTech Digital Systems (EYETECH)