It appears video games can be beneficial for your eyes, despite what mother said. I came across this article in the British Daily Mail, found it inspiring, and believe it could be done even better with an interactive application using real-time gaze tracking input. Direct quote:
"A six-year-old boy who nearly went blind in one eye can now see again after he was told to play on a Nintendo games console. Ben Michaels suffered from amblyopia, or severe lazy eye syndrome in his right eye from the age of four. His vision had decreased gradually in one eye and without treatment his sight loss could have become permanent. His GP referred him to consultant Ken Nischal who prescribed the unusual daily therapy. Ben, from Billericay, Essex, spends two hours a day playing Mario Kart on a Nintendo DS with his twin Jake. Ben wears a patch over his good eye to make his lazy one work harder. The twins' mother, Maxine, 36, said that from being 'nearly blind' in the eye, Ben's vision had 'improved 250 per cent' in the first week. She said: 'When he started he could not identify our faces with his weak eye. Now he can read with it although he is still a way off where he ought to be. 'He was very cooperative with the patch, it had phenomenal effect and we’re very pleased.' Mr Nischal of Great Ormond Street Children's Hospital, said the therapy helped children with weak eyesight because computer games encourage repetitive eye movement, which trains the eye to focus correctly. 'A games console is something children can relate to. It allows us to deliver treatment quicker,' he said. 'What we don’t know is whether improvement is solely because of improved compliance, ie the child sticks with the patch more, or whether there is a physiological improvement from perceptual visual learning.' The consultant added that thousands of youngsters and adults could benefit from a similar treatment." (source)
Monday, June 28, 2010
Tuesday, June 15, 2010
Speech Dasher: Fast Writing using Speech and Gaze (K. Vertanen & D. MacKay, 2010)
A new version of the Dasher typing interface utilizes speech recognition, provided by the CMU PocketSphinx software, and doubles typing performance as measured in words per minute: from a previous 20 WPM to 40 WPM, close to what a professional keyboard jockey may produce.
Abstract
Speech Dasher allows writing using a combination of speech and a zooming interface. Users first speak what they want to write and then they navigate through the space of recognition hypotheses to correct any errors. Speech Dasher’s model combines information from a speech recognizer, from the
user, and from a letter-based language model. This allows fast writing of anything predicted by the recognizer while also providing seamless fallback to letter-by-letter spelling for words not in the recognizer’s predictions. In a formative user study, expert users wrote at 40 (corrected) words per
minute. They did this despite a recognition word error rate of 22%. Furthermore, they did this using only speech and the direction of their gaze (obtained via an eye tracker).
- Speech Dasher: Fast Writing using Speech and Gaze
Keith Vertanen and David J.C. MacKay. CHI '10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, To appear. [Abstract+videos, PDF, BibTeX]
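The abstract hints at how the correction step works: letter predictions derived from the recognizer's hypotheses are blended with a generic letter-based language model, so confirming a correctly recognized word is fast while out-of-vocabulary words can still be spelled letter by letter. Below is a minimal Python sketch of that mixing idea; it is my own illustration rather than the authors' implementation, and the function names, the mixing weight LAMBDA and the toy data are assumptions.

```python
# Hypothetical sketch of Speech Dasher's core idea (not the authors' code):
# mix letter predictions derived from the recognizer's hypotheses with a
# letter-based language model, so predicted words are cheap to confirm while
# unexpected words fall back to letter-by-letter spelling.

LAMBDA = 0.9  # assumed interpolation weight between recognizer and letter LM

def letter_probs_from_hypotheses(prefix, hypotheses):
    """Distribution over the next letter, given recognition hypotheses
    (text, probability) that are consistent with the text accepted so far."""
    scores, total = {}, 0.0
    for text, p in hypotheses:
        if text.startswith(prefix) and len(text) > len(prefix):
            c = text[len(prefix)]
            scores[c] = scores.get(c, 0.0) + p
            total += p
    return {c: s / total for c, s in scores.items()} if total else {}

def combined_letter_probs(prefix, hypotheses, letter_lm):
    """Interpolate recognizer-derived predictions with a plain letter model
    (here simply a dict of letter probabilities)."""
    from_reco = letter_probs_from_hypotheses(prefix, hypotheses)
    return {c: LAMBDA * from_reco.get(c, 0.0) + (1 - LAMBDA) * letter_lm.get(c, 0.0)
            for c in set(from_reco) | set(letter_lm)}

# Toy example: after accepting "the ca", the letter 't' dominates because
# the recognizer's best hypotheses continue with "cat".
hyps = [("the cat sat", 0.6), ("the cap sat", 0.3), ("a cat sat", 0.1)]
print(combined_letter_probs("the ca", hyps, {"t": 0.10, "p": 0.05, "e": 0.12}))
```

In the actual interface the navigation happens visually: Dasher allocates screen space to letters in proportion to their probability, so letters with high combined probability are large and quick to steer into with gaze.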
Labels: assistive technology, interface design, typing
Wednesday, May 26, 2010
Abstracts from SWAET 2010
The booklet containing the abstracts for the Scandinavian Workshop on Applied Eye Tracking (SWAET) is now available for download (55 pages, about 1 MB). The abstracts span a wide range of topics, from gaze interaction to behavior and perception. The short one-page format makes it easy to venture into a multitude of domains and acts as a nice little starting point for digging deeper. A shame I couldn't attend; maybe next year. Kudos for making this booklet available.
Title | Authors |
Eye movements during mental imagery are not perceptual re-enactments | R. Johansson, J. Holsanova, K. Holmqvist |
Practice eliminates "looking at nothing" | A. Scholz, K. Mehlhorn, J.F. Krems |
Learning Perceptual Skills for Medical Diagnosis via Eye Movement Modeling Examples on Patient Video Cases | H. Jarodzka, T. Balslev, K. Holmqvist, K. Scheiter, M. Nyström, P. Gerjets, B. Eika |
Objective, subjective, and commercial information: The impact of presentation format on the visual inspection and selection of Web search results | Y. Kammerer, P. Gerjets |
Eye Movements and levels of attention: A stimulus driven approach | F.B. Mulvey, K. Holmqvist, J.P. Hansen |
Player's gaze in a collaborative Tetris game | P. Jermann, M-A Nüssli, W. Li |
Naming associated objects: Evidence for parallel processing | L. Mortensen, A.S. Meyer |
Reading Text Messages - An Eye-Tracking Study on the Influence of Shortening Strategies on Reading Comprehension | V. Heyer, H. Hopp |
Eye movement measures to study the online comprehension of long (illustrated) texts | J. Hyönä, J.K. Kaakinen |
Self-directed Learning Skills in Air-traffic Control: A Cued Retrospective Reporting Study | L.W. van Meeuwen, S. Brand-Gruwel, J.J.G. van Merriënboer, J.J.P.R. de Bock, P.A. Kirschner |
Drivers' characteristic sequences of eye and head movements in intersections | A. Bjelkemyr, K. Smith |
Comparing the value of different cues when using the retrospective think aloud method in web usability testing with eye tracking | A. Olsen |
Gaze behavior and instruction sensitivity of Children with Autism Spectrum Disorders when viewing pictures of social scenes | B. Rudsengen, F. Volden |
Impact of cognitive workload on gaze-including interaction | S. Trösterer, J. Dzaack |
Interaction with mainstream interfaces using gaze alone | H. Skovsgaard, J. P. Hansen, J.C. Mateo |
Stereoscopic Eye Movement Tracking: Challenges and Opportunities in 3D | G. Öqvist Seimyr, A. Appelholm, H. Johansson, R. Brautaset |
Sampling frequency – what speed do I need? | R. Andersson, M. Nyström, K. Holmqvist |
Effect of head-distance on raw gaze velocity | M-A Nüssli, P. Jermann |
Quantifying and modelling factors that influence calibration and data quality | M. Nyström, R. Andersson, J. van de Weijer |
Labels: HumLab, Lund Universitet, SWAET
Monday, May 24, 2010
EyePhone - Mobile gaze interaction from Dartmouth College
From Emiliano Miluzzo and the group at Sensorlab, part of the Computer Science department at Dartmouth College, comes EyePhone, which enables rudimentary gaze-based interaction on mobile devices. Contemporary devices often rely on touch-based interaction, which creates an occlusion problem where the hands cover large parts of the display; EyePhone could help alleviate this issue. The prototype system demonstrated offers enough accuracy for an interface based on a 3x3 grid layout, but with better hardware and algorithms there is little reason why this couldn't improve. However, a major issue with a mobile system is precisely its mobility, of both the user and the hardware: in practice this means that not only individual head movements have to be compensated for but also movements of the camera in essentially all degrees of freedom. Not an easy thing to solve, but it's not a question of "if" but "when". Perhaps something could be done using the angular position sensors many mobile devices already have embedded. This is an excellent first step with thrilling potential. Additional information is available in the MIT Technology Review article.
Abstract
As smartphones evolve researchers are studying new techniques to ease the human-mobile interaction. We propose EyePhone, a novel "hands free" interfacing system capable of driving mobile applications/functions using only the user's eyes movement and actions (e.g., wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to: i) track the eye and infer its position on the mobile phone display as a user views a particular application; and ii) detect eye blinks that emulate mouse clicks to activate the target application under view. We present a prototype implementation of EyePhone on a Nokia 810, which is capable of tracking the position of the eye on the display, mapping this position to a function that is activated by a wink. At no time does the user have to physically touch the phone display.
- Emiliano Miluzzo, Tianyu Wang, Andrew T. Campbell, EyePhone: Activating Mobile Phones With Your Eyes. To appear in Proc. of The Second ACM SIGCOMM Workshop on Networking, Systems, and Applications on Mobile Handhelds (MobiHeld'10), New Delhi, India, August 30, 2010. [pdf] [video]
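The 3x3 grid interaction described above is easy to picture in code. The following is a hedged sketch of the general idea rather than the EyePhone implementation: a normalized per-frame eye-position estimate is binned into one of nine cells, and a detected wink activates whatever function is assigned to the fixated cell. The grid contents, function names and callback are all made up for illustration.

```python
# Hypothetical sketch of a 3x3 gaze grid with wink activation (not EyePhone code).

GRID = [["mail",  "camera", "music"],
        ["phone", "maps",   "clock"],
        ["web",   "photos", "settings"]]  # assumed assignment of cells to functions

def cell_for_gaze(x_norm, y_norm, rows=3, cols=3):
    """Map a normalized eye-position estimate (0..1 in display coordinates)
    to a (row, col) grid cell."""
    col = min(int(x_norm * cols), cols - 1)
    row = min(int(y_norm * rows), rows - 1)
    return row, col

def on_frame(x_norm, y_norm, wink_detected, launch):
    """Per-frame handler; 'launch' is whatever callback opens an application."""
    row, col = cell_for_gaze(x_norm, y_norm)
    if wink_detected:
        launch(GRID[row][col])

# An eye estimated at the centre of the display plus a wink activates "maps".
on_frame(0.5, 0.5, True, lambda app: print("launching", app))
```

A coarse grid like this is forgiving of the noisy position estimates a front-facing camera delivers, which is presumably why the prototype settles for nine targets.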
Labels: eye tracker, hci, inspiration, mobility, prototype
Thursday, May 20, 2010
Magnetic Eye Tracking Device from Arizona State University
A group of students at Arizona State University has revisited the scleral search coil to develop a new low-cost Magnetic Eye Tracking Device (METD). The entrepreneurs aim to make this technology available to the public at an affordable $4,000 and are primarily targeting people with disabilities. More information is available at ASU News.
If you're new to eye tracking, it should be noted that the reporter's claim that common video-based systems use infrared lasers is simply wrong. They use light sources operating in the IR spectrum (similar to the LED in your remote control).
Labels: eye tracker, low cost
Friday, April 30, 2010
GazePad: Low-cost remote webcam eye tracking
Came across the GazeLib low-cost remote eye tracking project today, which uses ordinary webcams without IR illumination. The accuracy is pretty low, but it's really nice to see another low-cost approach for assistive technology.
"GazeLib is a programming library which making real-time low-cost gaze tracking becomes possible. The library provide functions performing remote gaze tracking under ambient lighting condition using a single, low cost, off-the-shelf webcam. Developers can easily build gaze tracking technologies implemented applications in only few lines of code. GazeLib project focuses on promoting gaze tracking technology to consumer-grade human computer interfaces by reducing the price, emphasizing ease-of-use, increasing the extendibility, and enhancing the flexibility and mobility."
"GazeLib is a programming library which making real-time low-cost gaze tracking becomes possible. The library provide functions performing remote gaze tracking under ambient lighting condition using a single, low cost, off-the-shelf webcam. Developers can easily build gaze tracking technologies implemented applications in only few lines of code. GazeLib project focuses on promoting gaze tracking technology to consumer-grade human computer interfaces by reducing the price, emphasizing ease-of-use, increasing the extendibility, and enhancing the flexibility and mobility."
Labels: eye tracker, low cost, prototype
Monday, April 26, 2010
Freie Universität Berlin presents gaze controlled car
From the Freie Universität in Berlin comes a working prototype of a system that allows direct steering of a car by eye movements alone. The prototype was demonstrated in front of a large group of journalists at the former Berlin Tempelhof Airport. Gaze data from a head-mounted SMI eye tracker is fed into the control system of the Spirit of Berlin, a platform for autonomous navigation. Similar to the gaze-controlled robot we presented at CHI '09, the platform couples the turning of the wheels to the gaze coordinate space (e.g., look left and the car steers left). Essentially it is a mapping onto a 2D plane where deviations from the center issue steering commands and the degree of turning is modulated by the size of the deviation, as sketched below. Potentially interesting when coupled with other sensors that in combination offer driver support, for example warning when there is an object in the vehicle's path that the driver has not seen. Not to mention scenarios involving individuals with disabilities and/or machine learning. The work has been carried out under the guidance of professor Raúl Rojas as part of the AutoNOMOS project, which has been running since 2006, inspired by the Stanford autonomous car project.
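To make the mapping concrete, here is a minimal sketch assuming a normalized horizontal gaze coordinate in [-1, 1] with 0 meaning straight ahead. It is not the Freie Universität code, only an illustration of the principle described above: deviations from the centre issue steering commands, and the size of the deviation modulates how sharply the wheels turn. The constants are assumptions.

```python
# Hypothetical gaze-to-steering mapping (not the Spirit of Berlin code).

MAX_STEER_DEG = 30.0   # assumed maximum wheel angle
DEAD_ZONE = 0.1        # ignore small deviations so the car holds its course

def gaze_to_steering(gaze_x):
    """Convert a horizontal gaze offset (-1..1) into a wheel angle in degrees;
    negative means steer left."""
    if abs(gaze_x) < DEAD_ZONE:
        return 0.0
    # Rescale the range outside the dead zone onto the full steering range.
    scaled = (abs(gaze_x) - DEAD_ZONE) / (1.0 - DEAD_ZONE)
    return MAX_STEER_DEG * scaled * (1 if gaze_x > 0 else -1)

# Looking halfway towards the left edge turns the wheels about 13 degrees left
# under these assumed constants.
print(gaze_to_steering(-0.5))
```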
More info in the press release.
0
comments
Labels:
navigation,
prototype
Wednesday, April 14, 2010
Open-source gaze tracker awarded Research Pearls of ITU Copenhagen
The open-source eye tracker ITU Gaze Tracker, primarily developed by Javier San Agustin, Henrik Skovsgaard and myself, has been awarded the Research Pearls of the IT University of Copenhagen. A presentation will be held at ITU on May 6th at 2pm. The software, released one year ago, has seen more than 5,000 downloads by students and hobbyists around the world. It is rapidly approaching a new release which will offer better performance and stability for remote tracking and many bug fixes in general. The new version adds support for a whole range of new HD web cameras, which provide vastly improved image quality and finally bring hope for a low-cost, open, flexible and reasonably performing solution. The ambitious goal is to make eye tracking technology available to everyone, regardless of available resources. Follow the developments at the forum. Additional information is available at the ITU Gaze Group.
"The Open-Source ITU Gaze Tracker"
Abstract:
Gaze tracking offers the possibility of interacting with a computer by just using eye movements, thereby making users more independent. However, some people (for example users with a severe disability) are excluded from access to gaze interaction due to the high prices of commercial systems (above €10,000). Gaze tracking systems built from low-cost and off-the-shelf components have the potential of facilitating access to the technology and bringing prices down.
The ITU Gaze Tracker is an off-the-shelf system that uses an inexpensive web cam or a video camera to track the user’s eye. It is free and open-source, offering users the possibility of trying out gaze interaction technology for a cost as low as 20€, and to adapt and extend the software to suit specific needs.
In this talk we will present the open-source ITU Gaze Tracker and show the different scenarios in which the system has been used and evaluated.
"The Open-Source ITU Gaze Tracker"
Abstract:
Gaze tracking offers them the possibility of interacting with a computer by just using eye movements, thereby making users more independent. However, some people (for example users with a severe disability) are excluded from access to gaze interaction due to the high prices of commercial systems (above 10.000€). Gaze tracking systems built from low-cost and off-the-shelf components have the potential of facilitating access to the technology and bring prices down.
The ITU Gaze Tracker is an off-the-shelf system that uses an inexpensive web cam or a video camera to track the user’s eye. It is free and open-source, offering users the possibility of trying out gaze interaction technology for a cost as low as 20€, and to adapt and extend the software to suit specific needs.
In this talk we will present the open-source ITU Gaze Tracker and show the different scenarios in which the system has been used and evaluated.
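For readers wondering what tracking the eye with "an inexpensive web cam" boils down to, here is a minimal, generic Python/OpenCV sketch of dark-pupil detection. It is not taken from the ITU Gaze Tracker source and only illustrates the basic principle; a usable tracker adds corneal-reflection detection, calibration to screen coordinates, and filtering on top. The threshold value is an assumption that depends on lighting.

```python
# Generic dark-pupil detection sketch (not ITU Gaze Tracker code):
# threshold the dark pupil region and take the centroid of the largest blob.
import cv2

def pupil_center(frame, threshold=40):
    """Return the (x, y) pixel centre of the largest dark blob, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)   # first available webcam
ok, frame = cap.read()
if ok:
    print("pupil estimate:", pupil_center(frame))
cap.release()
```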
Labels: eye tracker, gazetracker, ITU, low cost, open source
Monday, April 12, 2010
Digital avatars get human-like eye movements
William Steptoe of University College London got his research on using eye tracking to give digital avatars human-like eye movements covered in an article by New Scientist. It turns out that "on average, the participants were able to identify 88 per cent of truths correctly when the avatars had eye movement, but only 70 per cent without. Spotting lies was harder, but eye movement helped: 48 per cent accuracy compared with 39 per cent without. Steptoe will present the results at the 2010 Conference on Human Factors in Computing Systems in Atlanta, Georgia, next week."