Tuesday, August 17, 2010
How to build low cost eye tracking glasses for head mounted system (M. Kowalik, 2010)
Michał Kowalik of the Faculty of Computer Science and Information Technology at the West Pomeranian University of Technology in Szczecin, Poland, has put together a great DIY instruction for a head-mounted system using the ITU Gaze Tracker. The camera of choice is the Microsoft LifeCam VX-1000, modified by removing the casing and IR filter. In addition, three IR LEDs illuminate the eye, drawing power from the USB cable. This is then mounted on a pair of safety glasses, just as Jason Babcock & Jeff Pelz have done previously. Total cost of the hardware: less than €50. Neat. Thanks Michał.
Download instructions as PDF (8.1Mb)
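For those curious about the software side of such a rig, here's a minimal sketch (my own, not part of Kowalik's instructions) of the dark-pupil detection step that the IR illumination makes possible: with the camera's IR filter removed and the LEDs lighting the eye, the pupil shows up as the darkest blob in the frame, so a fixed threshold plus a centroid of the dark pixels is often enough for a first prototype.

```python
import numpy as np

def pupil_center(frame, threshold=40):
    """Estimate the pupil centre in a grayscale IR frame.

    Under IR illumination the pupil is the darkest region
    (dark-pupil technique). Returns (row, col) of the centroid
    of all pixels below `threshold`, or None if none qualify.
    """
    dark = frame < threshold          # boolean mask of candidate pupil pixels
    if not dark.any():
        return None
    rows, cols = np.nonzero(dark)
    return rows.mean(), cols.mean()

# Synthetic 100x100 frame: bright background, dark "pupil" centred at (30, 60).
frame = np.full((100, 100), 200, dtype=np.uint8)
frame[25:36, 55:66] = 10
print(pupil_center(frame))  # (30.0, 60.0)
```

A real system (like the ITU Gaze Tracker) adds ellipse fitting, glint detection, and calibration on top of this, but the basic idea is that simple.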
Labels:
eye tracker,
low cost,
open source
Monday, August 16, 2010
Call for Papers: ACM Transactions Special Issue on Eye Gaze
ACM Transactions on Interactive Intelligent Systems
Special Issue on Eye Gaze in Intelligent Human-Machine Interaction
Aims and Scope
Partly because of the increasing availability of nonintrusive and high-performance eye tracking devices, recent years have seen a growing interest in incorporating human eye gaze in intelligent user interfaces. Eye gaze has been used as a pointing mechanism in direct manipulation interfaces, for example, to assist users with "locked-in syndrome". It has also been used as a reflection of information needs in web search and as a basis for tailoring information presentation. Detection of joint attention as indicated by eye gaze has been used to facilitate computer-supported human-human communication. In conversational interfaces, eye gaze has been used to improve language understanding and intention recognition. On the output side, eye gaze has been incorporated into the multimodal behavior of embodied conversational agents. Recent work on human-robot interaction has explored eye gaze in incremental language processing, visual scene processing, and conversation engagement and grounding.
This special issue will report on state-of-the-art computational models, systems, and studies that concern eye gaze in intelligent and natural human-machine communication. The nonexhaustive list of topics below indicates the range of appropriate topics; in case of doubt, please contact the guest editors. Papers that focus mainly on eye tracking hardware and software as such will be relevant (only) if they make it clear how the advances reported open up new possibilities for the use of eye gaze in at least one of the ways listed above.
Topics
- Empirical studies of eye gaze in human-human communication that provide new insight into the role of eye gaze and suggest implications for the use of eye gaze in intelligent systems. Examples include new empirical findings concerning eye gaze in human language processing, in human-vision processing, and in conversation management.
- Algorithms and systems that incorporate eye gaze for human-computer interaction and human-robot interaction. Examples include gaze-based feedback to information systems; gaze-based attention modeling; exploiting gaze in automated language processing; and controlling the gaze behavior of embodied conversational agents or robots to enable grounding, turn-taking, and engagement.
- Applications that demonstrate the value of incorporating eye gaze in practical systems to enable intelligent human-machine communication.
Guest Editors
- Elisabeth André, University of Augsburg, Germany (contact: andre[at]informatik[dot]uni-augsburg.de)
- Joyce Chai, Michigan State University, USA
Important Dates
- By December 15th, 2010: Submission of manuscripts
- By March 23rd, 2011: Notification about decisions on initial submissions
- By June 23rd, 2011: Submission of revised manuscripts
- By August 25th, 2011: Notification about decisions on revised manuscripts
- By September 15th, 2011: Submission of manuscripts with final minor changes
- Starting October 2011: Publication of the special issue on the TiiS website and subsequently in the ACM Digital Library and as a printed issue
Labels:
hci
Tuesday, August 10, 2010
Eye control for PTZ cameras in video surveillance
Bartosz Kunka, a PhD student at the Gdańsk University of Technology, has employed a remote gaze-tracking system called Cyber-Eye to control PTZ cameras in video surveillance and video-conferencing systems. The video was prepared for the system's presentation at the SIGGRAPH 2010 Research Challenge in Los Angeles.
Labels:
attentive interface,
eye tracker,
hci,
navigation,
security,
zoom
Wednesday, August 4, 2010
EOG used to play Super Mario
Came across some fun work by Waterloo Labs that demos how to use a set of electrodes and a custom signal-processing board to estimate eye-movement gestures by measuring EOG. It means you'll have to glance at the ceiling or floor to issue commands (there is no gaze point-of-regard estimation). The good thing is that the technology doesn't suffer from the issues with light, optics, and sensors that often make video-based eye tracking and gaze point-of-regard estimation complex. The bad thing is that it requires custom hardware and the mounting of electrodes and wires, and the interaction style appears to involve looking away from what you are really interested in.
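The gesture idea itself is simple enough to sketch. The vertical EOG channel swings strongly positive or negative when you glance far up or down, and each swing can be mapped to one command. Here's a minimal illustration of that scheme; the thresholds and the hysteresis ("re-arm near baseline") are made-up values of my own, not Waterloo Labs' actual calibration.

```python
def eog_gestures(samples, up=200.0, down=-200.0):
    """Classify a vertical-EOG trace (arbitrary microvolt-like units)
    into discrete 'up'/'down' commands. A swing past a threshold fires
    one command; the detector then waits for the signal to return near
    baseline before it can fire again (simple hysteresis).
    """
    commands, armed = [], True
    for v in samples:
        if armed and v > up:
            commands.append("up")
            armed = False            # fired: wait for return to baseline
        elif armed and v < down:
            commands.append("down")
            armed = False
        elif abs(v) < 50.0:          # back near baseline: re-arm
            armed = True
    return commands

trace = [0, 10, 250, 300, 40, 0, -260, -240, 5, 0, 220, 0]
print(eog_gestures(trace))  # ['up', 'down', 'up']
```

Without the hysteresis, one long glance at the ceiling would fire a stream of "up" commands, which is exactly the kind of glitch custom boards like this spend their signal conditioning on.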
Labels:
game,
modalities
Monday, June 28, 2010
Video-games can be beneficial!
It appears video games can be beneficial for your eyes, despite what mother said. Came across this article in the British Daily Mail, found it inspiring, and believe it could be done even better with an interactive application using real-time gaze tracking input. Direct quote:
"A six-year-old boy who nearly went blind in one eye can now see again after he was told to play on a Nintendo games console. Ben Michaels suffered from amblyopia, or severe lazy eye syndrome in his right eye from the age of four. His vision had decreased gradually in one eye and without treatment his sight loss could have become permanent. His GP referred him to consultant Ken Nischal who prescribed the unusual daily therapy. Ben, from Billericay, Essex, spends two hours a day playing Mario Kart on a Nintendo DS with his twin Jake. Ben wears a patch over his good eye to make his lazy one work harder. The twins' mother, Maxine, 36, said that from being 'nearly blind' in the eye, Ben's vision had 'improved 250 per cent' in the first week. She said: 'When he started he could not identify our faces with his weak eye. Now he can read with it although he is still a way off where he ought to be. 'He was very cooperative with the patch, it had phenomenal effect and we’re very pleased.' Mr Nischal of Great Ormond Street Children's Hospital, said the therapy helped children with weak eyesight because computer games encourage repetitive eye movement, which trains the eye to focus correctly. 'A games console is something children can relate to. It allows us to deliver treatment quicker,' he said. 'What we don’t know is whether improvement is solely because of improved compliance, ie the child sticks with the patch more, or whether there is a physiological improvement from perceptual visual learning.' The consultant added that thousands of youngsters and adults could benefit from a similar treatment." (source)
Labels:
inspiration
Tuesday, June 15, 2010
Speech Dasher: Fast Writing using Speech and Gaze (K. Vertanen & D. MacKay, 2010)
A new version of the Dasher typing interface utilizes speech recognition, provided by the CMU PocketSphinx software, to double typing performance measured in words per minute: from a previous 20 WPM to 40 WPM, close to what a professional keyboard jockey may produce.
Abstract
Speech Dasher allows writing using a combination of speech and a zooming interface. Users first speak what they want to write and then they navigate through the space of recognition hypotheses to correct any errors. Speech Dasher’s model combines information from a speech recognizer, from the user, and from a letter-based language model. This allows fast writing of anything predicted by the recognizer while also providing seamless fallback to letter-by-letter spelling for words not in the recognizer’s predictions. In a formative user study, expert users wrote at 40 (corrected) words per minute. They did this despite a recognition word error rate of 22%. Furthermore, they did this using only speech and the direction of their gaze (obtained via an eye tracker).
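The "seamless fallback" in the abstract is the interesting part: most of the probability mass for the next letter follows the recognizer's hypotheses, but a letter-based language model always keeps every letter reachable. Here's a toy sketch of that blending idea; the function, the data shapes, and the blend weight `alpha` are my own illustration, not the paper's actual formulation.

```python
from collections import defaultdict

def next_letter_probs(prefix, hypotheses, letter_lm, alpha=0.9):
    """Blend recognizer evidence with a letter language model.

    `hypotheses` maps candidate transcripts to recognizer confidences;
    `letter_lm` maps letters to base probabilities. Letters that
    continue a matching hypothesis get most of the mass (alpha),
    everything else falls back to the letter model (1 - alpha),
    so out-of-hypothesis words can still be spelled letter by letter.
    """
    speech, total = defaultdict(float), 0.0
    for text, conf in hypotheses.items():
        if text.startswith(prefix) and len(text) > len(prefix):
            speech[text[len(prefix)]] += conf   # next letter after the prefix
            total += conf
    blended = {}
    for letter, base in letter_lm.items():
        recog = speech[letter] / total if total else 0.0
        blended[letter] = alpha * recog + (1 - alpha) * base
    return blended

hyps = {"hello world": 0.7, "hello word": 0.2, "yellow world": 0.1}
lm = {"w": 0.05, "y": 0.02, "h": 0.1}
probs = next_letter_probs("hello ", hyps, lm)
print(max(probs, key=probs.get))  # 'w': both top hypotheses continue with it
```

In Dasher's zooming interface these probabilities become box sizes, so the likely continuations are literally bigger targets for your gaze.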
- Speech Dasher: Fast Writing using Speech and Gaze
Keith Vertanen and David J.C. MacKay. CHI '10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, To appear. [Abstract+videos, PDF, BibTeX]
Labels:
assistive technology,
interface design,
typing
Wednesday, May 26, 2010
Abstracts from SWAET 2010
The booklet containing the abstracts for the Scandinavian Workshop on Applied Eye Tracking (SWAET) is now available for download (55 pages, about 1 MB). The abstracts span a wide range from gaze interaction to behavior and perception. The short one-page format makes it easy to venture into a multitude of domains and acts as a nice little starting point for digging deeper. Shame I couldn't attend, maybe next year. Kudos for making this booklet available.
Title | Authors |
Eye movements during mental imagery are not perceptual re-enactments | R. Johansson, J. Holsanova, K. Holmqvist |
Practice eliminates "looking at nothing" | A. Scholz, K. Mehlhorn, J.F. Krems |
Learning Perceptual Skills for Medical Diagnosis via Eye Movement Modeling Examples on Patient Video Cases | H. Jarodzka, T. Balslev, K. Holmqvist, K. Scheiter, M. Nyström, P. Gerjets, B. Eika |
Objective, subjective, and commercial information: The impact of presentation format on the visual inspection and selection of Web search results | Y. Kammerer, P. Gerjets |
Eye Movements and levels of attention: A stimulus driven approach | F.B. Mulvey, K. Holmqvist, J.P. Hansen |
Player's gaze in a collaborative Tetris game | P. Jermann, M-A. Nüssli, W. Li |
Naming associated objects: Evidence for parallel processing | L. Mortensen , A.S. Meyer |
Reading Text Messages - An Eye-Tracking Study on the Influence of Shortening Strategies on Reading Comprehension | V. Heyer, H. Hopp |
Eye movement measures to study the online comprehension of long (illustrated) texts | J. Hyönä, J.K. Kaakinen |
Self-directed Learning Skills in Air-traffic Control; A Cued Retrospective Reporting Study | L.W. van Meeuwen, S. Brand-Gruwel, J.J. G. van Merriënboer, J. J.P.R. de Bock, P.A. Kirschner |
Drivers' characteristic sequences of eye and head movements in intersections | A. Bjelkemyr, K. Smith |
Comparing the value of different cues when using the retrospective think aloud method in web usability testing with eye tracking | A. Olsen |
Gaze behavior and instruction sensitivity of Children with Autism Spectrum Disorders when viewing pictures of social scenes | B. Rudsengen, F. Volden |
Impact of cognitive workload on gaze-including interaction | S. Trösterer, J. Dzaack |
Interaction with mainstream interfaces using gaze alone | H. Skovsgaard, J. P. Hansen, J.C. Mateo |
Stereoscopic Eye Movement Tracking: Challenges and Opportunities in 3D | G. Öqvist Seimyr, A. Appelholm, H. Johansson, R. Brautaset |
Sampling frequency – what speed do I need? | R. Andersson, M. Nyström, K. Holmqvist |
Effect of head-distance on raw gaze velocity | M-A Nüssli, P. Jermann |
Quantifying and modelling factors that influence calibration and data quality | M. Nyström, R. Andersson, J. van de Weijer |
Labels:
HumLab,
Lund Universitet,
SWAET
Monday, May 24, 2010
EyePhone - Mobile gaze interaction from Dartmouth College
From Emiliano Miluzzo and the group at Sensorlab, part of the Computer Science department at Dartmouth College, comes EyePhone, which enables rudimentary gaze-based interaction for tablet computers. Contemporary devices often rely on touch-based interaction, which creates a problem with occlusion, where the hands cover large parts of the display. EyePhone could help to alleviate this issue. The prototype system demonstrated offers enough accuracy for an interface based on a 3x3 grid layout, but with better hardware and algorithms there is little reason why this couldn't be improved. However, a major issue with a mobile system is precisely its mobility: not only must the user's head movements be compensated for, but also movements of the camera in essentially all degrees of freedom. Not an easy thing to solve, but it's not a question of "if" but "when". Perhaps there is something that could be done using the angular position sensors many mobile devices already have embedded. This is an excellent first step with thrilling potential. Additional information is available in the M.I.T. Technology Review article.
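To make the 3x3 grid idea concrete, here's a small sketch (purely my own illustration, not the authors' code) of how a coarse gaze estimate in screen pixels maps to one of nine grid cells; with only nine targets, even a noisy eye-position estimate lands in the right cell most of the time.

```python
def gaze_to_grid_cell(x, y, width, height, rows=3, cols=3):
    """Map an on-screen gaze estimate (pixels) to a coarse grid cell.

    Returns (row, col) with (0, 0) at the top-left. The min() clamps
    the extreme edge pixel so it stays inside the last cell.
    """
    col = min(int(x * cols / width), cols - 1)
    row = min(int(y * rows / height), rows - 1)
    return row, col

# A 480x800 portrait display: gaze near the top-left vs. dead centre.
print(gaze_to_grid_cell(20, 30, 480, 800))    # (0, 0)
print(gaze_to_grid_cell(240, 400, 480, 800))  # (1, 1)
```

The coarser the grid, the more robust the interaction gets against camera shake and head movement, which is exactly the trade-off the prototype's accuracy forces.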
Abstract
As smartphones evolve researchers are studying new techniques to ease the human-mobile interaction. We propose EyePhone, a novel "hands free" interfacing system capable of driving mobile applications/functions using only the user's eyes movement and actions (e.g., wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to: i) track the eye and infer its position on the mobile phone display as a user views a particular application; and ii) detect eye blinks that emulate mouse clicks to activate the target application under view. We present a prototype implementation of EyePhone on a Nokia N810, which is capable of tracking the position of the eye on the display, mapping this position to a function that is activated by a wink. At no time does the user have to physically touch the phone display.
- Emiliano Miluzzo, Tianyu Wang, Andrew T. Campbell, EyePhone: Activating Mobile Phones With Your Eyes. To appear in Proc. of The Second ACM SIGCOMM Workshop on Networking, Systems, and Applications on Mobile Handhelds (MobiHeld'10), New Delhi, India, August 30, 2010. [pdf] [video]
Labels:
eye tracker,
hci,
inspiration,
mobility,
prototype
Thursday, May 20, 2010
Magnetic Eye Tracking Device from Arizona State University
A group of students at Arizona State University have revisited the scleral search coil to develop a new low-cost Magnetic Eye Tracking Device (METD). The entrepreneurs aim to make this technology available to the public at an affordable $4,000 and are primarily targeting disabled users. More information is available at ASU News.
If you're new to eye tracking, it should be noted that the reporter's claim that common video-based systems use infrared lasers is just silly. They essentially use light sources working in the IR spectrum (similar to the LED in your remote control).
Labels:
eye tracker,
low cost