Wednesday, May 26, 2010

Abstracts from SWAET 2010


The booklet containing the abstracts for the Scandinavian Workshop on Applied Eye Tracking (SWAET) is now available for download (55 pages, about 1 MB). The abstracts span a wide range of topics, from gaze interaction to behavior and perception. The short one-page format makes it easy to venture into a multitude of domains and serves as a nice starting point for digging deeper. A shame I couldn't attend; maybe next year. Kudos for making this booklet available.




  • Eye movements during mental imagery are not perceptual re-enactments (R. Johansson, J. Holsanova, K. Holmqvist)
  • Practice eliminates "looking at nothing" (A. Scholz, K. Mehlhorn, J.F. Krems)
  • Learning Perceptual Skills for Medical Diagnosis via Eye Movement Modeling Examples on Patient Video Cases (H. Jarodzka, T. Balslev, K. Holmqvist, K. Scheiter, M. Nyström, P. Gerjets, B. Eika)
  • Objective, subjective, and commercial information: The impact of presentation format on the visual inspection and selection of Web search results (Y. Kammerer, P. Gerjets)
  • Eye Movements and levels of attention: A stimulus driven approach (F.B. Mulvey, K. Holmqvist, J.P. Hansen)
  • Player's gaze in a collaborative Tetris game (P. Jermann, M-A. Nüssli, W. Li)
  • Naming associated objects: Evidence for parallel processing (L. Mortensen, A.S. Meyer)
  • Reading Text Messages - An Eye-Tracking Study on the Influence of Shortening Strategies on Reading Comprehension (V. Heyer, H. Hopp)
  • Eye movement measures to study the online comprehension of long (illustrated) texts (J. Hyönä, J.K. Kaakinen)
  • Self-directed Learning Skills in Air-traffic Control: A Cued Retrospective Reporting Study (L.W. van Meeuwen, S. Brand-Gruwel, J.J.G. van Merriënboer, J.P.R. de Bock, P.A. Kirschner)
  • Drivers' characteristic sequences of eye and head movements in intersections (A. Bjelkemyr, K. Smith)
  • Comparing the value of different cues when using the retrospective think aloud method in web usability testing with eye tracking (A. Olsen)
  • Gaze behavior and instruction sensitivity of Children with Autism Spectrum Disorders when viewing pictures of social scenes (B. Rudsengen, F. Volden)
  • Impact of cognitive workload on gaze-including interaction (S. Trösterer, J. Dzaack)
  • Interaction with mainstream interfaces using gaze alone (H. Skovsgaard, J.P. Hansen, J.C. Mateo)
  • Stereoscopic Eye Movement Tracking: Challenges and Opportunities in 3D (G. Öqvist Seimyr, A. Appelholm, H. Johansson, R. Brautaset)
  • Sampling frequency – what speed do I need? (R. Andersson, M. Nyström, K. Holmqvist)
  • Effect of head-distance on raw gaze velocity (M-A. Nüssli, P. Jermann)
  • Quantifying and modelling factors that influence calibration and data quality (M. Nyström, R. Andersson, J. van de Weijer)

Monday, May 24, 2010

EyePhone - Mobile gaze interaction from Dartmouth College

From Emiliano Miluzzo and the group at Sensorlab, part of the Computer Science department at Dartmouth College, comes EyePhone, which enables rudimentary gaze-based interaction on mobile devices. Contemporary devices typically rely on touch-based interaction, which creates an occlusion problem where the hands cover large parts of the display. EyePhone could help alleviate this issue. The prototype system offers enough accuracy for an interface based on a 3x3 grid layout, but with better hardware and algorithms there is little reason why this couldn't improve. A major issue with a mobile system, however, is the mobility of both the user and the hardware: in practice this means that not only the user's head movements have to be compensated for but also movements of the camera in essentially all degrees of freedom. Not an easy thing to solve, but it's not a question of "if" but "when". Perhaps something could be done using the angular position sensors many mobile devices already have embedded, as sketched below. This is an excellent first step with thrilling potential. Additional information is available in the MIT Technology Review article.
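As a thought experiment, here is a minimal first-order sketch of what such sensor-assisted compensation could look like, assuming an orientation sensor that reports yaw and pitch in radians and a rough eye-to-screen distance. Nothing here comes from the EyePhone implementation; all names, constants, and sign conventions are placeholders.

```python
# Hypothetical first-order sketch (not from the EyePhone paper): if the device
# rotates while the eye stays still, the gaze point on the display shifts by
# roughly the angular change times the number of pixels per radian, which
# depends on the viewing distance. All names and numbers are assumptions.

SCREEN_W, SCREEN_H = 800, 480        # Nokia N810 display resolution, pixels
VIEW_DISTANCE_MM = 300.0             # assumed eye-to-screen distance
PIXEL_PITCH_MM = 0.09                # assumed physical size of one pixel

# Small-angle approximation: pixels traversed per radian of device rotation.
PX_PER_RAD = VIEW_DISTANCE_MM / PIXEL_PITCH_MM

def compensate_gaze(gaze_px, prev_orient, curr_orient):
    """Shift a raw gaze estimate (x, y in pixels) by the device rotation
    reported by the orientation sensor between two frames."""
    d_yaw = curr_orient["yaw"] - prev_orient["yaw"]        # radians
    d_pitch = curr_orient["pitch"] - prev_orient["pitch"]  # radians
    x, y = gaze_px
    x -= d_yaw * PX_PER_RAD    # sign convention is illustrative only
    y += d_pitch * PX_PER_RAD
    # Clamp to the display so downstream grid mapping stays in range.
    return (min(max(x, 0), SCREEN_W - 1), min(max(y, 0), SCREEN_H - 1))

# Example: a 0.01 rad (~0.6 degree) yaw jitter moves the point by ~33 px.
print(compensate_gaze((400, 240), {"yaw": 0.0, "pitch": 0.0},
                      {"yaw": 0.01, "pitch": 0.0}))
```

Of course this ignores translation of the camera and rolling of the device; a full solution would need all six degrees of freedom, which is exactly why the problem is hard.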



Abstract
As smartphones evolve researchers are studying new techniques to ease the human-mobile interaction. We propose EyePhone, a novel "hands free" interfacing system capable of driving mobile applications/functions using only the user's eye movements and actions (e.g., wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to: i) track the eye and infer its position on the mobile phone display as a user views a particular application; and ii) detect eye blinks that emulate mouse clicks to activate the target application under view. We present a prototype implementation of EyePhone on a Nokia N810, which is capable of tracking the position of the eye on the display, mapping this position to a function that is activated by a wink. At no time does the user have to physically touch the phone display.
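For readers who want a concrete picture of the interaction logic the abstract outlines, below is a hedged sketch of mapping a gaze estimate to a 3x3 grid and treating a detected blink as a click. This is not the authors' code; the display resolution, function names, and blink detector are assumptions.

```python
# Rough sketch (not the authors' code) of the interaction logic described in
# the abstract: quantise the estimated eye position into a 3x3 grid of
# on-screen targets and treat a detected blink/wink as a click on that cell.

SCREEN_W, SCREEN_H = 800, 480   # Nokia N810 display, for illustration
GRID_COLS, GRID_ROWS = 3, 3     # the 3x3 layout used by the prototype

def gaze_to_cell(x, y):
    """Map a gaze coordinate in pixels to a (col, row) cell of the grid."""
    col = min(int(x * GRID_COLS / SCREEN_W), GRID_COLS - 1)
    row = min(int(y * GRID_ROWS / SCREEN_H), GRID_ROWS - 1)
    return col, row

def on_frame(gaze_px, blink_detected, launch_app):
    """One step of the interaction loop: a blink activates the application
    assigned to the grid cell the user is currently looking at."""
    cell = gaze_to_cell(*gaze_px)
    if blink_detected:
        launch_app(cell)
    return cell

# Example usage with a placeholder launcher.
cell = on_frame((650, 100), True, lambda c: print("launching app at", c))
```

The coarse 3x3 quantisation is what makes the limited accuracy of a front-facing phone camera workable: the estimate only has to land in the right third of the screen in each direction.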


Figures. Camera images, eye regions of interest, and reported accuracies.

  • Emiliano Miluzzo, Tianyu Wang, Andrew T. Campbell, EyePhone: Activating Mobile Phones With Your Eyes. To appear in Proc. of The Second ACM SIGCOMM Workshop on Networking, Systems, and Applications on Mobile Handhelds (MobiHeld'10), New Delhi, India, August 30, 2010. [pdf] [video]

Thursday, May 20, 2010

Magnetic Eye Tracking Device from Arizona State University

A group of students at Arizona State University has revisited the scleral search coil to develop a new low-cost Magnetic Eye Tracking Device (METD). The entrepreneurs aim to make the technology available to the public at an affordable $4,000 and are primarily targeting people with disabilities. More information is available at ASU News.



If you're new to eye tracking, it should be noted that the reporter's claim that common video-based systems use infrared lasers is just silly. These systems simply use light sources operating in the IR spectrum (similar to the LED in your remote control).