Tuesday, August 17, 2010
How to build low cost eye tracking glasses for head mounted system (M. Kowalik, 2010)
Michał Kowalik of the Faculty of Computer Science and Information Technology at the West Pomeranian University of Technology in Szczecin, Poland, has put together a great DIY instruction for a head-mounted system using the ITU Gaze Tracker. The camera of choice is the Microsoft LifeCam VX-1000, which has been modified by removing the casing and the IR filter. In addition, three IR LEDs illuminate the eye using power from the USB cable. This is then mounted on a pair of safety glasses, just as Jason Babcock & Jeff Pelz have done previously. Total cost of the hardware is less than €50. Neat. Thanks, Michał.
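For readers who want to do something with the gaze data once the glasses are built: the ITU Gaze Tracker can stream gaze coordinates to client applications over the network. The sketch below is a minimal, hedged example of reading such a stream in Python; the UDP port (6666) and the assumption that each datagram carries whitespace-separated numeric fields with gaze x/y first are illustrative guesses rather than documented specifics, so check your own tracker's network settings.

```python
# Minimal sketch: reading a gaze-coordinate stream from the ITU Gaze Tracker.
# ASSUMPTIONS (not from the post): the tracker streams over UDP to port 6666
# on this machine, and each datagram contains whitespace-separated fields in
# which the first two numeric values are the gaze x and y in screen pixels.
# Adjust the port and parsing to match the tracker's actual network settings.

import socket

UDP_PORT = 6666  # assumed; use whatever the tracker's network panel shows


def gaze_points(port=UDP_PORT):
    """Yield (x, y) gaze coordinates parsed from incoming UDP datagrams."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    try:
        while True:
            data, _addr = sock.recvfrom(4096)
            numbers = []
            for field in data.decode("ascii", errors="ignore").split():
                try:
                    numbers.append(float(field))
                except ValueError:
                    continue  # skip non-numeric tokens such as message tags
            if len(numbers) >= 2:
                yield numbers[0], numbers[1]
    finally:
        sock.close()


if __name__ == "__main__":
    for x, y in gaze_points():
        print(f"gaze at ({x:.0f}, {y:.0f})")
```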
Download instructions as PDF (8.1 MB)
2 comments
Labels: eye tracker, low cost, open source
Monday, August 16, 2010
Call for Papers: ACM Transactions Special Issue on Eye Gaze
ACM Transactions on Interactive Intelligent Systems
Special Issue on Eye Gaze in Intelligent Human-Machine Interaction
Aims and Scope
Partly because of the increasing availability of nonintrusive and high-performance eye tracking devices, recent years have seen a growing interest in incorporating human eye gaze in intelligent user interfaces. Eye gaze has been used as a pointing mechanism in direct manipulation interfaces, for example, to assist users with “locked-in syndrome”. It has also been used as a reflection of information needs in web search and as a basis for tailoring information presentation. Detection of joint attention as indicated by eye gaze has been used to facilitate computer-supported human-human communication. In conversational interfaces, eye gaze has been used to improve language understanding and intention recognition. On the output side, eye gaze has been incorporated into the multimodal behavior of embodied conversational agents. Recent work on human-robot interaction has explored eye gaze in incremental language processing, visual scene processing, and conversation engagement and grounding.

This special issue will report on state-of-the-art computational models, systems, and studies that concern eye gaze in intelligent and natural human-machine communication. The nonexhaustive list of topics below indicates the range of appropriate topics; in case of doubt, please contact the guest editors. Papers that focus mainly on eye tracking hardware and software as such will be relevant (only) if they make it clear how the advances reported open up new possibilities for the use of eye gaze in at least one of the ways listed above.
Topics
- Empirical studies of eye gaze in human-human communication that provide new insight into the role of eye gaze and suggest implications for the use of eye gaze in intelligent systems. Examples include new empirical findings concerning eye gaze in human language processing, in human-vision processing, and in conversation management.
- Algorithms and systems that incorporate eye gaze for human-computer interaction and human-robot interaction. Examples include gaze-based feedback to information systems; gaze-based attention modeling; exploiting gaze in automated language processing; and controlling the gaze behavior of embodied conversational agents or robots to enable grounding, turn-taking, and engagement.
- Applications that demonstrate the value of incorporating eye gaze in practical systems to enable intelligent human-machine communication.
Guest Editors
- Elisabeth André, University of Augsburg, Germany (contact: andre[at]informatik[dot]uni-augsburg.de)
- Joyce Chai, Michigan State University, USA
Important Dates
- By December 15th, 2010: Submission of manuscripts
- By March 23rd, 2011: Notification about decisions on initial submissions
- By June 23rd, 2011: Submission of revised manuscripts
- By August 25th, 2011: Notification about decisions on revised manuscripts
- By September 15th, 2011: Submission of manuscripts with final minor changes
- Starting October 2011: Publication of the special issue on the TiiS website and subsequently in the ACM Digital Library and as a printed issue
0 comments
Labels: hci
Tuesday, August 10, 2010
Eye control for PTZ cameras in video surveillance
Bartosz Kunka, a PhD student at the Gdańsk University of Technology, has employed a remote gaze-tracking system called Cyber-Eye to control PTZ cameras in video surveillance and video-conference systems. The video was prepared for the system's presentation at the Research Challenge at SIGGRAPH 2010 in Los Angeles.
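The post doesn't describe how Cyber-Eye maps gaze onto camera motion, but the basic idea of gaze-driven PTZ control can be sketched generically: normalize the gaze point on the video frame, keep a dead zone in the centre, and translate off-centre gaze into relative pan/tilt steps. The sketch below is only an illustration of that idea under assumed parameters, not the Cyber-Eye implementation; send_ptz_command is a hypothetical placeholder for whatever protocol the actual camera speaks (ONVIF, VISCA, a vendor HTTP API, ...).

```python
# Minimal sketch of gaze-driven PTZ control (NOT the Cyber-Eye implementation).
# Gaze position on the video frame is mapped to relative pan/tilt steps, with
# a central dead zone so the camera holds still while the operator inspects
# the middle of the image. All constants are assumptions for illustration.

DEAD_ZONE = 0.15      # fraction of half-frame treated as "centre"
MAX_STEP_DEG = 4.0    # largest pan/tilt step per update, in degrees


def gaze_to_ptz_step(gx, gy, frame_w, frame_h):
    """Convert a gaze point in frame pixels to (pan_deg, tilt_deg) steps."""
    # Normalize to [-1, 1] with (0, 0) at the frame centre.
    nx = (gx - frame_w / 2.0) / (frame_w / 2.0)
    ny = (gy - frame_h / 2.0) / (frame_h / 2.0)

    def step(n):
        if abs(n) < DEAD_ZONE:
            return 0.0
        # Scale the remaining off-centre range linearly up to MAX_STEP_DEG.
        scaled = (abs(n) - DEAD_ZONE) / (1.0 - DEAD_ZONE)
        return MAX_STEP_DEG * scaled * (1 if n > 0 else -1)

    pan = step(nx)     # looking right -> pan right
    tilt = -step(ny)   # looking up (smaller y) -> tilt up
    return pan, tilt


def send_ptz_command(pan_deg, tilt_deg):
    """Hypothetical stand-in for the camera's actual control interface."""
    print(f"relative move: pan {pan_deg:+.1f} deg, tilt {tilt_deg:+.1f} deg")


if __name__ == "__main__":
    # Example: operator looks near the right edge of a 1280x720 frame.
    pan, tilt = gaze_to_ptz_step(1200, 360, 1280, 720)
    send_ptz_command(pan, tilt)
```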
0 comments
Labels: attentive interface, eye tracker, hci, navigation, security, zoom
Wednesday, August 4, 2010
EOG used to play Super Mario
Came across some fun work by Waterloo Labs that demos how to use a bunch of electrodes and a custom processing board to do signal analysis and estimate eye movement gestures through measuring EOG. It means you'll have to glance at the ceiling or floor to issue commands (there is no gaze point-of-regard estimation). The good thing is that the technology doesn't suffer from the issues with light, optics and sensors that often make video-based eye tracking and gaze point-of-regard estimation complex. The bad thing is that it requires custom hardware and the mounting of electrodes and wires; besides that, the interaction style appears to involve looking away from what you are really interested in.
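Waterloo Labs use their own hardware and signal processing, but the core trick of turning a vertical EOG channel into discrete "glance up / glance down" commands can be illustrated with a simple baseline-plus-threshold scheme. The sketch below is an assumption-laden toy version (threshold, drift tracking and the synthetic trace are all made up for illustration), not their implementation.

```python
# Toy sketch of turning a vertical EOG channel into discrete glance commands,
# roughly the kind of gesture detection the post describes. This is NOT
# Waterloo Labs' implementation; the threshold, baseline tracking and the
# synthetic signal below are assumptions for illustration only.

THRESHOLD_UV = 150.0   # deviation from baseline (microvolts) that counts as a glance
BASELINE_ALPHA = 0.01  # slow exponential baseline tracker to absorb electrode drift


def detect_glances(samples):
    """Yield 'UP' or 'DOWN' events from a stream of vertical-EOG samples (uV)."""
    baseline = None
    in_gesture = False
    for s in samples:
        if baseline is None:
            baseline = s
            continue
        deviation = s - baseline
        if not in_gesture and abs(deviation) > THRESHOLD_UV:
            in_gesture = True
            yield "UP" if deviation > 0 else "DOWN"
        elif in_gesture and abs(deviation) < THRESHOLD_UV / 2:
            in_gesture = False  # hysteresis: wait for the eye to return to centre
        # Only track drift while the eye is roughly centred.
        if not in_gesture:
            baseline += BASELINE_ALPHA * (s - baseline)


if __name__ == "__main__":
    # Synthetic trace: rest, glance up, rest, glance down, rest.
    trace = [0.0] * 50 + [300.0] * 20 + [0.0] * 50 + [-300.0] * 20 + [0.0] * 50
    for event in detect_glances(trace):
        print("glance:", event)  # e.g. map UP -> jump, DOWN -> duck in the game
```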
0 comments
Labels: game, modalities