Friday, July 1, 2011
Eyetrax webcam eye tracker from Carnegie Mellon
"Eyetrax is dynamic eye tracking software that uses a simple stationary web camera to detect eye movement. It can be used as a motionless computer interface and is especially useful when working with ALS patients. Additionally, the non-obtrusive nature of the program allows it to work perfectly to discretely generate hotspot maps for marketing purposes". The system is developed by Joseph Fernandez, Skylar Roebuck and Jonathon Smereka and was demonstrated at the Multimedia Computing Demos on May 3rd at Carnegie Mellon.
Utechzone demos
Recently, the Taiwanese company Utechzone demonstrated a little game at Computex 2011 in Taipei.
Utechzone also demonstrated a driver fatigue detection system housed in a smaller form factor. This system tracks the eye (open/closed) but doesn't perform gaze estimation. The video also shows the underlying gaze tracking system used in their Spring system, which appears to have some issues with glasses.
Fast forward to 1 minute in
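Open/closed detection of this kind is commonly built on an eye aspect ratio (EAR) measure over a handful of eye landmarks. The sketch below shows the idea with hand-picked example points; this is a generic technique and my own illustration, not Utechzone's implementation.

```python
# Eye aspect ratio (EAR): a common open/closed measure over the
# standard 6-point eye landmark model. Illustrative sketch only.
import math

def ear(p):
    """p: six (x, y) eye landmarks, ordered: left corner, two top
    points, right corner, two bottom points."""
    def d(a, b):
        return math.dist(a, b)
    # Ratio of vertical openings to horizontal width; drops toward
    # zero as the eye closes.
    return (d(p[1], p[5]) + d(p[2], p[4])) / (2.0 * d(p[0], p[3]))

open_eye = [(0, 0), (2, -2), (4, -2), (6, 0), (4, 2), (2, 2)]
closed_eye = [(0, 0), (2, -0.3), (4, -0.3), (6, 0), (4, 0.3), (2, 0.3)]
print(ear(open_eye))    # ~0.67 -> eye open
print(ear(closed_eye))  # ~0.10 -> eye closed
```

Fatigue systems typically threshold this ratio over time (e.g. the PERCLOS measure, the proportion of time the eyes are closed) rather than on single frames.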
Wednesday, June 29, 2011
Mobile gaze-based screen interaction in 3D environments (D. Mardanbegi, 2011)
Diako Mardanbegi presents a method that enables the user to interact with any planar digital display in a 3D environment using a head-mounted eye tracker. He also presents an effective method for identifying the screens in the user's field of view, which can be applied in a general scenario where multiple users interact with multiple screens. Diako's PhD project at the IT University of Copenhagen concerns mobile gaze-based interaction.
Related publication:
- Mardanbegi D., Hansen D. W. (2011) Mobile gaze-based screen interaction in 3D environments. Novel Gaze-Controlled Applications (NGCA) 2011, Karlskrona, Sweden
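A typical building block for this kind of scene-camera-to-screen mapping is a homography from the screen's four detected corners to screen coordinates. The sketch below illustrates that step with made-up corner values; it is my illustration of the general technique, not code from the paper.

```python
# Mapping a gaze point from head-mounted scene-camera coordinates to
# screen coordinates via a homography. Corner values are illustrative.
import numpy as np
import cv2

# Screen corners as detected in the scene camera image (pixels),
# in a fixed order: top-left, top-right, bottom-right, bottom-left.
corners_cam = np.float32([[220, 140], [420, 150], [410, 300], [210, 290]])
# The same corners in screen coordinates (here a 1920x1080 display).
corners_scr = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])

H, _ = cv2.findHomography(corners_cam, corners_scr)

gaze_cam = np.float32([[[315, 220]]])        # gaze in camera image
gaze_scr = cv2.perspectiveTransform(gaze_cam, H)
print(gaze_scr)                              # gaze in screen pixels
```

In practice the corners have to be found automatically (e.g. via markers or on-screen patterns), which is also what makes identifying *which* screen the user is looking at tractable.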
Monday, June 27, 2011
Setscan EyeLock - Law enforcement training system
Setscan, a Canadian supplier of training equipment for law enforcement and the military, has partnered with Arrington Research to develop a binocular head-mounted system with associated software called EyeLock. The system aims at evaluating and optimizing officers' allocation of visual attention. Looking at the right thing is obviously important when milliseconds count and guns are drawn. The eye tracking system is the same as those used in natural-scene perception research, but the market adaptation and focus on the needs of a specific domain is interesting.
UCSF using eye tracking to detect early stages of neurodegeneration
The Sabes Lab at the University of California, San Francisco is using high-speed eye tracking systems to study eye movements as a tool for detecting neurodegenerative diseases. The data collected includes response time, fixation accuracy and saccade velocity, important parameters that could identify approaching or existing neurodegenerative conditions such as Alzheimer's. This area holds great market potential and is feasible in the near future, as remote systems come closer to meeting the requirements for tracker speed and accuracy.
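To make the metrics concrete, here is a small sketch of how saccade peak velocity might be computed from sampled gaze positions. The sample rate, the synthetic saccade, and the 30 deg/s velocity threshold are all illustrative assumptions on my part, not details from the UCSF study.

```python
# Saccade peak velocity from sampled gaze angles -- illustrative sketch.
import numpy as np

fs = 1000.0                      # samples/s, a high-speed tracker
t = np.arange(0, 0.2, 1 / fs)
# Synthetic 10-degree saccade: a smooth position step around t = 0.1 s.
pos = 10.0 / (1.0 + np.exp(-(t - 0.1) * 200))   # degrees

vel = np.gradient(pos, 1 / fs)   # deg/s
saccade = vel > 30.0             # a common velocity threshold
print("peak velocity: %.0f deg/s" % vel.max())
print("saccade duration: %.0f ms" % (saccade.sum() / fs * 1000))
```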
Tuesday, June 21, 2011
The EyeHarp: An Eye Tracking Based Musical Instrument
The main goal of Zacharias Vamvakousis' EyeHarp project is to allow people with paralysis resulting from Amyotrophic Lateral Sclerosis to play music using only their eyes. Zacharias was inspired by the EyeWriter open source initiative: "...a low-cost eye-tracking apparatus & custom software that allows graffiti writers and artists with paralysis resulting from Amyotrophic lateral sclerosis to draw using only their eyes". He spent only 50 euros to build his eye tracker, using a modified Sony PS3 Eye camera. The application is implemented in openFrameworks v0.6.
Alternatively, the instrument can be controlled with the mouse pointer (the MouseHarp version), in which case the free Camera Mouse software can be used to play it with head movements. Any technology that can drive the mouse pointer can be used to control the instrument, which makes the MouseHarp a suitable instrument for many people with physical disabilities. The MouseHarp version is completely independent of the EyeWriter project; combining the MouseHarp source with the EyeWriter source yields the EyeHarp, a low-cost gaze-controlled musical instrument. Both versions are free and open source. A toy sketch of the dwell-based selection such instruments typically use follows below.
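As a toy illustration of dwell-based selection, here is a short sketch that maps a pointer position to a note and triggers it after the pointer rests on a key for a moment. The note layout, dwell time, and screen width are assumptions of mine, not taken from the EyeHarp source.

```python
# Dwell-based note selection for a pointer-driven instrument.
# Illustrative sketch only -- not the EyeHarp implementation.

NOTES = ["C4", "D4", "E4", "G4", "A4", "C5"]  # a pentatonic layout
DWELL = 0.5  # seconds the pointer must rest on a key to trigger it

def key_at(x, width=1200):
    """Divide the screen width into equal vertical key regions."""
    return NOTES[min(int(x / width * len(NOTES)), len(NOTES) - 1)]

def run(pointer_samples):
    """pointer_samples: iterable of (timestamp, x) pointer readings."""
    current, since = None, None
    for ts, x in pointer_samples:
        note = key_at(x)
        if note != current:
            current, since = note, ts      # entered a new key
        elif ts - since >= DWELL:
            print("play", note)            # hook up a synth here
            since = ts                     # re-arm so held gaze repeats

run([(0.0, 100), (0.3, 110), (0.6, 105), (1.2, 700), (1.8, 710)])
```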
The EyeHarp project is part of Zacharias' master's thesis in Sound and Music Computing at UPF, Barcelona. His supervisor is Rafael Ramirez.
A paper on the application has been published:
- Vamvakousis Z., Ramirez R. (2011) The EyeHarp: An Eye-Tracking-based Musical Instrument. SMC Conference 2011, Padova, Italy (PDF)
Tuesday, June 7, 2011
Grinbath's EyeGuide
Texas-based Grinbath recently announced the EyeGuide head-mounted tracker. Its main competitive advantage is its low cost: $1,495, with academic discounts available ($1,179). The device captures eye images using a wireless camera, running on three AAA batteries, and streams them to a computer for processing. The package includes basic software for analysis and visualization. See the whitepaper for more information.
Labels: eye tracker, head mounted
Monday, June 6, 2011
Proceedings from Novel Gaze-Controlled Applications 2011 online
The proceedings from the Novel Gaze-Controlled Applications 2011 conference are now available online. The conference, which took place at the Blekinge Institute of Technology in Sweden on May 26-27, presented 11 talks covering a wide range of topics, from gaming and gaze interaction to eye tracking solutions. Unfortunately I was unable to attend, but luckily I have a couple of days of interesting reading ahead. Kudos to Veronica Sundstedt and Charlotte Sennersten for organizing the event.
- Hyakunin-Eyesshu: a tabletop Hyakunin-Isshu game with computer opponent by the action prediction based on gaze detection
Michiya Yamamoto, Munehiro Komeda, Takashi Nagamatsu, Tomio Watanabe
Full text: PDF Online
- Gaze and voice controlled drawing
Jan van der Kamp, Veronica Sundstedt
Full text: PDF Online
- Eye tracking within the packaging design workflow: interaction with physical and virtual shelves
Chip Tonkin, Andrew D. Ouzts, Andrew T. Duchowski
Full text: PDF Online
- Designing gaze-supported multimodal interactions for the exploration of large image collections
Sophie Stellmach, Sebastian Stober, Andreas Nürnberger, Raimund Dachselt
Full text: PDF Online
- Comparison of gaze-to-objects mapping algorithms
Oleg Špakov
Full text: PDF Online
- Evaluation of a remote webcam-based eye tracker
Henrik Skovsgaard, Javier San Agustin, Sune Alstrup Johansen, John Paulin Hansen, Martin Tall
Full text: PDF Online
- An open-source low-cost eye-tracking system for portable real-time and offline tracking
Nicolas Schneider, Peter Bex, Erhardt Barth, Michael Dorr
Full text: PDF Online
- Gaze interaction from bed
John Paulin Hansen, Javier San Agustin, Henrik Skovsgaard
Full text: PDF Online
- Mobile gaze-based screen interaction in 3D environments
Diako Mardanbegi, Dan Witzner Hansen
Full text: PDF Online
- Towards intelligent user interfaces: anticipating actions in computer games
Hendrik Koesling, Alan Kenny, Andrea Finke, Helge Ritter, Seamus McLoone, Tomas Ward
Full text: PDF Online
- Exploring interaction modes for image retrieval
Corey Engelman, Rui Li, Jeff Pelz, Pengcheng Shi, Anne Haake
Full text: PDF Online
Labels: conference
Monday, May 9, 2011
"Read my Eyes" - A presentation of the ITU Gaze Tracker
During the last month, the guys at the IT University of Copenhagen have been involved in the making of a video intended to introduce the ITU Gaze Tracker, an open source eye tracker, to a wider audience. The production was carried out in collaboration with the Communication Department at the university and features members of the group, students of the HCI class, and Birger Bergmann Jeppesen, who has had ALS since 1996. Many thanks to all involved, especially Birger & co. for taking an interest and participating in the evaluation of the system.
Labels: eye tracker, inspiration, ITU, low cost, open source, typing
Monday, May 2, 2011
1st International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction
During UbiComp 2011 in Beijing this September, the 1st International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI) will be held. The keynote speaker is Jeff B. Pelz, who has considerable experience with eye tracking during natural tasks. The call for papers is out; see details below.
"Recent developments in mobile eye tracking equipment and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that are pervasively usable in everyday life. We call this new paradigm pervasive eye tracking – continuous eye monitoring and analysis 24/7. The potential applications for the ability to track and analyze eye movements anywhere and anytime call for new research to further develop and understand visual behaviour and eye-based interaction in daily life settings. PETMEI 2011 will focus on pervasive eye tracking as a trailblazer for mobile eye-based interaction and eye-based context-awareness. We provide a forum for researchers from human-computer interaction, context-aware computing, and eye tracking to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking."
Important Dates
- Paper Submission: May 30, 2011
- Notification of Acceptance: June 27, 2011
- Camera-ready due: July 11, 2011
- Workshop: September 18, 2011
Topics
Topics of interest cover computational methods, new applications and use cases, as well as eye tracking technology for pervasive eye tracking and mobile eye-based interaction. Topics of interest include, but are not limited to:
Methods
- Computer vision tools for face, eye detection and tracking
- Pattern recognition/machine learning for gaze and eye movement analysis
- Integration of pervasive eye tracking and context-aware computing
- Real-time multi-modality sensor fusion
- Techniques for eye tracking on portable devices
- Methods for long-term gaze and eye movement monitoring and analysis
- Gaze modeling for development of conversational agents
- Evaluation of context-aware systems and interfaces
- User studies on impact of and user experience with pervasive eye tracking
- Visual and non-visual feedback for eye-based interfaces
- Interaction techniques including multimodal approaches
- Analysis and interpretation of attention in HCI
- Dual and group eye tracking
Applications
- Mobile eye-based interaction with public displays, tabletops, and smart environments
- Eye-based activity and context recognition
- Pervasive healthcare, e.g. mental health monitoring or rehabilitation
- Autism research
- Daily life usability studies and market research
- Mobile attentive user interfaces
- Security and privacy for pervasive eye tracking systems
- Eye tracking in automotive research
- Eye tracking in multimedia research
- Assistive systems, e.g. mobile eye-based text entry
- Mobile eye tracking and interaction for augmented and virtual reality
- Eye-based human-robot and human-agent interaction
- Cognition-aware systems and user interfaces
- Human factors in mobile eye-based interaction
- Eye movement measures in affective computing
Technologies
- New devices for portable and wearable eye tracking
- Extension of existing systems for mobile interaction
See the submission details for more information.
Labels: conference