Monday, March 5, 2012
RealGaze Glasses
Just came across the RealGaze glasses, which are being developed by Devon Greco et al. His father was diagnosed with ALS some years ago, and since Devon has been tinkering with electronics from early on, he set out to build an eye tracker. For a prototype the result looks good, and the form factor feels familiar. There isn't much detail available at the moment beyond big ambitions to manufacture an affordable device. Most of us would love to see that happen!
Thursday, February 16, 2012
Eyewriter & Not Impossible Foundation
The Eyewriter project, which helped Tony 'TemptOne' Quan to draw again, was originally documented by Mick Ebeling. This material has been incorporated into a documentary called "Getting Up", which recently won the audience award at Slamdance. Movie buff Christopher Campbell wrote a short review on his blog. Great job on raising awareness; hope you guys find funding to further develop the software.
Getting Up: The Tempt One Story Trailer
How to build an EyeWriter
Wednesday, February 15, 2012
Prelude for ETRA2012
The program for the Eye Tracking Research & Applications (ETRA'12) is out and contains several really interesting papers this year.
Two supplementary videos surfaced the other day; they come from the User Interface & Software Engineering group at the Otto-von-Guericke-Universität in Germany. In addition, the authors, Sophie Stellmach and Raimund Dachselt, have a paper accepted for the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI'12). Abstracts and videos below.
Abstract I (ETRA)
Remote pan-and-zoom control for the exploration of large information spaces is of interest for various application areas, such as browsing through medical data in sterile environments or investigating geographic information systems on a distant display. In this context, considering a user's visual attention for pan-and-zoom operations could be of interest. In this paper, we investigate the potential of gaze-supported panning in combination with different zooming modalities: (1) a mouse scroll wheel, (2) tilting a handheld device, and (3) touch gestures on a smartphone. Thereby, it is possible to zoom in at a location a user currently looks at (i.e., gaze-directed pivot zoom). These techniques have been tested with Google Earth by ten participants in a user study. While participants were fastest with the already familiar mouse-only base condition, the user feedback indicates a particularly high potential of the gaze-supported pivot zooming in combination with a scroll wheel or touch gesture.
To be presented at ETRA'12.
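The gaze-directed pivot zoom described above boils down to keeping the world point under the user's gaze fixed on screen while the scale changes. A minimal sketch of that math, assuming a simple screen = world * scale + offset viewport transform (the function name and parameters are illustrative, not from the paper):

```python
def pivot_zoom(offset_x, offset_y, scale, gaze_x, gaze_y, factor):
    """Zoom while keeping the gazed-at world point fixed on screen.

    The viewport maps world to screen as: screen = world * scale + offset.
    gaze_x, gaze_y is the current gaze position in screen coordinates,
    and factor > 1 zooms in, factor < 1 zooms out.
    """
    new_scale = scale * factor
    # The world point under the gaze must project to the same screen point
    # before and after zooming, so the offset shifts toward the gaze point.
    new_offset_x = gaze_x - (gaze_x - offset_x) * factor
    new_offset_y = gaze_y - (gaze_y - offset_y) * factor
    return new_offset_x, new_offset_y, new_scale
```

With gaze driving the pivot and a scroll wheel or touch gesture driving the factor, the user never has to point at the zoom target manually.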
Abstract II
Since eye gaze may serve as an efficient and natural input for steering in virtual 3D scenes, we investigate the design of eye gaze steering user interfaces (UIs) in this paper. We discuss design considerations and propose design alternatives based on two selected steering approaches differing in input condition (discrete vs. continuous) and velocity selection (constant vs. gradient-based). The proposed UIs have been iteratively advanced based on two user studies with twelve participants each. In particular, the combination of continuous and gradient-based input shows a high potential, because it allows for gradually changing the moving speed and direction depending on a user's point-of-regard. This has the advantage of reducing overshooting problems and dwell-time activations. We also investigate discrete constant input for which virtual buttons are toggled using gaze dwelling. As an alternative, we propose the Sticky Gaze Pointer as a more flexible way of discrete input.
To be presented at ETRA'12.
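The continuous, gradient-based variant the abstract favors can be pictured as mapping the point-of-regard's offset from the screen center to a velocity, with a central dead zone so a resting gaze does not move the camera. A minimal sketch under those assumptions (all names and thresholds here are illustrative, not the authors' implementation):

```python
import math

def gaze_steering_velocity(gaze_x, gaze_y, center_x, center_y,
                           dead_zone=50, max_speed=5.0, max_dist=400):
    """Map point-of-regard to a 2D steering velocity (gradient-based).

    Inside the central dead zone the camera stays still; outside it the
    speed grows linearly with the gaze's distance from the screen center,
    so looking further out moves faster. This gradual mapping is what
    avoids overshooting and dwell-time activations.
    """
    dx, dy = gaze_x - center_x, gaze_y - center_y
    dist = math.hypot(dx, dy)
    if dist < dead_zone:
        return 0.0, 0.0  # resting gaze: no movement
    # Linear speed ramp from the dead-zone edge out to max_dist, then capped.
    speed = max_speed * min(1.0, (dist - dead_zone) / (max_dist - dead_zone))
    return speed * dx / dist, speed * dy / dist
```

Calling this every frame with the latest gaze sample yields the "look where you want to go, faster the further you look" behavior the study found promising.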
Abstract III (CHI)
While eye tracking has a high potential for fast selection tasks, it is often regarded as error-prone and unnatural, especially for gaze-only interaction. To improve on that, we propose gaze-supported interaction as a more natural and effective way combining a user's gaze with touch input from a handheld device. In particular, we contribute a set of novel and practical gaze-supported selection techniques for distant displays. Designed according to the principle "gaze suggests, touch confirms", they include an enhanced gaze-directed cursor, local zoom lenses and more elaborated techniques utilizing manual fine positioning of the cursor via touch. In a comprehensive user study with 24 participants, we investigated the potential of these techniques for different target sizes and distances. All novel techniques outperformed a simple gaze-directed cursor and showed individual advantages. In particular those techniques using touch for fine cursor adjustments (MAGIC touch) and for cycling through a list of possible close-to-gaze targets (MAGIC tab) demonstrated a high overall performance and usability.
To be presented at CHI'12.
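The "gaze suggests, touch confirms" principle can be sketched very compactly: selection is only committed on a touch event, and the gaze merely nominates the nearest candidate target, which tolerates eye-tracker noise. A minimal sketch, assuming screen-space target centers (the function name, radius, and data layout are illustrative assumptions, not the paper's API):

```python
import math

def select_on_touch(gaze, targets, radius=80):
    """'Gaze suggests, touch confirms': on a touch event, return the index
    of the target nearest the current gaze point, rather than requiring a
    pixel-precise fixation.

    gaze is an (x, y) screen position; targets is a list of (x, y) target
    centers. Returns None if no target lies within the search radius.
    """
    best, best_dist = None, radius
    for i, (tx, ty) in enumerate(targets):
        d = math.hypot(tx - gaze[0], ty - gaze[1])
        if d <= best_dist:
            best, best_dist = i, d
    return best
```

Something like MAGIC tab would extend this by letting repeated touches cycle through all targets within the radius instead of taking only the nearest one.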
Tuesday, January 10, 2012
EyeTech EyeOn
A video from EyeTech that features Michael, who suffers from Thoracic Outlet Syndrome (TOS). Great little clip that shows what computer control without gaze-adapted interfaces comes down to. Luckily Michael can use voice recognition software for typing, as text input using eye movements alone is a cumbersome process (source).
Monday, December 12, 2011
Monday, November 14, 2011
EyeDrone - Eye gaze controlled navigation
Learn more at: http://bme.ccny.cuny.edu/faculty/lparra/eyeNavigate/index.html
Explore the Neural Engineering Group at CCNY - http://neuralengr.com
Wednesday, November 9, 2011
Low Cost Eye Tracking for Commercial Gaming: eyeAsteroids and eyeShoot in the Dark!
Latest update from Stephen Vickers and Howell Istance at the Centre for Computational Intelligence, De Montfort University, who have been researching and developing gaze-based gaming for several years now. Their latest project has been shortlisted in the consumer category of The Engineer Awards 2011. Congratulations on an awesome job!
"Using your eyes and where you are looking to interact with computer games represents an exciting new direction that game play can take, following the success of whole body interaction enabled by the Kinect and the Wii. The Innovation Fellowship has supported the development and demonstration of a low-cost eye tracker by De Montfort University, in collaboration with Sleepy Dog, the East Midlands games company that produced the Buzz-it controller and games. The low-cost eye tracker utilised the ITU Gazetracking library and was produced as a fully working pre-production prototype. In the project, three different games were produced to demonstrate different ways in which eye gaze can be used to make game play more immersive and exciting."
This video demonstrates two of them.
- eyeAsteroids: The ship flies towards where you are looking and the space bar is used to fire.
- eyeShoot in the Dark!: The torch shines at where you are looking and the mouse is used to move the cross-hair and fire.
Monday, September 19, 2011
SMI Glasses now available
The SMI Glasses, which we got a sneak peek at before last Christmas and specs for in April, are now available for sale. SMI engineers have managed to squeeze two 30Hz eye tracking cameras and a high definition scene camera into a frame weighing only 75 grams. The advantage of tracking both eyes is that it enables parallax compensation, where accuracy is maintained regardless of whether you look close or far in the scene. In addition, binocular tracking most often yields better accuracy, as it provides the opportunity to average both samples or to discard one with low validity. The HD scene camera captures 24 frames per second at 1280x960 pixels, a clear advantage. All in all, better late than never, SMI is back with a highly competitive product in the market.
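The binocular fusion idea mentioned above is simple enough to sketch: average the two eyes when both samples are valid, and fall back to the one usable eye when the other is lost to a blink or tracking dropout. A minimal sketch under those assumptions (the function and validity flags are illustrative, not SMI's actual pipeline):

```python
def fuse_binocular(left, right, left_valid, right_valid):
    """Combine left- and right-eye gaze samples into one estimate.

    left and right are (x, y) gaze points. Averaging both eyes tends to
    reduce noise; if one eye's sample is flagged invalid (blink, lost
    pupil), fall back to the other eye alone. Returns None when neither
    eye yields a usable sample.
    """
    if left_valid and right_valid:
        return ((left[0] + right[0]) / 2.0, (left[1] + right[1]) / 2.0)
    if left_valid:
        return left
    if right_valid:
        return right
    return None
```

Running this per sample is what lets a binocular tracker degrade gracefully to monocular operation instead of dropping data outright.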
Wednesday, August 31, 2011
ETRA 2012 - Call for papers
"The Seventh ACM Symposium on Eye Tracking Research & Applications (ETRA 2012) will be held in Santa Barbara, California on March 28th-30th, 2012. The ETRA conference series focuses on all aspects of eye movement research and applications across a wide range of disciplines. The symposium presents research that advances the state-of-the-art in these areas, leading to new capabilities in gaze tracking systems, gaze aware applications, gaze based interaction, eye movement data analysis, etc. For ETRA 2012, we invite papers in all areas of eye tracking research and applications."
ETRA 2012 THEME: MOBILE EYE TRACKING
Mobile devices are becoming more powerful every day. Embedding eye tracking and gaze-based applications in mobile devices raises new challenges and opportunities to many aspects of eye tracking research. ETRA 2012 invites papers tackling the challenges, or exploring new research opportunities of mobile eye tracking.
GENERAL AREAS OF INTEREST
Eye Tracking Technology
Advances in eye tracking hardware, software and algorithms such as: 2D and 3D eye tracking systems, calibration, low cost eye tracking, natural light eye tracking, predictive models, etc.
Eye Tracking Data Analysis
Methods, procedures and analysis tools for processing raw gaze data as well as fixations and gaze patterns. Example topics are: scan path analysis, fixation detection algorithms, and visualization techniques of gaze data.
Visual Attention and Eye Movement Control
Applied and experimental studies investigating visual attention and eye movements to gain insight in eye movement control, cognition and attention, or for design evaluation of visual stimuli. Examples are: usability and web studies using eye tracking, and eye movement behavior in everyday activities such as driving and reading.
Eye Tracking Applications
Eye tracking as a human-computer input method, either as a replacement to traditional input methods or as a complement. Examples are: assistive technologies, gaze enhanced interaction and interfaces, multimodal interaction, gaze in augmented and mixed reality systems, and gaze-contingent displays.
SUBMISSIONS
Authors are invited to submit original work in the formats of Full paper (8 pages) and Short paper (4 pages). The papers will undergo a rigorous review process assessing the originality and quality of the work as well as the relevance for eye tracking research and applications. Papers presented at ETRA 2012 will be available in the ACM digital library. Submission formats and instructions are available at the conference web site.
IMPORTANT DATES
Oct. 07th, 2011 Full Paper abstracts submission due
Oct. 14th, 2011 Full Papers submission due
Nov. 21st, 2011 Full Papers acceptance notification
Dec. 07th, 2011 Short Papers submission due
Jan. 16th, 2012 Short Papers acceptance notification
Jan. 23rd, 2012 Camera ready papers due
CONFERENCE VENUE
ETRA 2012 will be held at the gorgeous Doubletree Resort in Santa Barbara, California, a 24-acre, mission-style resort hotel facing the Pacific Ocean, located on one of Southern California’s most beautiful coastlines.
SPONSORSHIP
ETRA 2012 is co-sponsored by the ACM Special Interest Group in Computer-Human Interaction (SIGCHI), and the ACM Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH).
CONFERENCE CHAIRS
Carlos Hitoshi Morimoto - University of São Paulo, Brazil
Howell Istance - De Montfort University, UK
PROGRAM CHAIRS
Jeffrey B. Mulligan - NASA, USA
Pernilla Qvarfordt - FX Palo Alto, USA
PROGRAM AREA CHAIRS
Andrew Duchowski, Clemson University, USA
Päivi Majaranta, University of Tampere, Finland
Joe Goldberg, Oracle, USA
Shu-Chieh Wu, NASA Ames Research Center, USA
Qiang Ji, Rensselaer Polytechnic Institute, USA
Jeff Pelz, Rochester Institute of Technology, USA
Moshe Eizenman, University of Toronto, USA
Tuesday, August 9, 2011