Thursday, February 16, 2012

Eyewriter & Not Impossible Foundation

The EyeWriter project, which helped Tony 'TemptOne' Quan to draw again, was originally documented by Mick Ebeling. This material has been incorporated into a documentary called "Getting Up", which recently won the audience award at Slamdance. Movie buff Christopher Campbell wrote a short review on his blog. Great job on raising awareness, and we hope you guys find funding to develop the software further.



Getting Up: The Tempt One Story Trailer




How to build an EyeWriter

Wednesday, February 15, 2012

Prelude for ETRA2012

The program for the Eye Tracking Research & Applications symposium (ETRA'12) is out and contains several really interesting papers this year.

Two supplementary videos surfaced the other day; they come from the User Interface & Software Engineering group at the Otto-von-Guericke-Universität in Germany. In addition, the authors, Sophie Stellmach and Raimund Dachselt, have a paper submitted to the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI'12). Abstracts and videos below.

Abstract I (ETRA)
Remote pan-and-zoom control for the exploration of large information spaces is of interest for various application areas, such as browsing through medical data in sterile environments or investigating geographic information systems on a distant display. In this context, considering a user's visual attention for pan-and-zoom operations could be of interest. In this paper, we investigate the potential of gaze-supported panning in combination with different zooming modalities: (1) a mouse scroll wheel, (2) tilting a handheld device, and (3) touch gestures on a smartphone. Thereby, it is possible to zoom in at a location a user currently looks at (i.e., gaze-directed pivot zoom). These techniques have been tested with Google Earth by ten participants in a user study. While participants were fastest with the already familiar mouse-only base condition, the user feedback indicates a particularly high potential of the gaze-supported pivot zooming in combination with a scroll wheel or touch gesture.
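The key idea behind the gaze-directed pivot zoom is that the map point a user is looking at should stay fixed on screen while the scale changes. The paper's implementation isn't public, so here is a minimal sketch of that camera update (function and parameter names are hypothetical):

```python
def pivot_zoom(cam_x, cam_y, scale, gaze_x, gaze_y, factor):
    """Zoom the viewport by `factor` while keeping the world point
    under the gaze position fixed on screen.

    cam_x, cam_y   -- world coordinates at the viewport origin
    scale          -- pixels per world unit
    gaze_x, gaze_y -- gaze position in screen pixels
    """
    # World coordinate currently under the gaze point.
    world_x = cam_x + gaze_x / scale
    world_y = cam_y + gaze_y / scale
    new_scale = scale * factor
    # Re-position the camera so that same world point maps back
    # to the gaze position at the new scale.
    new_cam_x = world_x - gaze_x / new_scale
    new_cam_y = world_y - gaze_y / new_scale
    return new_cam_x, new_cam_y, new_scale
```

The zoom `factor` itself would come from whichever modality is active (scroll wheel ticks, device tilt angle, or a pinch gesture), while the pivot comes from the eye tracker.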


 
To be presented at ETRA'12.


Abstract II (ETRA)
Since eye gaze may serve as an efficient and natural input for steering in virtual 3D scenes, we investigate the design of eye gaze steering user interfaces (UIs) in this paper. We discuss design considerations and propose design alternatives based on two selected steering approaches differing in input condition (discrete vs. continuous) and velocity selection (constant vs. gradient-based). The proposed UIs have been iteratively advanced based on two user studies with twelve participants each. In particular, the combination of continuous and gradient-based input shows a high potential, because it allows for gradually changing the moving speed and direction depending on a user's point-of-regard. This has the advantage of reducing overshooting problems and dwell-time activations. We also investigate discrete constant input for which virtual buttons are toggled using gaze dwelling. As an alternative, we propose the Sticky Gaze Pointer as a more flexible way of discrete input.
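The continuous, gradient-based variant described above maps how far the point-of-regard is from the screen center to a movement speed, so speed changes gradually instead of toggling on and off. A minimal sketch of such a velocity mapping, with a central dead zone to avoid drift while reading (all names and parameter values are my own assumptions, not the paper's):

```python
import math

def steering_velocity(gaze_x, gaze_y, dead_zone=0.1, max_speed=1.0):
    """Map a normalized point-of-regard (origin at screen center,
    components in -1..1) to a steering velocity.

    Inside the central dead zone the scene does not move; outside it,
    speed grows with gaze eccentricity (gradient-based velocity
    selection), capped at max_speed at the screen edge.
    """
    r = math.hypot(gaze_x, gaze_y)
    if r < dead_zone:
        return (0.0, 0.0)
    speed = max_speed * min((r - dead_zone) / (1.0 - dead_zone), 1.0)
    # Steer in the direction of the gaze offset.
    return (speed * gaze_x / r, speed * gaze_y / r)
```

A discrete/constant design would instead fire a fixed velocity whenever gaze dwells on a virtual button, which is exactly the overshooting and dwell-activation cost the abstract mentions.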


To be presented at ETRA'12.


Abstract III (CHI)
While eye tracking has a high potential for fast selection tasks, it is often regarded as error-prone and unnatural, especially for gaze-only interaction. To improve on that, we propose gaze-supported interaction as a more natural and effective way combining a user's gaze with touch input from a handheld device. In particular, we contribute a set of novel and practical gaze-supported selection techniques for distant displays. Designed according to the principle "gaze suggests, touch confirms", they include an enhanced gaze-directed cursor, local zoom lenses and more elaborated techniques utilizing manual fine positioning of the cursor via touch. In a comprehensive user study with 24 participants, we investigated the potential of these techniques for different target sizes and distances. All novel techniques outperformed a simple gaze-directed cursor and showed individual advantages. In particular those techniques using touch for fine cursor adjustments (MAGIC touch) and for cycling through a list of possible close-to-gaze targets (MAGIC tab) demonstrated a high overall performance and usability.
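The "gaze suggests, touch confirms" idea behind MAGIC tab can be illustrated with a small sketch: gaze proposes the set of targets near the point-of-regard, and touch taps cycle through them before confirming. This is only my reading of the abstract, not the authors' code, and all names are hypothetical:

```python
def close_to_gaze_targets(targets, gaze, radius):
    """Return the targets within `radius` of the gaze point, nearest first."""
    def dist(t):
        return ((t[0] - gaze[0]) ** 2 + (t[1] - gaze[1]) ** 2) ** 0.5
    return sorted((t for t in targets if dist(t) <= radius), key=dist)

class MagicTab:
    """Gaze suggests a candidate list; repeated touch 'tab' events cycle
    through it, and a final confirming touch selects current()."""

    def __init__(self, targets, gaze, radius):
        self.candidates = close_to_gaze_targets(targets, gaze, radius)
        self.index = 0  # start on the target nearest the gaze point

    def tab(self):
        """Advance to the next close-to-gaze candidate (wraps around)."""
        if self.candidates:
            self.index = (self.index + 1) % len(self.candidates)

    def current(self):
        """The currently highlighted candidate, or None if none in range."""
        return self.candidates[self.index] if self.candidates else None
```

The `radius` parameter effectively absorbs eye-tracker inaccuracy: the less precise the gaze estimate, the larger the candidate set the touch side has to disambiguate.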


 
To be presented at CHI'12.