Friday, June 8, 2012
Eyecatcher - A 3D prototype combining Eyetracking with a Gestural Camera
Eyecatcher is a prototype that combines eye tracking with a gestural camera on a dual-screen setup. Created for the oil rig process industry, the project was a collaborative exploration between ABB Corporate Research and the Interactive Institute Umeå (blog).
Sunday, June 3, 2012
Copenhagen Business School: PhD position available
Copenhagen Business School invites applications for a vacant PhD scholarship in empirical modeling of eye movements in reading, writing and translation. The position is offered at the Department of International Business Communication at Copenhagen Business School (CBS). The Department of International Business Communication is a new department at CBS whose fields of interest include the role of language(s) in interlingual and intercultural communication, the role of language and culture competences in organizations, the role of language and culture in communication technology and social technologies, as well as the teaching of language skills. The Department is dedicated to interdisciplinary and problem-oriented research.
Considerable progress has been made in eye-tracking technology over the past decade, allowing gaze behavior to be captured under free head movements. However, the imprecision of the measured signal makes it difficult to analyze eye movements in reading tasks, where a precise local resolution of the gaze samples is required to track the reader's gaze path over a text. The PhD project will investigate methods to cancel out the noise in the gaze signal. The candidate will investigate, design and implement empirically based models of eye movements in reading that take into account physical properties of the visual system as well as background information such as the purpose of the reading activity, the structure of the text and the quality of the gaze signal. The required qualifications are listed in the full call.
More information available here.
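As a very rough illustration of the noise-cancellation problem the project targets, the sketch below runs a moving-median filter over a stream of noisy gaze samples. The window size, sample format and example values are assumptions made purely for illustration; they are not part of the CBS call or any specific method proposed there.

```python
from collections import deque
from statistics import median

def median_filter(samples, window=5):
    """Smooth a stream of (x, y) gaze samples with a moving median.

    A median is less sensitive than a mean to the spikes and dropouts
    typical of video-based eye trackers. 'window' is an illustrative
    choice; reading research would tune it against saccade dynamics.
    """
    buf_x, buf_y = deque(maxlen=window), deque(maxlen=window)
    for x, y in samples:
        buf_x.append(x)
        buf_y.append(y)
        yield median(buf_x), median(buf_y)

# Example: noisy samples while reading two words on one text line
raw = [(101, 300), (99, 305), (160, 298), (102, 301), (240, 299), (243, 303)]
print(list(median_filter(raw, window=3)))
```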
Friday, June 1, 2012
Temporal Control In the EyeHarp Gaze-Controlled Musical Interface
The EyeHarp, which I wrote about last summer, is a gaze-controlled musical instrument built by Zacharias Vamvakousis. In the video below he demonstrates how the interface is driven by the ITU Gaze Tracker and used to compose a loop which he then improvises upon. On the hardware side, a modified PS3 camera is used in combination with two infrared light sources. The setup was presented at the New Interfaces for Musical Expression (NIME 2012) conference in Ann Arbor, Michigan, a week ago, and it will be exhibited at Sónar in Barcelona on 14-16 June 2012. Great to see such an innovative interface being made open source and combined with the ITU tracker.
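The post does not describe EyeHarp's internals, but dwell-time selection is the standard way a gaze-only instrument can trigger notes without a click. The sketch below is a generic, hypothetical dwell detector, not Vamvakousis' actual code; the note names, regions and threshold are invented for illustration.

```python
import time

class DwellSelector:
    """Fire a selection when gaze stays inside one region long enough.

    'dwell_time' (seconds) and the rectangular regions are illustrative;
    a musical interface would map regions to notes or loop controls.
    """
    def __init__(self, regions, dwell_time=0.4):
        self.regions = regions          # {name: (x, y, w, h)}
        self.dwell_time = dwell_time
        self.current = None
        self.entered_at = 0.0

    def update(self, gx, gy, now=None):
        now = time.monotonic() if now is None else now
        hit = next((name for name, (x, y, w, h) in self.regions.items()
                    if x <= gx <= x + w and y <= gy <= y + h), None)
        if hit != self.current:
            self.current, self.entered_at = hit, now
            return None
        if hit is not None and now - self.entered_at >= self.dwell_time:
            self.entered_at = now       # re-arm so the note can repeat
            return hit                  # e.g. "C4" -> play the note
        return None

notes = {"C4": (0, 0, 100, 100), "D4": (110, 0, 100, 100)}
selector = DwellSelector(notes)
```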
- Vamvakousis, Z. and Ramirez, R. (2012). Temporal Control In the EyeHarp Gaze-Controlled Musical Interface. In Proceedings of the 12th International Conference on New Interfaces for Musical Expression (NIME 2012), 21-23 May 2012, Ann Arbor, Michigan, USA. (PDF)
Monday, April 23, 2012
Noise Challenges in Monomodal Gaze Interaction (Skovsgaard, 2011)
Henrik Skovsgaard of the ITU Gaze Group successfully defended his PhD thesis “Noise Challenges in Monomodal Gaze Interaction” at the IT University of Copenhagen on the 13th December 2011. The PhD thesis can be downloaded here.
ABSTRACT
- An assessment of a low-cost open-source gaze tracker and two eye tracking systems through an accuracy and precision test and a performance evaluation.
- Development and evaluation of a novel 3D typing system with high tolerance to noise, based on continuous panning and zooming.
- Development and evaluation of novel selection tools that compensate for noisy input during small-target selections in modern GUIs.
The thesis may be of particular interest to those working with eye trackers for gaze interaction and dealing with reduced data quality. The work is accompanied by several software applications developed for the research projects, which can be freely downloaded from the eyeInteract appstore (http://www.eyeinteract.com).
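One recurring theme in the thesis is compensating for noisy input when selecting small targets. As a stand-in illustration of that general idea (not Skovsgaard's specific tool), a common approach is a two-step zoom: a first dwell magnifies the region around the gaze point so the confirming dwell has a much larger effective target. The sketch below is hypothetical; all parameters are invented.

```python
def magnified_view(gaze, screen_size, zoom=4.0, lens=200):
    """Return the source region (left, top, size) and the zoom to apply to it.

    The 'lens' x 'lens' square around the gaze point is rendered at 'zoom'
    times its size, so a 10 px icon becomes a 40 px target and the same
    angular gaze noise covers a smaller fraction of it. All values here are
    illustrative, not the thesis' actual parameters.
    """
    gx, gy = gaze
    sw, sh = screen_size
    left = min(max(gx - lens / 2, 0), sw - lens)
    top = min(max(gy - lens / 2, 0), sh - lens)
    return (left, top, lens), zoom

# Magnify a 200 px square around the gaze point at (512, 400) by 4x.
print(magnified_view((512, 400), (1280, 800)))
```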
SUPERVISORS
- Associate professor John Paulin Hansen - ITU Copenhagen (main supervisor)
- Associate professor Dan Witzner Hansen - ITU Copenhagen (secondary supervisor)
ASSESSMENT COMMITTEE
- Professor Kasper Hornbæk - University of Copenhagen, Denmark
- Associate professor Scott MacKenzie - York University, Canada
- Associate professor Thomas Pederson - IT University of Copenhagen, Denmark (chairman)
Monday, March 12, 2012
SMI RED-M
Well, well, look here. A constellation of eye tracking manufacturers is joining in on the affordable market, a segment arguably defined some time ago when Mirametrix launched at around $5k. Tobii has the PCEye, perfectly fine but at a cool $7k, and is showcasing the new IS2 chipset, although apparently without demos at CeBIT 2012. The new player is SensoMotoric Instruments, known for high-quality hardware and finely tuned algorithms. Their new contribution is the RED-M (M for mini?). Even though the price hasn't been announced, I would assume it is lower than its high-speed FireWire sibling, perhaps similar to the PCEye pricing.
The M-version is a small device made of plastic that connects via USB 2.0 (presumably two plugs, one for power). It measures 240x25x33 mm - that's pretty small - and weighs only 130 grams. This is a big difference from their prior models, which have been very solid and built from high-quality materials and professional components. Accuracy is specified at 0.5 degrees over a 50-75 cm working distance, with a head box of 320x210 mm at 60 cm and a sample rate of 60/120 Hz. In essence it is the low-end version of the RED series, whose top model is the super-fast RED500. Although it has yet to be demonstrated in an operational state, some material has appeared online. Below is the animated setup guide; you can find more information on their website. Looking good!
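To put the 0.5 degree figure in context, the on-screen error it implies at the nominal 60 cm distance is roughly 60 cm x tan(0.5 deg), about 5 mm, or around 20 pixels on a typical monitor. The quick calculation below spells that out; the 96 DPI figure is my assumption, not part of SMI's spec.

```python
import math

def gaze_error_on_screen(accuracy_deg=0.5, distance_cm=60, dpi=96):
    """Convert angular tracker accuracy into on-screen error."""
    error_cm = distance_cm * math.tan(math.radians(accuracy_deg))
    error_px = error_cm / 2.54 * dpi
    return error_cm * 10, error_px   # millimetres, pixels

mm, px = gaze_error_on_screen()
print(f"~{mm:.1f} mm, ~{px:.0f} px at 60 cm")   # ~5.2 mm, ~20 px
```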
- Technical specs (pdf)
- Flyer (pdf)
Monday, March 5, 2012
RealGaze Glasses
Just came across the RealGaze glasses, which are being developed by Devon Greco et al. His father was diagnosed with ALS some years ago, and since Devon has been tinkering with electronics from an early age, he set out to build an eye tracker. For a prototype the result looks good, and the form factor feels familiar. There isn't much detail available at the moment beyond big ambitions to manufacture an affordable device. Most of us would love to see that happen!
Thursday, February 16, 2012
Eyewriter & Not Impossible Foundation
The Eyewriter project, which helped Tony 'TemptOne' Quan to draw again, was originally documented by Mick Ebeling. The material has been incorporated into a documentary called "Getting Up", which recently won the audience award at Slamdance. Movie buff Christopher Campbell wrote a short review on his blog. Great job on raising awareness; hope you guys find funding to further develop the software.
Getting Up: The Tempt One Story Trailer
How to build an EyeWriter
Wednesday, February 15, 2012
Prelude for ETRA2012
The program for the Eye Tracking Research & Applications symposium (ETRA'12) is out and contains several really interesting papers this year.
Two supplementary videos surfaced the other day; they come from the User Interface & Software Engineering group at the Otto-von-Guericke-Universität in Germany. In addition, the authors, Sophie Stellmach and Raimund Dachselt, have a paper submitted to the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI'12). Abstracts and videos below.
Abstract I (ETRA)
Remote pan-and-zoom control for the exploration of large information spaces is of interest for various application areas, such as browsing through medical data in sterile environments or investigating geographic information systems on a distant display. In this context, considering a user's visual attention for pan-and-zoom operations could be of interest. In this paper, we investigate the potential of gaze-supported panning in combination with different zooming modalities: (1) a mouse scroll wheel, (2) tilting a handheld device, and (3) touch gestures on a smartphone. Thereby, it is possible to zoom in at a location a user currently looks at (i.e., gaze-directed pivot zoom). These techniques have been tested with Google Earth by ten participants in a user study. While participants were fastest with the already familiar mouse-only base condition, the user feedback indicates a particularly high potential of the gaze-supported pivot zooming in combination with a scroll wheel or touch gesture.
To be presented at ETRA'12.
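The core idea of the gaze-directed pivot zoom in the first paper is that the point being looked at stays fixed on screen while the scale changes. A minimal, hypothetical version of that viewport update is sketched below; it is my own illustration of the principle, not the authors' implementation.

```python
def pivot_zoom(viewport, gaze_world, factor):
    """Zoom a 2D viewport about the world point under the user's gaze.

    viewport: (cx, cy, scale) - world centre of the view and pixels-per-unit.
    gaze_world: world coordinates currently under the gaze cursor.
    factor: >1 zooms in, <1 zooms out (e.g. from a scroll wheel or pinch).
    The gazed-at point keeps the same screen position after the zoom.
    """
    cx, cy, scale = viewport
    gx, gy = gaze_world
    new_scale = scale * factor
    # Pull the view centre towards the gaze point so it stays put on screen.
    new_cx = gx + (cx - gx) / factor
    new_cy = gy + (cy - gy) / factor
    return new_cx, new_cy, new_scale
```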
Abstract II (ETRA)
Since eye gaze may serve as an efficient and natural input for steering in virtual 3D scenes, we investigate the design of eye gaze steering user interfaces (UIs) in this paper. We discuss design considerations and propose design alternatives based on two selected steering approaches differing in input condition (discrete vs. continuous) and velocity selection (constant vs. gradient-based). The proposed UIs have been iteratively advanced based on two user studies with twelve participants each. In particular, the combination of continuous and gradient-based input shows a high potential, because it allows for gradually changing the moving speed and direction depending on a user's point-of-regard. This has the advantage of reducing overshooting problems and dwell-time activations. We also investigate discrete constant input for which virtual buttons are toggled using gaze dwelling. As an alternative, we propose the Sticky Gaze Pointer as a more flexible way of discrete input.
To be presented at ETRA'12.
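For the second paper, the gradient-based variant means the steering speed grows with how far the point-of-regard is from a neutral zone rather than being constant. The sketch below is a simple, hypothetical mapping of that idea along one axis; the dead zone, gain and speed limit are invented values, not the paper's parameters.

```python
def steering_velocity(gaze_px, center_px, dead_zone=50, gain=0.02, v_max=5.0):
    """Map gaze offset from the screen centre to a signed steering velocity.

    Inside 'dead_zone' pixels nothing moves; beyond it, speed grows linearly
    with the offset and is clamped to 'v_max'. All constants are illustrative.
    """
    offset = gaze_px - center_px
    if abs(offset) <= dead_zone:
        return 0.0
    magnitude = min((abs(offset) - dead_zone) * gain, v_max)
    return magnitude if offset > 0 else -magnitude

# Looking 300 px right of centre steers right at (300-50)*0.02 = 5.0 (clamped).
print(steering_velocity(gaze_px=960 + 300, center_px=960))
```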
Abstract III (CHI)
While eye tracking has a high potential for fast selection tasks, it is often regarded as error-prone and unnatural, especially for gaze-only interaction. To improve on that, we propose gaze-supported interaction as a more natural and effective way combining a user's gaze with touch input from a handheld device. In particular, we contribute a set of novel and practical gaze-supported selection techniques for distant displays. Designed according to the principle gaze suggests, touch confirms they include an enhanced gaze-directed cursor, local zoom lenses and more elaborated techniques utilizing manual fine positioning of the cursor via touch. In a comprehensive user study with 24 participants, we investigated the potential of these techniques for different target sizes and distances. All novel techniques outperformed a simple gaze-directed cursor and showed individual advantages. In particular those techniques using touch for fine cursor adjustments (MAGIC touch) and for cycling through a list of possible close-to-gaze targets (MAGIC tab) demonstrated a high overall performance and usability.
To be presented at CHI'12.
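The "gaze suggests, touch confirms" principle in the CHI paper boils down to warping a cursor to the gaze position and letting touch refine and confirm it. Below is a bare-bones, hypothetical sketch of that split of responsibilities; it is only a schematic of the principle, not the MAGIC touch or MAGIC tab implementations from the paper.

```python
class GazeTouchCursor:
    """Gaze coarsely positions the cursor; touch nudges and confirms."""

    def __init__(self):
        self.x, self.y = 0.0, 0.0

    def on_gaze(self, gx, gy):
        # Warp the cursor to the current gaze estimate (coarse suggestion).
        self.x, self.y = gx, gy

    def on_touch_drag(self, dx, dy, gain=0.3):
        # Small relative adjustments from the handheld's touchscreen.
        self.x += dx * gain
        self.y += dy * gain

    def on_touch_tap(self):
        # Touch confirms the selection at the refined position.
        return ("select", self.x, self.y)
```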
Tuesday, January 10, 2012
EyeTech EyeOn
A video from EyeTech featuring Michael, who suffers from Thoracic Outlet Syndrome (TOS). A great little clip that shows what computer control without gaze-adapted interfaces comes down to. Luckily Michael can use voice recognition software for typing, as text input using eye movements alone is a cumbersome process (source).