Friday, June 8, 2012
Eyecatcher - A 3D prototype combining Eyetracking with a Gestural Camera
Thursday, January 13, 2011
Face tracking for 3D displays without glasses.
Wednesday, August 4, 2010
EOG used to play Super Mario
Thursday, October 8, 2009
DoCoMo EOG update
Thanks Roman for the links!
Monday, November 3, 2008
The Conductor Interaction Method (Rachovides et al)
"This article proposes an alternative interaction method, the conductor interaction method (CIM), which aims to provide a more natural and easier-to-learn interaction technique. This novel interaction method extends existing HCI methods by drawing upon techniques found in human-human interaction. It is argued that the use of a two-phased multimodal interaction mechanism, using gaze for selection and gesture for manipulation, incorporated within a metaphor-based environment, can provide a viable alternative for interacting with a computer (especially for novice users). Both the model and an implementation of the CIM within a system are presented in this article. This system formed the basis of a number of user studies that have been performed to assess the effectiveness of the CIM, the findings of which are discussed in this work.
More specifically the CIM aims to provide the following.
—A More Natural Interface. The CIM will have an interface that utilizes gaze and gestures, but is nevertheless capable of supporting sophisticated activities. The CIM provides an interaction technique that is as natural as possible and is close to the human-human interaction methods with which users are already familiar. The combination of gaze and gestures allows the user to perform not only simple interactions with a computer, but also more complex interactions such as the selecting, editing, and placing of media objects.
—A Metaphor Supported Interface. In order to help the user understand and exploit the gaze and gesture interface, two metaphors have been developed. An orchestra metaphor is used to provide the environment in which the user interacts. A conductor metaphor is used for interacting within this environment. These two metaphors are discussed next.
—A Two-Phased Interaction Method. The CIM uses an interaction process where each modality is specific and has a particular function. The interaction between user and interface can be seen as a dialog that is comprised of two phases. In the first phase, the user selects the on-screen object by gazing at it. In the second phase, the user is able to manipulate the selected object with the gesture interface. These distinct functions of gaze and gesture aim to increase system usability, as they are based on human-human interaction techniques, and also help to overcome issues such as the Midas Touch problem that is often experienced by look-and-dwell systems. As the dialog combines two modalities in sequence, the gaze interface can be disabled after the first phase. This minimizes the possibility of accidentally selecting objects through the gaze interface. The Midas Touch problem can also be further addressed by ensuring that there is ample dead space between media objects.
—Significantly Reduced Learning Overhead. The CIM aims to reduce the overhead of learning to use the system by encouraging the use of gestures that users can easily associate with activities they perform in their everyday life. This transfer of experience can lead to a smaller learning overhead [Borchers 1997], allowing users to make the most of the system’s features in a shorter time."
- Rachovides, D., Walkerdine, J., and Phillips, P. 2007. The conductor interaction method. ACM Trans. Multimedia Comput. Commun. Appl. 3, 4 (Dec. 2007), 1-23. DOI= http://doi.acm.org/10.1145/1314303.1314312
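To make the two-phased dialog concrete, here is a minimal sketch of how gaze-select-then-gesture-manipulate could be structured, with the gaze channel disabled after selection to avoid the Midas Touch problem. The MediaObject layout, the dwell threshold, and the event handlers are my own assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the CIM two-phase dialog: gaze selects, gesture manipulates.
# The object model and tracker callbacks are assumptions, not the authors' API.
import time
from dataclasses import dataclass

DWELL_TIME = 0.5  # seconds of stable gaze needed to select (assumed value)

@dataclass
class MediaObject:
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx, gy):
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

class ConductorDialog:
    """Phase 1: gaze dwell selects an object. Phase 2: gaze is ignored and
    gestures manipulate the selection until a 'release' gesture ends the dialog."""

    def __init__(self, objects):
        self.objects = objects
        self.selected = None        # None => phase 1 (gaze); set => phase 2 (gesture)
        self._dwell_target = None
        self._dwell_start = 0.0

    def on_gaze(self, gx, gy):
        if self.selected is not None:
            return                  # phase 2: gaze channel disabled
        target = next((o for o in self.objects if o.contains(gx, gy)), None)
        if target is not self._dwell_target:
            # Gaze moved to a new object (or empty space): restart the dwell timer.
            self._dwell_target, self._dwell_start = target, time.time()
        elif target and time.time() - self._dwell_start >= DWELL_TIME:
            self.selected = target  # dwell complete: enter phase 2

    def on_gesture(self, kind, dx=0.0, dy=0.0):
        if self.selected is None:
            return                  # phase 1: gestures ignored until a selection exists
        if kind == "move":
            self.selected.x += dx
            self.selected.y += dy
        elif kind == "release":
            self.selected = None    # back to phase 1: gaze re-enabled
```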
Gaze and Voice Based Game Interaction (Wilcox et al., 2008)
Their work was presented at ACM SIGGRAPH 2008 with the associated poster:
- Wilcox, T., Evans, M., Pearce, C., Pollard, N., and Sundstedt, V. 2008. Gaze and voice based game interaction: the revenge of the killer penguins. In ACM SIGGRAPH 2008 Posters (Los Angeles, California, August 11 - 15, 2008).
Thursday, September 18, 2008
The Inspection of Very Large Images by Eye-gaze Control
"The researchers presented novel methods for navigating and inspecting extremely large images solely or primarily using eye gaze control. The need to inspect large images occurs in, for example, mapping, medicine, astronomy and surveillance, and this project considered the inspection of very large aerial images, held in Google Earth. Comparative search and navigation tasks suggest that, while gaze methods are effective for image navigation, they lag behind more conventional methods, so interaction designers might consider combining these techniques for greatest effect." (BCS Interaction)
Abstract
The increasing availability and accuracy of eye gaze detection equipment has encouraged its use for both investigation and control. In this paper we present novel methods for navigating and inspecting extremely large images solely or primarily using eye gaze control. We investigate the relative advantages and comparative properties of four related methods: Stare-to-Zoom (STZ), in which control of the image position and resolution level is determined solely by the user's gaze position on the screen; Head-to-Zoom (HTZ) and Dual-to-Zoom (DTZ), in which gaze control is augmented by head or mouse actions; and Mouse-to-Zoom (MTZ), using conventional mouse input as an experimental control.
The need to inspect large images occurs in many disciplines, such as mapping, medicine, astronomy and surveillance. Here we consider the inspection of very large aerial images, of which Google Earth is both an example and the one employed in our study. We perform comparative search and navigation tasks with each of the methods described, and record user opinions using the Swedish User-Viewer Presence Questionnaire. We conclude that, while gaze methods are effective for image navigation, they, as yet, lag behind more conventional methods, and interaction designers may well consider combining these techniques for greatest effect.
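As I read the abstract, Stare-to-Zoom boils down to a control loop that pans toward the gaze point and zooms in while the gaze stays put, zooming back out when gaze reaches the screen border. A rough sketch of that idea follows; this is my interpretation, and the gains and thresholds are made up, not taken from the paper.

```python
# Illustrative Stare-to-Zoom loop (an assumption based on the abstract, not the authors' code).
# Each tick: pan the view toward the gaze point; if gaze is stable, zoom in;
# if gaze falls on the screen border, zoom out.
import math

class StareToZoom:
    def __init__(self, screen_w, screen_h):
        self.w, self.h = screen_w, screen_h
        self.cx, self.cy = 0.5, 0.5  # view centre in normalized image coordinates
        self.zoom = 1.0              # magnification level
        self.last = None             # previous gaze sample, for stability check

    def tick(self, gx, gy, dt):
        """gx, gy: gaze position in screen pixels; dt: seconds since last sample."""
        nx, ny = gx / self.w, gy / self.h
        # Pan gently toward the gaze point; slower panning when zoomed in.
        self.cx += (nx - 0.5) * 0.1 * dt / self.zoom
        self.cy += (ny - 0.5) * 0.1 * dt / self.zoom
        on_border = nx < 0.05 or nx > 0.95 or ny < 0.05 or ny > 0.95
        stable = self.last and math.hypot(gx - self.last[0], gy - self.last[1]) < 30
        if on_border:
            self.zoom = max(1.0, self.zoom * (1 - 0.5 * dt))  # zoom out at periphery
        elif stable:
            self.zoom *= 1 + 0.5 * dt                         # zoom in on steady gaze
        self.last = (gx, gy)
```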
Thursday, July 10, 2008
Eye Gaze Interactive Air Traffic Controllers workstation (P. Esser & T.J.J. Bos, 2007)
Summary of the thesis "Eye Gaze Interactive ATC workstation"
"Ongoing research is devoted to finding ways to improve performance and reduce workload of Air Traffic Controllers (ATCos) because their task is critical to the safe and efficient flow of air traffic. A new intuitive input method, known as eye gaze interaction, was expected to reduce the work- and task load imposed on the controllers by facilitating the interaction between the human and the ATC workstation. In turn, this may improve performance because the freed mental resources can be devoted to more critical aspects of the job, such as strategic planning. The objective of this Master thesis research was to explore how human computer interaction (HCI) in the ATC task can be improved using eye gaze input techniques and whether this will reduce workload for ATCos.
In conclusion, the results of eye gaze interaction are very promising for the selection of aircraft on a radar screen. For entering instructions it was less advantageous. This is explained by the fact that in the first task the interaction is more intuitive, while the latter is more of a conscious selection task. For application in work environments with large displays or multiple displays, eye gaze interaction is considered very promising."
Download paper as pdf
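Part of why gaze selection works so well on a radar screen is presumably that aircraft are sparse targets, so the system can snap the selection to the aircraft nearest the gaze point, with a tolerance that absorbs tracker inaccuracy. A hypothetical sketch of such picking logic (not taken from the thesis):

```python
# Hypothetical gaze-based aircraft picking on a radar display (not from the thesis):
# snap the selection to the aircraft nearest the gaze point, within a tolerance
# large enough to absorb eye tracker error (roughly 0.5-1 degree of visual angle).
import math

def pick_aircraft(gaze_x, gaze_y, aircraft, max_dist_px=60):
    """aircraft: iterable of (callsign, x, y) screen positions.
    Returns the callsign nearest the gaze point, or None if nothing is close."""
    best, best_d = None, max_dist_px
    for callsign, x, y in aircraft:
        d = math.hypot(x - gaze_x, y - gaze_y)
        if d < best_d:
            best, best_d = callsign, d
    return best

# Example: gaze lands near the first target, so "KLM1234" is selected.
print(pick_aircraft(410, 295, [("KLM1234", 400, 300), ("BAW77", 700, 120)]))
```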
Tuesday, April 15, 2008
Alea Technologies
A clear use case for their technology is users with disabilities such as ALS. The software contains a "desktop" system which acts as a launcher for other applications (native Windows applications, Grid, COGAIN). In general, they seem to be targeting Tobii Technology, which has been very successful with its MyTobii application running on the P10 eye tracker. The game is on.
Quote:
Tracking technology | Hybrid infrared video eye and head tracking; binocular and monocular tracking
Working volume (centered at 600 mm distance) | 300 x 200 x 200 mm³ [W x H x D]
Accuracy, static | 0.5° typical
Accuracy, over full working volume | 1° typical
Sampling rate | 50 Hz
Max. head-movement velocity | 15 cm/s
Recovery time after tracking loss (head moved too fast or out of range) | 40 ms
System dimensions | ca. 300 x 45 x 80 mm³ [W x H x D]
Mounting options | on monitor via VESA adapter; on tablet PC via customized interfaces
System weight | ca. 1.2 kg
Friday, March 7, 2008
Technology: Consumer-grade EEG devices
The company OCZ Technology, mainly known for its computer components such as memory and power supplies, has announced the "Neural Impulse Actuator" (NIA). While the technology itself is nothing new, the novelty lies in the accessibility of the device, priced somewhere around $200-250 when it is introduced next week.
Check out the quick mini-demo by the guys at AnandTech from the CeBIT exhibition in Hannover, 2008.
This technical presentation (in German) goes into a bit more detail.
From the press release:
"Recently entering mass production, the final edition of the Neural Impulse Actuator (NIA) will be on display for users to try out the new and electrifying way of playing PC games. The final version of the NIA uses a sleek, metal housing, a USB 2.0 interface, a streamlined headband with carbon interface sensors, and user-friendly software. The NIA is the first commercially available product of its kind, and gamers around the world now have access to this forward-thinking technology that’s had the industry buzzing since its inception."
I've tried research-grade EEG devices as a means of interaction while at the University of California, San Diego, and pure thoughts of actions are hard to detect in a stable manner. It is well known in the neuroscience community that thinking of an action activates the same regions in the brain as actually performing it. We even have mirror neurons that are activated by observing other people performing goal-directed actions (picking up that banana). The neural activation of pure thought alone is subtle and hard to detect compared to that of actual movements, such as lifting one's arm. So, I do not expect it to be an actual Brain-Computer Interface (BCI) capable of detecting thoughts (i.e., thinking of kicking, etc.) but more a detector of subtle motions in my forehead muscles (eye and eyebrow movements, facial expressions, etc.)
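To illustrate the point: facial EMG is both larger in amplitude and concentrated at higher frequencies than scalp EEG, so even a crude band-power comparison will flag muscle events. The sample rate, band edges and threshold below are illustrative assumptions on my part, not NIA specifics:

```python
# Rough illustration of why muscle artifacts dominate a forehead sensor: facial EMG
# carries most of its power above ~20 Hz and is far larger in amplitude than scalp
# EEG, so band-splitting the raw signal makes "muscle events" easy to threshold.
# All constants here are illustrative assumptions, not measured NIA parameters.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 512  # sample rate in Hz (assumed)

def bandpower(signal, lo, hi, fs=FS):
    """Mean power of the signal within the [lo, hi] Hz band."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    return np.mean(filtered ** 2)

def looks_like_muscle(signal):
    emg = bandpower(signal, 20.0, 120.0)  # facial EMG band (assumed)
    eeg = bandpower(signal, 1.0, 20.0)    # bulk of scalp EEG power
    return emg > 2.0 * eeg                # crude threshold, illustrative only
```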
The firm Emotiv has its own version of a consumer-grade EEG, named the EPOC neuroheadset. It has been around a little longer and seems to have more developed software, but still mainly demonstration applications.