Monday, July 21, 2008
SMI Experiment Suite 360
Tuesday, July 15, 2008
Sebastian Hillaire at IRISA Rennes, France
Automatic, Real-Time, Depth-of-Field Blur Effect for First-Person Navigation in Virtual Environment (2008)
"We studied the use of visual blur effects for first-person navigation in virtual environments. First, we introduce new techniques to improve real-time Depth-of-Field blur rendering: a novel blur computation based on the GPU, an auto-focus zone to automatically compute the user’s focal distance without an eye-tracking system, and a temporal filtering that simulates the accommodation phenomenon. Secondly, using an eye-tracking system, we analyzed users’ focus point during first-person navigation in order to set the parameters of our algorithm. Lastly, we report on an experiment conducted to study the influence of our blur effects on performance and subjective preference of first-person shooter gamers. Our results suggest that our blur effects could improve fun or realism of rendering, making them suitable for video gamers, depending however on their level of expertise."
- Sébastien Hillaire, Anatole Lécuyer, Rémi Cozot, Géry Casiez
Automatic, Real-Time, Depth-of-Field Blur Effect for First-Person Navigation in Virtual Environment. To appear in IEEE Computer Graphics and Applications (CG&A), 2008, pp. ??-??. Source code (please refer to my IEEE VR 2008 publication)
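The three ingredients named in the abstract (GPU-based blur, an auto-focus zone, and temporal filtering of the focal distance to simulate accommodation) translate naturally into a small amount of per-frame logic. The Python sketch below is only an illustration of that idea: the zone radius, time constant, and blur mapping are assumed values, not the authors' implementation (their blur runs on the GPU).

```python
import numpy as np

def autofocus_distance(depth_map, center, zone_radius=20):
    """Estimate the focal distance from the depths inside a small
    screen-space 'auto-focus zone', so no eye tracker is needed
    (the zone size here is an assumed value)."""
    cy, cx = center
    zone = depth_map[max(cy - zone_radius, 0):cy + zone_radius,
                     max(cx - zone_radius, 0):cx + zone_radius]
    return float(np.median(zone))  # median is robust to depth outliers

def accommodate(prev_focus, target_focus, dt, time_constant=0.3):
    """First-order low-pass filter: the simulated focal distance
    drifts toward the new target rather than jumping, mimicking
    the delay of human accommodation (time constant assumed)."""
    alpha = 1.0 - np.exp(-dt / time_constant)
    return prev_focus + alpha * (target_focus - prev_focus)

def blur_radius(depth, focus, gain=0.5, max_radius=8.0):
    """Map |depth - focus| to a blur radius in pixels, a simple
    game-style stand-in for a true circle-of-confusion model."""
    return min(gain * abs(depth - focus), max_radius)
```

Each frame, a renderer would call autofocus_distance on the current depth buffer, smooth the result with accommodate, and blur each pixel according to blur_radius.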
"We describes the use of user’s focus point to improve some visual effects in virtual environments (VE). First, we describe how to retrieve user’s focus point in the 3D VE using an eye-tracking system. Then, we propose the adaptation of two rendering techniques which aim at improving users’ sensations during first-person navigation in VE using his/her focus point: (1) a camera motion which simulates eyes movement when walking, i.e., corresponding to vestibulo-ocular and vestibulocollic reflexes when the eyes compensate body and head movements in order to maintain gaze on a specific target, and (2) a Depth-of-Field (DoF) blur effect which simulates the fact that humans perceive sharp objects only within some range of distances around the focal distance.
Second, we describe the results of an experiment conducted to study users' subjective preferences concerning these visual effects during first-person navigation in the VE. It showed that participants globally preferred the use of these effects when they were dynamically adapted to the focus point in the VE. Taken together, our results suggest that visual effects exploiting the user's focus point could be used in several VR applications involving first-person navigation, such as visits of architectural sites, training simulations, video games, etc."
Sébastien Hillaire, Anatole Lécuyer, Rémi Cozot, Géry Casiez
Using an Eye-Tracking System to Improve Depth-of-Field Blur Effects and Camera Motions in Virtual Environments. Proceedings of IEEE Virtual Reality (VR) Reno, Nevada, USA, 2008, pp. 47-51. Download paper as PDF.
QuakeIII DoF&Cam sources (depth-of-field, auto-focus zone and camera motion algorithms are under GPL with APP protection)
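Both papers hinge on knowing where in the 3D scene the user is looking. Given a 2D gaze sample from the eye tracker, a common way to recover the 3D focus point (and hence the focal distance for the DoF blur) is to sample the depth buffer at the gaze position and unproject. The sketch below shows that generic technique; it is not taken from the released sources.

```python
import numpy as np

def focus_point_from_gaze(gaze_xy, depth_buffer, inv_view_proj, viewport):
    """Unproject a 2D gaze sample into a 3D focus point using the
    depth buffer and the inverse view-projection matrix (a generic
    technique; the paper's implementation may differ)."""
    x, y = gaze_xy                    # gaze in pixel coordinates
    w, h = viewport
    z = depth_buffer[int(y), int(x)]  # depth in [0, 1]
    # Pixel coordinates -> normalized device coordinates in [-1, 1].
    ndc = np.array([2.0 * x / w - 1.0,
                    1.0 - 2.0 * y / h,  # flip y: screen to NDC
                    2.0 * z - 1.0,
                    1.0])
    world = inv_view_proj @ ndc
    return world[:3] / world[3]       # perspective divide

# The focal distance driving the DoF effect is then the distance
# from the camera position to this point.
```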
Passive eye tracking while playing Civilization IV
Thursday, July 10, 2008
Eye Gaze Interactive Air Traffic Controllers workstation (P. Esser & T.J.J. Bos, 2007)
Summary of the thesis "Eye Gaze Interactive ATC workstation"
"Ongoing research is devoted to finding ways to improve performance and reduce workload of Air Traffic Controllers (ATCos) because their task is critical to the safe and efficient flow of air traffic. A new intuitive input method, known as eye gaze interaction, was expected to reduce the work- and task load imposed on the controllers by facilitating the interaction between the human and the ATC workstation. In turn, this may improve performance because the freed mental resources can be devoted to more critical aspects of the job, such as strategic planning. The objective of this Master thesis research was to explore how human computer interaction (HCI) in the ATC task can be improved using eye gaze input techniques and whether this will reduce workload for ATCos.
In conclusion, the results of eye gaze interaction are very promising for selection of aircraft on a radar screen. For entering instructions it was less advantageous. This is explained by the fact that the first task is a more intuitive interaction, while the latter is more of a conscious selection task. For application in work environments with large or multiple displays, eye gaze interaction is considered very promising."
Download paper as pdf
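The thesis's strongest result, gaze selection of aircraft on the radar screen, is easy to picture as an algorithm: snap the noisy gaze point to the nearest aircraft symbol within some radius and require a short dwell before committing. The sketch below is my own illustration with assumed thresholds, not the interaction design tested in the thesis.

```python
import math
import time

DWELL_SECONDS = 0.5    # assumed dwell threshold
SNAP_RADIUS_PX = 40    # assumed capture radius around a symbol

class GazeSelector:
    """Select the aircraft the gaze rests on: snap to the nearest
    on-screen symbol within SNAP_RADIUS_PX, then require the gaze
    to dwell on it for DWELL_SECONDS before committing."""

    def __init__(self):
        self.candidate = None
        self.dwell_start = 0.0

    def update(self, gaze_xy, aircraft_positions):
        # aircraft_positions: dict of callsign -> (x, y) in pixels
        nearest, best = None, SNAP_RADIUS_PX
        for callsign, (x, y) in aircraft_positions.items():
            d = math.hypot(gaze_xy[0] - x, gaze_xy[1] - y)
            if d < best:
                nearest, best = callsign, d
        now = time.monotonic()
        if nearest != self.candidate:   # gaze moved to a new target
            self.candidate, self.dwell_start = nearest, now
            return None
        if nearest and now - self.dwell_start >= DWELL_SECONDS:
            self.dwell_start = now      # avoid repeated triggers
            return nearest              # committed selection
        return None
```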
Wednesday, July 9, 2008
GazeTalk 5
Information about the GazeTalk 5 eye communication system
GazeTalk is a predictive text entry system that has a restricted on-screen keyboard with ambiguous layout for severely disabled people. The main reason for using such a keyboard layout is that it enables the use of an eye tracker with a low spatial resolution (e.g., a web-camera based eye tracker).
The goal of the GazeTalk project is to develop an eye-tracking based AAC system that supports several languages, facilitates fast text entry, and is both sufficiently feature-complete to be deployed as the primary AAC tool for users, yet sufficiently flexible and technically advanced to be used for research purposes. The system is designed for several target languages, initially Danish, English, Italian, German and Japanese.
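The large-button layout is the whole trick: with only a handful of big cells on screen, the tracker merely has to resolve which cell the gaze is in, so even a noisy web-camera tracker suffices. Here is a toy sketch of that mapping; the 3x4 grid is an assumption for illustration, not GazeTalk's exact layout.

```python
def gaze_to_cell(gaze_xy, screen_wh, rows=3, cols=4):
    """Map a noisy gaze sample to one of rows*cols big on-screen
    buttons. With cells this large, a tracker error of a degree
    or two usually still lands in the intended cell."""
    x, y = gaze_xy
    w, h = screen_wh
    col = min(max(int(x / w * cols), 0), cols - 1)
    row = min(max(int(y / h * rows), 0), rows - 1)
    return row, col

# Example: on a 1280x1024 screen each cell is 320x341 px, so a
# 100 px gaze error rarely changes the selected button.
```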
Main features
- type-to-talk
- writing
- web browser
- multimedia player
- PDF reader
- letter and word prediction, and word completion
- speech output
- can be operated by gaze, head tracking, mouse, joystick, or any other pointing device
- supports step-scanning (new!)
- supports users with low precision in their movements, or trackers with low accuracy
- allows the user to use Dasher inside GazeTalk and to transfer the text written in Dasher back to GazeTalk
GazeTalk 5.0 has been designed and developed by the Eye Gaze Interaction Group at the IT University of Copenhagen and the IT-Lab at the Royal School of Library and Information Science, Copenhagen.
Read more about GazeTalk or view the GazeTalk manual
Short manual on data recording in GazeTalk
GazeTalk Videos
- Introduction to GazeTalk and its features
- Watch the video "Introduction to GazeTalk and its features" (same as above) in Windows Media (wmv) format (7.5 MB).
- Watch a YouTube video on Gaze Interaction for People with ALS/MND.
- Watch another YouTube video on a Talk Given with the Eyes Only by Arne Lykke Larsen.
- Watch a YouTube video on ALS-Communication and GazeTalk, where the developer, Dr. John Paulin Hansen, explains how GazeTalk works (in Danish, with English subtitles). The video includes short clips of GazeTalk in use and a brief interview with John Paulin Hansen. The video was produced by www.synvision.dk (initiative: Birger Bergmann Jeppesen, bigerbj (at) webspeed (dot) dk); for more information, see www.als-communication.dk.
Eye tracking using a web camera
eye tracking from Hubert Wassner on Vimeo.
eye-tracking V2 from Hubert Wassner on Vimeo.
eye command from Hubert Wassner on Vimeo.
More information:
Hubert Wassner's blog (professor of computer science), in French / English (automatic translation)
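For the curious: webcam trackers like this typically locate the pupil as the darkest blob in the eye region. A bare-bones OpenCV sketch of that idea follows; the threshold is an assumed value, and this is not necessarily Wassner's method.

```python
import cv2

def pupil_centroid(eye_gray, dark_threshold=40):
    """Find the pupil as the largest dark blob in a grayscale eye
    image and return its centroid in pixels, or None if no blob
    is found (threshold is an assumed value)."""
    _, mask = cv2.threshold(eye_gray, dark_threshold, 255,
                            cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]
```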
Sunday, July 6, 2008
Eye tracking in space
"The working hypothesis is that in microgravity the orientation of Listings Plane is altered, probably to a small and individually variable degree. Further, with the loss of the otolith-mediated gravitational reference, it is expected that changes in the orientation of the coordinate framework of the vestibular system occur, and thus a divergence between Listing?s Plane and the vestibular coordinate frame should be observed. While earlier ground-based experiments indicate that Listing?s Plane itself is to a small degree dependent on the pitch orientation to gravity, there is more compelling evidence of an alteration of the orientation of the vestibulo-ocular reflex (VOR), reflex eye movement that stabilizes images on the retina during head movement by producing an eye movement in the direction opposite to head movement, thus preserving the image on the center of the visual field, in microgravity.
Furthermore, changes in bodily function with relation to eye movement and spatial orientation that occur during prolonged disturbance of the vestibular system most likely play a major role in the problems with balance that astronauts experience following re-entry from space.
In view of the much larger living and working space in the ISS, and the extended program of spacewalks (EVAs) being planned, particular care must be given to assessing the reliability of functions related to eye movement and spatial orientation.
The performance of the experiments in space is therefore of interest for the expected contribution to basic research knowledge and to the improvement and assurance of human performance under weightless conditions."
NASA Image: ISS011E13710 - Cosmonaut Sergei K. Krikalev, Expedition 11 Commander representing Russia's Federal Space Agency, uses the Eye Tracking Device (ETD), a European Space Agency (ESA) payload in the Zvezda Service Module of the International Space Station. The ETD measures eye and head movements in space with great accuracy and precision.
"The ETD consists of a headset that includes two digital camera modules for binocular recording of horizontal, vertical and rotational eye movements and sensors to measure head movement. The second ETD component is a laptop PC, which permits digital storage of all image sequences and data for subsequent laboratory analysis. Listing's Plane can be examined fairly simply, provided accurate three-dimensional eye-in-head measurements can be made. Identical experimental protocols will be performed during the pre-flight, in-flight and post-flight periods of the mission. Accurate three-dimensional eye-in-head measurements are essential to the success of this experiment. The required measurement specifications (less than 0.1 degrees spatial resolution, 200 Hz sampling frequency) are fulfilled by the Eye Tracking Device (ETD)."
More information:
http://www.nasa.gov/mission_pages/station/science/experiments/ETD.html
http://www.spaceflight.esa.int/delta/
http://www.energia.ru/eng/iss/researches/medic-65.html
Realtime computer interaction via eye tracking (Dubey, 2004)
Abstract
"This thesis presents a computer vision-based eye tracking system for human computer interaction. The eye tracking system allows the user to indicate a region of interest in a large data space and to magnify that area, without using traditional pointer devices. Presented is an iris tracking algorithm adapted from Camshift; an algorithm originally designed for face or hand tracking. Although the iris is much smaller and highly dynamic. the modified Camshift algorithm efficiently tracks the iris in real-time. Also presented is a method to map the iris centroid, in video coordinates to screen coordinates; and two novel calibration techniques, four point and one-point calibration. Results presented show that the accuracy for the proposed one-point calibration technique exceeds the accuracy obtained from calibrating with four points. The innovation behind the one-point calibration comes from using observed eye scanning behaviour to constrain the calibration process. Lastly, the thesis proposes a non-linear visualisation as an eye-tracking application, along with an implementation."
Download paper as PDF.
Thursday, July 3, 2008
Low cost eye tracking
Considering the low resolution of the camera and the simplicity of the setup, the results are noteworthy. Hope to see more development of this!
Wednesday, July 2, 2008
Hot Zone prototype (Wakaruru)
Submitted to YouTube by "Wakaruru"