Hidenori Watanave at Tokyo Metropolitan University has released a brief video demonstrating gaze interaction for the Laval VRchive, a virtual reality environment for navigating media content. The setup uses a standalone Tobii 1750 tracker and a projector. The simple interface allows gaze control by looking at the top, bottom, left, or right areas of the display, while winking performs clicks. Although the initial experiments with this early version were successful, the software is unstable and needs further improvement.
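The video does not spell out how the VRchive maps gaze to the four regions, but the general idea is simple to sketch. Below is a generic illustration in Python; the 25% margin, the function name, and the region labels are assumptions for illustration, not details from the actual system:

```python
def gaze_region(x, y, width, height, margin=0.25):
    """Map a gaze point (x, y) on a display of the given size to a
    named region. The outer `margin` fraction of the screen on each
    side acts as an active zone; everything else is 'center'.
    Top/bottom are checked before left/right, so corners resolve to
    the vertical zones."""
    if y < height * margin:
        return "top"
    if y > height * (1 - margin):
        return "bottom"
    if x < width * margin:
        return "left"
    if x > width * (1 - margin):
        return "right"
    return "center"
```

An application would poll the tracker's current gaze point each frame and trigger navigation when the region stays stable for a dwell period.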
Sunday, May 3, 2009
Friday, May 1, 2009
Low-Cost Gaze Pointing and EMG Clicking
"Some severely disabled people are excluded from using gaze interaction because gaze trackers are usually expensive (above $10.000). In this paper we present a low-cost gaze pointer, which we have tested in combination with a desktop monitor and a wearable display. It is not as accurate as commercial gaze trackers, and walking while pointing with gaze on a wearable display turned out to be particularly difficult. However, in front of a desktop monitor it is precise enough to support communication. Supplemented with a commercial EMG switch it offers a complete hands-free, gaze-and-click control for less than $200."
- San Agustin, J., Hansen, J. P., Hansen, D. W., and Skovsgaard, H. 2009. Low-cost gaze pointing and EMG clicking. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 3247-3252. DOI= http://doi.acm.org/10.1145/1520340.1520466
Gaze Controlled Driving
This is the paper on using eye trackers for remote robot navigation that I had accepted for the CHI09 conference. It has now appeared on the ACM website. Note that the webcam tracker referred to in the paper is an earlier incarnation of the ITU Gaze Tracker. The main issue when using it is that head movements affect the gaze position and create an offset. This is easier to correct and counterbalance on a static background than on a moving image (while driving!)
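One common way to counterbalance a constant offset of this kind is a one-point correction: have the user fixate a point with known screen coordinates, record the difference between measured and true position, and subtract it from subsequent samples. The sketch below is a generic illustration of that idea, not the paper's actual correction method:

```python
class OffsetCorrector:
    """One-point drift correction for a gaze tracker. After the user
    fixates a known on-screen target, the measured-vs-true difference
    is stored and subtracted from all later gaze samples."""

    def __init__(self):
        self.dx = 0.0
        self.dy = 0.0

    def recalibrate(self, measured, known):
        # Record the current offset from a fixation on a known target.
        self.dx = measured[0] - known[0]
        self.dy = measured[1] - known[1]

    def correct(self, gaze):
        # Apply the stored correction to a raw gaze sample.
        return (gaze[0] - self.dx, gaze[1] - self.dy)
```

On a static background this recalibration can be done whenever a known target is fixated; on a moving scene view there is no stable reference, which is what makes the driving case harder.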
Abstract
"We investigate if the gaze (point of regard) can control a remote vehicle driving on a racing track. Five different input devices (on-screen buttons, mouse-pointing, low-cost webcam eye tracker and two commercial eye tracking systems) provide heading and speed control on the scene view transmitted from the moving robot. Gaze control was found to be similar to mouse control. This suggests that robots and wheelchairs may be controlled "hands-free" through gaze. Low precision gaze tracking and image transmission delays had noticeable effect on performance."
- Tall, M., Alapetite, A., San Agustin, J., Skovsgaard, H. H., Hansen, J. P., Hansen, D. W., and Møllenbach, E. 2009. Gaze-controlled driving. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4387-4392. DOI= http://doi.acm.org/10.1145/1520340.1520671
Labels: gazetracker, ITU, navigation, robot
Monday, April 27, 2009
ITU Gaze Tracker: Low-cost gaze interaction: ready to deliver the promises (San Agustin, J et al., 2009)
The research paper on the ITU Gaze Tracker that Javier San Agustin presented at CHI09 is now available on the ACM website. It evaluates a previous version of the gaze tracker in two tasks, target acquisition and eye typing, comparing it with a mouse, the SMI iView X RED, and the Tobii 1750.
Abstract
"Eye movements are the only means of communication for some severely disabled people. However, the high prices of commercial eye tracking systems limit the access to this technology. In this pilot study we compare the performance of a low-cost, web cam-based gaze tracker that we have developed with two commercial trackers in two different tasks: target acquisition and eye typing. From analyses on throughput, words per minute and error rates we conclude that a low-cost solution can be as efficient as expensive commercial systems."
- San Agustin, J., Skovsgaard, H., Hansen, J. P., and Hansen, D. W. 2009. Low-cost gaze interaction: ready to deliver the promises. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4453-4458.
Download at ACM website.
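The metrics named in the abstract are standard in this literature. As a minimal Python sketch, assuming the usual five-characters-per-word convention for typing speed and the Shannon formulation of Fitts' index of difficulty for pointing throughput (the paper's exact computation may differ):

```python
import math

def words_per_minute(transcribed, seconds):
    """Typing speed, counting one 'word' as five characters
    (spaces included), over the given trial duration."""
    return (len(transcribed) / 5.0) / (seconds / 60.0)

def fitts_throughput(distance, width, movement_time):
    """Pointing throughput in bits/s: index of difficulty
    ID = log2(D/W + 1) divided by movement time in seconds."""
    return math.log2(distance / width + 1) / movement_time
```

Error rates are then simply the fraction of incorrect characters (typing) or missed selections (target acquisition).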
Labels: gazetracker, ITU, typing
A brief users guide to the ITU Gaze Tracker
Today we release a short user's guide for the open source eye tracker we presented some weeks ago. Hopefully it will help first-time users configure the software and understand the limitations of the initial version. Comments and suggestions are appreciated.
- Download the PDF document (0.6 MB)
Labels: eye tracker, gazetracker, ITU, low cost
Friday, April 17, 2009
IDG Interview with Javier San Agustin
During CHI09 in Boston last week, Nick Barber from the IDG Network stopped by to record an interview with Javier San Agustin, a member of the ITU GazeGroup. The video has now surfaced on several IDG sites around the world; clearly there is interest in easy-to-use, low-cost eye tracking. After the initial release of the ITU Gaze Tracker we have set up a community forum at forum.gazegroup.org, with the ambition of connecting users of open source eye tracking. If you would like to be part of the project, please join in promoting and developing an alternative. It's open and accessible to all (platform documentation to be released next week).
Hopefully, ideas and contributions from the community will make the platform take off. We consider the initial release a beta version, so there are of course improvements to be made: additional cameras need to be verified and bugs in the code fixed.
If you experience any issues or have ideas for improvements, please post at http://forum.gazegroup.org
Labels: eye tracker, gazetracker, ITU, low cost, open source
Sunday, April 5, 2009
Introducing the ITU GazeTracker
The ITU Gaze Tracker is an open-source eye gaze tracking application that aims to provide a low-cost alternative to commercial gaze tracking systems and thereby make the technology more accessible. It is being developed by the Gaze Group at the IT University of Copenhagen, supported by the Communication by Gaze Interaction Association (COGAIN). The eye tracking software is video-based, and any camera equipped with infrared night vision can be used, such as a video camera or a webcam. The cameras that have been tested with the system can be found in our forum.
Features:
- Supports head-mounted and remote setups
- Tracks both pupil and glints
- Supports a wide variety of camera devices
- Configurable calibration
- Eye-mouse capabilities
- UDP server broadcasting gaze data
- Full source code provided
We encourage users and developers to test our software with their cameras and provide feedback so we can continue development. The ITU Gaze Tracker is released under the GPLv3 open source license, and the full source code is hosted on SourceForge. It's written in C# using the Emgu CV wrapper for the OpenCV C++ image processing library (Microsoft .NET 3.5 required). Once the tracker has been started, it can be configured to broadcast gaze data via the UDP protocol, which makes it easy to pick up in your own applications. We provide a sample implementation of a client in C#.
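A minimal receiver for such a UDP stream can be written in a few lines. The sketch below is in Python rather than C#, and the port number and the "timestamp x y" datagram format are illustrative assumptions, not the tracker's documented protocol; consult the C# sample client and the platform documentation for the real format:

```python
import socket

# Assumed values for illustration; the real port and message layout
# are defined by the ITU Gaze Tracker's UDP server configuration.
PORT = 6666

def parse_gaze(message):
    """Parse an assumed whitespace-separated 'timestamp x y' datagram."""
    ts, x, y = message.split()
    return int(ts), float(x), float(y)

def listen(port=PORT):
    """Yield (timestamp, x, y) tuples from incoming UDP datagrams."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        yield parse_gaze(data.decode("ascii"))
```

Because the data arrives as plain datagrams, any language with a UDP socket API can consume the stream the same way.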
Open source eye tracking has never been easier. Download the binaries, plug the camera and launch the application. Adjust the sliders to match your camera and start the calibration.
Visit the ITU GazeGroup to download the software package. Please get in touch with us at http://forum.gazegroup.org
Labels: eye tracker, gazetracker, low cost, open source
Tuesday, March 31, 2009
Radio interview with DR1
Thomas Behrndtz from Danish Radio (DR1) came by the other day to do an interview about the upcoming ITU gaze interaction platform. It resulted in a five-minute episode on "Videnskaben kort", a radio program on interesting progress in science. Lately we have been working hard on the software package, which is to be released at CHI09 in Boston next week. It includes a number of applications and tools that will be released for free download, including source code under the GPL license. In short, these are exciting times for low-cost eye tracking and gaze interaction. Stay tuned.
Labels: gazetracker, ITU, low cost, open source
Thursday, March 12, 2009
The Argentinian myEye released
I have been following an interesting project taking place in Argentina over the last half year. Marcelo Laginestra has described the development of a low-cost webcam-based eye tracker on his blog. It has now been released for download, free of charge.
The system requirements are modest:
- CPU: 1.5 GHz or higher
- RAM: 256 MB DDR RAM or higher (512 MB recommended)
- Space: at least 100 MB of hard disk space
- Camera: 640x480 capture resolution or higher (at least 30 fps)
- OS: Microsoft Windows XP SP2
I am happy to see that the project came through; kudos for releasing it under a Creative Commons license.
Keep an eye open for the ITU gaze interaction platform that will be released in conjunction with CHI09 in early April.
Labels: eye tracker, low cost, prototype
Monday, February 16, 2009
ID-U Biometrics: Eye movement based access control
Daphna Palti-Wasserman and Yoram Wasserman at ID-U Biometrics have developed a system that provides secure signatures for access control based on individual eye movement patterns. The subject's response to a dynamic stimulus provides a unique characteristic. As the stimulus changes, the subject's responses will differ each time, but the pattern of eye movements and the user's eye characteristics remain the same. The result is a "code" that is not entered and not consciously controlled by the user, which reduces the risk of spoofing. Currently the system is in a proof-of-concept state; the 100% accurate and stable eye tracking that would be required for identification has yet to be achieved (by any eye tracking platform, that is). However, this method of user identification could be applied in situations other than the ATM (I guess that's why they won the GLOBES start-up competition).
- Website: http://idu-biometrics.com/
- Israel21c business startup magazine: "You'll never be me, with Israel's ID-U cyber identification"