
Wednesday, April 14, 2010

Open-source gaze tracker awarded Research Pearls of ITU Copenhagen

The open-source eye tracker ITU Gaze Tracker, primarily developed by Javier San Agustin, Henrik Skovsgaard and myself, has been awarded the Research Pearls of the IT University of Copenhagen. A presentation will be held at ITU on May 6th at 2pm. The software, released one year ago, has seen more than 5,000 downloads by students and hobbyists around the world. A new release is rapidly approaching, offering better performance and stability for remote tracking and many general bug fixes. The new version adds support for a whole range of new HD web cameras, which provide vastly improved image quality and finally bring hope for a low-cost, open, flexible and reasonably performing solution. The ambitious goal is to make eye tracking technology available to everyone, regardless of available resources. Follow the developments at the forum. Additional information is available at the ITU Gaze Group.

"The Open-Source ITU Gaze Tracker"

Abstract:
Gaze tracking offers users the possibility of interacting with a computer by just using eye movements, thereby making them more independent. However, some people (for example, users with a severe disability) are excluded from access to gaze interaction due to the high prices of commercial systems (above €10,000). Gaze tracking systems built from low-cost and off-the-shelf components have the potential to facilitate access to the technology and bring prices down.

The ITU Gaze Tracker is an off-the-shelf system that uses an inexpensive webcam or a video camera to track the user's eye. It is free and open source, offering users the possibility of trying out gaze interaction technology for a cost as low as €20, and of adapting and extending the software to suit specific needs.

In this talk we will present the open-source ITU Gaze Tracker and show the different scenarios in which the system has been used and evaluated.

Monday, March 29, 2010

Low-cost eye tracking and pong gaming from Imperial College London

A group of students at Imperial College London has developed a low-cost head-mounted eye tracker, which they use to play Pong. The work is carried out under the supervision of Aldo Faisal in his lab.

"
We built an eyetracking system using mass-marketed off-the shelf components at 1/1000 of that cost, i.e. for less then 30 GBP. Once we made such a system that cheap we started thinking of it as a user interface for everyday use for impaired people.. The project was enable by realising that certain mass-marketed web cameras for video game consoles offer impressive performance approaching that of much more expensive research grade cameras.



"From this starting point research in our group has focussed on two parts so far:


1. The TED software, which is composed of two components that can run on two different computers (connected over a wireless network) or on the same machine. The first component is the TED server (Linux-based), which interfaces directly with the cameras, processes the high-speed video feed and makes the data available (over the network) to the client software. The client forms the second component; it is written in Java (i.e. it runs on any computer: Windows, Mac, Unix, ...) and provides mouse control via eye movements, the “Pong” video game, and the configuration and calibration functions.

This two-part solution allows the cameras to be connected to a cost-effective netbook (e.g. on a wheelchair) while controlling other computers over the network (e.g. in the living room, office or kitchen). This software suite, as well as parts of the low-level camera driver, was implemented by Ian Beer, Aaron Berk, Oliver Rogers and Timothy Treglown for their undergraduate project in the lab.

Note: the “Pong” video game has a two-player mode, allowing two people to play against each other using two eye trackers, or eye tracker versus keyboard. It is very easy to use: just look where you want the pong paddle to move...

2. The camera-spectacles (visible in most press photos), as well as two-camera software (Windows-based) able to track eye movements in 3D (i.e. direction and distance) for wheelchair control. These have been built and developed by William Abbott (Dept. of Bioengineering)."
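For the curious, the server half of such a two-part design is easy to picture. Below is a minimal sketch in C# (used here only for consistency with the other code on this blog; the real TED server is Linux-based and its client is Java). The target address, port, sample rate and message format are all invented for illustration; only the structure, a camera-side process pushing gaze samples over the network to a remote user interface, reflects the description above.

```csharp
using System;
using System.Net.Sockets;
using System.Text;
using System.Threading;

// Sketch of the server half of a two-part design like TED's: the machine
// attached to the cameras processes frames and pushes gaze samples to a
// (possibly remote) client. Host, port and message format are assumptions.
class GazeStreamer
{
    static void Main()
    {
        using (var udp = new UdpClient())
        {
            while (true)
            {
                // In a real server these would come from the camera pipeline.
                double x = 512, y = 384;
                byte[] msg = Encoding.ASCII.GetBytes($"{x} {y}");
                udp.Send(msg, msg.Length, "192.168.0.10", 6666);
                Thread.Sleep(16); // roughly 60 samples per second
            }
        }
    }
}
```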

Further reading:

Imperial College London press release: Playing “Pong” with the blink of an eye
The Engineer: Eye-movement game targets disabled
Engadget (German): Neurotechnologie: Pong mit Augenblinzeln gespielt in London (in English: Neurotechnology: Pong played with eye blinks in London)

Friday, March 19, 2010

In the Eye of the Beholder: A Survey of Models for Eyes and Gaze (Hansen & Ji, 2010)

The following paper by Dan Witzner Hansen of ITU Copenhagen and Qiang Ji of Rensselaer Polytechnic Institute surveys and summarizes most existing eye tracking methods, explaining how they operate and what the pros and cons of each are. It is one of the most comprehensive publications I've seen on the topic and a delight to read. It was featured in the March 2010 issue of IEEE Transactions on Pattern Analysis and Machine Intelligence.

Abstract
"Despite active research and significant progress in the last 30 years, eye detection and tracking remains challenging due to the individuality of eyes, occlusion, variability in scale, location, and light conditions. Data on eye location and details of eye movements have numerous applications and are essential in face detection, biometric identification, and particular human-computer interaction tasks. This paper reviews current progress and state of the art in video-based eye detection and tracking in order to identify promising techniques as well as issues to be further addressed. We present a detailed review of recent eye models and techniques for eye detection and tracking. We also survey methods for gaze estimation and compare them based on their geometric properties and reported accuracies. This review shows that, despite their apparent simplicity, the development of a general eye detection technique involves addressing many challenges, requires further theoretical developments, and is consequently of interest to many other domains problems in computer vision and beyond."

Comparison of gaze estimation methods with respective prerequisites and reported accuracies


Eye Detection models

  • Dan Witzner Hansen, Qiang Ji, "In the Eye of the Beholder: A Survey of Models for Eyes and Gaze," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 3, pp. 478-500, March 2010, doi:10.1109/TPAMI.2009.30. Download as PDF.

Friday, January 8, 2010

Mobile Dias Eye Tracker

Remember the Dias Eye Tracker that I wrote about last May? Today Diako Mardanbeigi, from Tehran, Iran, presents a new version of the Dias eye tracker that is low-cost, wireless and fully mobile. I'll let the video demonstration below speak for itself. Rumor has it that Diako has been in contact with the ITU GazeGroup about a potential continuation of his research. Time will tell.



"This is a low cost mobile eye tracker with a wireless and Light weight head mounted hardware. This system gathers eye movements and estimates the point of gaze during the performance of daily tasks. It can let you to assess the visual behavior of the person online and in real-time when he is doing a specific task. A mobile eye tracker has a wide variety of applications in several fields such as human factors, market research, consumer shopping behavior, sports, driving, reading, safety & training. "

Friday, December 11, 2009

PhD Defense: Off-the-Shelf Gaze Interaction

Javier San Agustin will defend his PhD thesis on "Off-the-Shelf Gaze Interaction" at the IT University of Copenhagen on the 8th of January from 13.00 to (at most) 17.00. The program for the event consists of a one-hour presentation, followed by a discussion with the committee, formed by Andrew Duchowski, Bjarne Kjær Ersbøll, and Arne John Glenstrup. Afterwards, a traditional reception with snacks and drinks will be held.

Update: The thesis is now available as PDF, 179 pages, 3.6MB.

Abstract of the thesis:


People with severe motor-skill disabilities are often unable to use standard input devices such as a mouse or a keyboard to control a computer, and they are therefore in strong need of alternative input devices. Gaze tracking offers them the possibility of using the movements of their eyes to interact with a computer, thereby making them more independent. A big effort has been put toward improving the robustness and accuracy of the technology, and many commercial systems are nowadays available on the market.

Despite the great improvements that gaze tracking systems have undergone in recent years, high prices have prevented gaze interaction from becoming mainstream. The use of specialized hardware, such as industrial cameras or infrared light sources, increases the accuracy of the systems, but also the price, which prevents many potential users from having access to the technology. Furthermore, the different components are often required to be placed in specific locations, or are built into the monitor, thus decreasing the flexibility of the setup.

Gaze tracking systems built from low-cost and off-the-shelf components have the potential to facilitate access to the technology and bring the prices down. Such systems are often more flexible, as the components can be placed in different locations, but also less robust, due to the lack of control over the hardware setup and the lower quality of the components compared to commercial systems.

The work developed for this thesis deals with some of the challenges introduced by the use of low-cost and off-the-shelf components for gaze interaction. The main contributions are:
  • Development and performance evaluation of the ITU Gaze Tracker, an off-the-shelf gaze tracker that uses an inexpensive webcam or video camera to track the user's eye. The software is readily available as open source, offering the possibility to try out gaze interaction for a low price and to analyze, improve and extend the software by modifying the source code.
  • A novel gaze estimation method based on homographic mappings between planes. No knowledge about the hardware configuration is required, allowing for a flexible setup where camera and light sources can be placed at any location.
  • A novel algorithm to detect the type of movement that the eye is performing, i.e. fixation, saccade or smooth pursuit. The algorithm is based on eye velocity and movement pattern, and makes it possible to smooth the signal appropriately for each kind of movement, removing jitter due to noise while maximizing responsiveness.
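The third contribution is, at its core, a velocity-based classifier. As a rough illustration, and not the thesis' actual algorithm, the C# sketch below labels each gaze sample by comparing its instantaneous velocity against two thresholds; the threshold values are assumptions.

```csharp
using System;

// Minimal velocity-threshold classifier, loosely inspired by the
// fixation/saccade/smooth-pursuit detection described above. The thresholds
// are illustrative assumptions, not the values used in the actual tracker.
enum EyeMovement { Fixation, SmoothPursuit, Saccade }

static class VelocityClassifier
{
    const double SaccadeThreshold = 100.0; // deg/s, assumed
    const double PursuitThreshold = 5.0;   // deg/s, assumed

    // x, y: gaze direction in degrees, sampled at sampleRate Hz.
    public static EyeMovement[] Classify(double[] x, double[] y, double sampleRate)
    {
        var labels = new EyeMovement[x.Length];
        for (int i = 1; i < x.Length; i++)
        {
            double dx = x[i] - x[i - 1];
            double dy = y[i] - y[i - 1];
            double velocity = Math.Sqrt(dx * dx + dy * dy) * sampleRate; // deg/s
            labels[i] = velocity > SaccadeThreshold ? EyeMovement.Saccade
                      : velocity > PursuitThreshold ? EyeMovement.SmoothPursuit
                      : EyeMovement.Fixation;
        }
        if (labels.Length > 1) labels[0] = labels[1];
        return labels;
    }
}
```

A tracker can then smooth heavily during fixations and pass saccade samples through untouched, which is exactly the jitter-versus-responsiveness trade-off the abstract describes.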

Tuesday, November 24, 2009

Remote tracker and 6DOF using a webcam

The following video clips demonstrate a Master's thesis project from the AGH University of Science and Technology in Cracow, Poland. The method developed provides 6-degrees-of-freedom head tracking and 2D eye tracking using a simple, low-resolution 640x480 webcam. Under the hood it is based on Lucas-Kanade optical flow and POSIT. A great start, as the head tracking seems relatively stable. Imagine it with IR illumination, a camera with slightly higher resolution and a narrow-angle lens, plus pupil and glint tracking algorithms for calibrated gaze estimation.
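For readers curious about the Lucas-Kanade building block, the core of a single tracking step is compact: accumulate image gradients over a window around the tracked point and solve a 2x2 linear system for the displacement. The C# sketch below is illustrative only; real trackers like the one above add image pyramids, iteration and feature selection.

```csharp
using System;

// One Lucas-Kanade step for a single tracked point: estimate the optical
// flow (u, v) over a small window from image gradients. The window (and a
// one-pixel border for the gradients) must lie inside both frames.
static class LucasKanade
{
    // prev/next: grayscale frames as [y, x] intensity arrays.
    public static (double u, double v) Flow(float[,] prev, float[,] next,
                                            int cx, int cy, int half = 7)
    {
        double gxx = 0, gxy = 0, gyy = 0, bx = 0, by = 0;
        for (int y = cy - half; y <= cy + half; y++)
        for (int x = cx - half; x <= cx + half; x++)
        {
            // Central differences for spatial gradients, forward for temporal.
            double ix = (prev[y, x + 1] - prev[y, x - 1]) * 0.5;
            double iy = (prev[y + 1, x] - prev[y - 1, x]) * 0.5;
            double it = next[y, x] - prev[y, x];
            gxx += ix * ix; gxy += ix * iy; gyy += iy * iy;
            bx  -= ix * it; by  -= iy * it;
        }
        // Solve the 2x2 system G * [u v]^T = b.
        double det = gxx * gyy - gxy * gxy;
        if (Math.Abs(det) < 1e-9) return (0, 0); // untrackable (aperture problem)
        return ((gyy * bx - gxy * by) / det, (gxx * by - gxy * bx) / det);
    }
}
```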


Wednesday, October 21, 2009

Nokia near-eye display gaze interaction update

The Nokia near-eye gaze interaction platform that I tried in Finland last year has been further improved. The cap used to support the weight has been replaced with a sturdy frame, and the overall prototype seems lighter and now incorporates headphones. The new gaze-based navigation interface supports photo browsing based on the Image Space application, allowing location-based access to user-generated content. See the video at the bottom for their futuristic concept. More details are available on the Nokia Research website. The prototype will be displayed at the International Symposium on Mixed and Augmented Reality (ISMAR) in Orlando, October 19-22.






Friday, September 18, 2009

The EyeWriter project

For some time I've been following the EyeWriter project, which aims at enabling Tony, who has ALS, to draw graffiti using eye gaze alone. The open-source eye tracker is available on Google Code and is based on C++, openFrameworks and OpenCV. The current version supports basic pupil tracking based on image thresholding and blob detection, but the team is aiming for remote tracking using IR glints. Keep up the great work, guys!
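Thresholding plus blob detection is simple enough to sketch. The toy C# version below (the real EyeWriter code is C++/openFrameworks) binarizes the dark pixels, flood-fills the largest connected component, and returns its centroid as the pupil estimate; the threshold value is an assumption.

```csharp
using System;
using System.Collections.Generic;

// Toy version of the pupil detector described above: threshold the image,
// pick the largest dark blob, return its centroid.
static class PupilFinder
{
    public static (double x, double y)? Find(byte[,] gray, byte threshold = 40)
    {
        int h = gray.GetLength(0), w = gray.GetLength(1);
        var visited = new bool[h, w];
        long bestSize = 0; double bestX = 0, bestY = 0;

        for (int sy = 0; sy < h; sy++)
        for (int sx = 0; sx < w; sx++)
        {
            if (visited[sy, sx] || gray[sy, sx] >= threshold) continue;
            // Flood-fill one dark blob, accumulating its centroid.
            long size = 0, sumX = 0, sumY = 0;
            var stack = new Stack<(int x, int y)>();
            stack.Push((sx, sy)); visited[sy, sx] = true;
            while (stack.Count > 0)
            {
                var (x, y) = stack.Pop();
                size++; sumX += x; sumY += y;
                foreach (var (nx, ny) in new[] { (x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1) })
                {
                    if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                    if (visited[ny, nx] || gray[ny, nx] >= threshold) continue;
                    visited[ny, nx] = true;
                    stack.Push((nx, ny));
                }
            }
            if (size > bestSize) { bestSize = size; bestX = (double)sumX / size; bestY = (double)sumY / size; }
        }
        if (bestSize == 0) return null; // no dark blob found
        return (bestX, bestY);
    }
}
```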

The Eyewriter from Evan Roth on Vimeo.

eyewriter tracking software walkthrough from thesystemis on Vimeo.

More information can be found at http://fffff.at/eyewriter/

Thursday, August 20, 2009

A geometric approach to remote eye tracking (Villanueva et al., 2009)

Came across this paper today; it is good news and a great achievement, especially since consumer products for recording high-definition video over a plain USB port have begun to appear. For example, the upcoming Microsoft LifeCam Cinema HD provides 1280x720 at 30 frames per second and is to be released on September 9th at a reasonable US$80. Hopefully it will allow a simple modification to remove the infrared-blocking filter. Things are looking better and better for low-cost eye tracking. Keep up the excellent work; it will make a huge difference for all of us.

Abstract
"This paper presents a principled analysis of various combinations of image features to determine their suitability for remote eye tracking. It begins by reviewing the basic theory underlying the connection between eye image and gaze direction. Then a set of approaches is proposed based on different combinations of well-known features and their behaviour is valuated, taking into account various additional criteria such as free head movement, and minimum hardware and calibration requirements. The paper proposes a final method based on multiple glints and the pupil centre; the method is evaluated experimentally. Future trends in eye tracking technology are also discussed."


The algorithms were implemented in C++ running on a Windows PC equipped with a Pentium 4 processor at 3 GHz and 1 GB of RAM. The camera of choice delivers 15 frames per second at 1280x1024. The optimal distance from the screen is 60 cm, which is rather typical for remote eye trackers. This provides a track-box volume of 20x20x20 cm, within which the algorithms produce an average accuracy of 1.57 degrees. An accuracy of 1 degree may be achieved if the head is in the same position as it was during calibration. Moving the head parallel to the monitor plane increases the error by 0.2-0.4 degrees, while moving closer or further away introduces a larger error of 1-1.5 degrees (mainly due to the camera's focus range). Note that no temporal filtering was used in the reported figures. All in all, these results are not far from what typical remote systems produce.
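The paper's final method is a calibrated geometric model and too involved to reproduce here. A much simpler relative of multi-glint gaze estimation, closer in spirit to the homography-based method from the ITU thesis mentioned above, maps the pupil centre through the homography defined by four glints (reflections of four lights placed at the screen corners). A C# sketch, assuming the glint and corner correspondences are given in matching order:

```csharp
using System;

// Homography-style gaze sketch: four IR lights at the screen corners produce
// four glints on the cornea; the homography mapping the glint quadrilateral
// to the screen corners is applied to the pupil centre to get an approximate
// point of regard. This is NOT the paper's calibrated geometric model.
static class GlintHomography
{
    // src: four glint positions (image), dst: four screen corners (pixels).
    public static double[] Estimate((double x, double y)[] src, (double x, double y)[] dst)
    {
        // Direct Linear Transform: 8 equations, 8 unknowns (h[8] fixed to 1).
        var a = new double[8, 9];
        for (int i = 0; i < 4; i++)
        {
            var (x, y) = src[i]; var (u, v) = dst[i];
            double[] r1 = { x, y, 1, 0, 0, 0, -u * x, -u * y, u };
            double[] r2 = { 0, 0, 0, x, y, 1, -v * x, -v * y, v };
            for (int j = 0; j < 9; j++) { a[2 * i, j] = r1[j]; a[2 * i + 1, j] = r2[j]; }
        }
        // Gauss-Jordan elimination with partial pivoting (column 8 is the RHS).
        for (int col = 0; col < 8; col++)
        {
            int pivot = col;
            for (int r = col + 1; r < 8; r++)
                if (Math.Abs(a[r, col]) > Math.Abs(a[pivot, col])) pivot = r;
            for (int j = 0; j < 9; j++) (a[col, j], a[pivot, j]) = (a[pivot, j], a[col, j]);
            for (int r = 0; r < 8; r++)
            {
                if (r == col) continue;
                double f = a[r, col] / a[col, col];
                for (int j = col; j < 9; j++) a[r, j] -= f * a[col, j];
            }
        }
        var h = new double[9];
        for (int i = 0; i < 8; i++) h[i] = a[i, 8] / a[i, i];
        h[8] = 1.0;
        return h; // row-major 3x3 homography
    }

    public static (double x, double y) Map(double[] h, double x, double y)
    {
        double w = h[6] * x + h[7] * y + h[8];
        return ((h[0] * x + h[1] * y + h[2]) / w, (h[3] * x + h[4] * y + h[5]) / w);
    }
}
```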


The limitation of 15 fps stems from the frame rate of the camera; the software itself is able to process more than 50 images per second on the specified machine, which leaves it to our imagination what frame rates might be achieved with a fast four-core Intel Core i7 processor.


  • A. Villanueva, G. Daunys, D. Hansen, M. Böhme, R. Cabeza, A. Meyer, and E. Barth, "A geometric approach to remote eye tracking," Universal Access in the Information Society. [Online]. Available: http://dx.doi.org/10.1007/s10209-009-0149-0

Tuesday, August 11, 2009

ALS Society of British Columbia announces Engineering Design Awards (Canadian students only)

"The ALS Society of British Columbia has established three Awards to encourage and recognize innovation in technology to substantially improve the quality of life of people living with ALS (Amyotrophic Lateral Sclerosis, also known as Lou Gehrig’s Disease). Students at the undergraduate or graduate level in engineering or a related discipline at a post-secondary institution in British Columbia or elsewhere in Canada are eligible for the Awards. Students may be considered individually or as a team. Mentor Awards may also be given to faculty supervising students who win awards" (see Announcement)


Project ideas:
  • Low-cost eye tracker
    • Issue: Current commercial eye-gaze tracking systems cost thousands to tens of thousands of dollars. The high cost of eye-gaze trackers prevents potential users from accessing eye-gaze tracking tools. The hardware components required for eye-gaze tracking do not justify the price, and a lower-cost alternative is desirable. Webcams may be used for low-cost imaging, along with simple infrared diodes for system lighting. Alternatively, visible-light systems may also be investigated. Open-source eye-gaze tracking software is also available. (ed: ITU Gaze Tracker, openEyes, TrackEye, Opengazer and myEye (free, no source))
    • Goal: The goal of this design project is to develop a low-cost and usable eye-gaze tracking system based on simple commercial off-the-shelf hardware.
    • Deliverables: A working prototype of a functional, low-cost (< $200), eye-gaze tracking system.
  • Eye-glasses compensation
    • Issue: The use of eye-glasses can cause considerable problems in eye-gaze tracking. The issue stems from reflections off the eye-glasses due to the controlled infrared lighting (on- and off-axis light sources) used to highlight features of the face. The key features of interest are the pupils and glints (reflections off the surface of the cornea). Incorrectly identifying the pupils and glints results in invalid estimation of the point of gaze.
    • Goal: The goal of this design project is to develop techniques for either: 1) avoiding image corruption with eye-glasses on a commercial eye-gaze tracker, or 2) developing a controlled lighting scheme that ensures valid identification of the pupils and glints in the presence of eye-glasses.
    • Deliverables: Two forms of deliverables are possible: 1) A working prototype illustrating functional eye-gaze tracking in the presence of eye-glasses with a commercial eye-gaze tracker, or 2) A working prototype illustrating accurate real-time identification of the pupil and glints using controlled infrared lighting (on and off axis light sources) in the presence of eye-glasses.
  • Innovative selection with ALS and eye gaze
    • Issue: As mobility steadily decreases in the more advanced stages of ALS, alternative techniques for selection are required. Current solutions include head switches, sip-and-puff switches and dwell-time activation, to name a few, depending on the degree of mobility loss. The use of dwell time requires no mobility other than eye motion; however, this technique suffers from ‘lag’, in that the user must wait the dwell-time duration for each selection, as well as from the ‘Midas touch’ problem, in which unintended selections occur if the gaze point is stationary for too long (a minimal dwell-timer sketch follows after this list).
    • Goal: The goal of this design project is to develop a technique for improved selection with eye-gaze for individuals with only eye motion available. Possible solutions may involve novel HCI designs for interaction, including various adaptive and predictive technologies, the consideration of contextual cues, and the introduction of ancillary inputs such as EMG or EEG.
    • Deliverables: A working prototype illustrating eye-motion only selection with a commercial eye-gaze tracking system.
  • Novel and valuable eye-gaze tracking applications and application enhancements
    • Issue: To date, relatively few gaze-tracking applications have been developed. These include relatively simplistic applications such as the tedious typing of words, and even in such systems, little is done to ease the effort required, e.g., systems typically do not allow for the saving and reuse of words and sentences.
    • Goal: The goal of this design project is to develop one or more novel applications or application enhancements that take gaze as input, and that provide new efficiencies or capabilities that could significantly improve the quality of life of those living with ALS.
    • Deliverables: A working prototype illustrating one or more novel applications that take eye-motion as an input. The prototype must be developed and implemented to the extent that an evaluation of the potential efficiencies and/or reductions in effort can be evaluated by persons living with ALS and others on an evaluation panel.

    See the Project Ideas for more information. For contact information see page two of the announcement.
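As a concrete reference point for the dwell-time discussion in the selection project above, here is a minimal dwell-timer sketch in C#. The radius and dwell duration are illustrative values only; a real system would expose them as user settings.

```csharp
using System;

// Minimal dwell-time selector: a selection is triggered when the gaze stays
// within a small radius for the full dwell period. Values are assumptions.
class DwellSelector
{
    const double RadiusPx = 40;                              // fixation tolerance, assumed
    static readonly TimeSpan Dwell = TimeSpan.FromMilliseconds(600); // assumed

    double anchorX, anchorY;
    TimeSpan accumulated = TimeSpan.Zero;

    // Feed one gaze sample; returns true when a selection fires.
    public bool Update(double x, double y, TimeSpan dt)
    {
        double dx = x - anchorX, dy = y - anchorY;
        if (dx * dx + dy * dy > RadiusPx * RadiusPx)
        {
            // Gaze moved away: restart the dwell at the new location.
            anchorX = x; anchorY = y;
            accumulated = TimeSpan.Zero;
            return false;
        }
        accumulated += dt;
        if (accumulated < Dwell) return false;
        accumulated = TimeSpan.Zero; // avoid repeated "Midas touch" selections
        return true;
    }
}
```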

Friday, July 31, 2009

SMI RED 250!


Today SMI announced the new RED250 which, as the name suggests, has an impressive 250 Hz sampling rate. It has a typical accuracy of 0.5 degrees or better, less than 10 ms latency, and operates at a head distance of 60-80 cm. The track-box is 40x40 cm at 70 cm distance, and the system recovers tracking faster than the previous model. No details on pricing yet, but top-of-the-line performance never comes cheap. Get the flyer as a PDF.

Wednesday, May 6, 2009

The Dias Eye Tracker (Mardanbeigi, 2009)

Diako Mardanbeigi at the Iran University of Science & Technology introduces the Dias eye tracking suite, a low-cost solution employing a head-mounted setup that comes with a rather extensive set of applications; the video below demonstrates the hardware and software. The software offers gaze control for playing games and music, viewing images, and text-to-speech using a dwell keyboard. It also offers basic eye movement recording and visualization, such as scanpaths. The software is built in Visual Basic 6 and implements various eye tracking algorithms, including a rectangular method and RANSAC or least-squares ellipse/circle fitting, with support for tracking one or two glints. Congratulations, Diako, on this great work!
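Of the fitting options mentioned, least-squares circle fitting is the easiest to sketch. The classic Kåsa fit below, written in C# as an illustration rather than as Dias' actual implementation, treats the circle equation x^2 + y^2 + D*x + E*y + F = 0 as linear in D, E and F, solves the 3x3 normal equations, and reads off the centre and radius.

```csharp
using System;

// Least-squares (Kåsa) circle fit for contour points, e.g. a pupil boundary.
static class CircleFit
{
    public static (double cx, double cy, double r) Fit(double[] x, double[] y)
    {
        int n = x.Length;
        // Build the 3x3 normal equations A^T A p = A^T z, with A rows [x, y, 1]
        // and z = -(x^2 + y^2), for the unknowns p = (D, E, F).
        double sxx = 0, sxy = 0, sx = 0, syy = 0, sy = 0;
        double bx = 0, by = 0, b1 = 0;
        for (int i = 0; i < n; i++)
        {
            double z = -(x[i] * x[i] + y[i] * y[i]);
            sxx += x[i] * x[i]; sxy += x[i] * y[i]; sx += x[i];
            syy += y[i] * y[i]; sy += y[i];
            bx += x[i] * z; by += y[i] * z; b1 += z;
        }
        double[,] m = { { sxx, sxy, sx }, { sxy, syy, sy }, { sx, sy, n } };
        double[] rhs = { bx, by, b1 };
        double[] p = Solve3x3(m, rhs);
        double cx = -p[0] / 2, cy = -p[1] / 2;             // centre from D, E
        return (cx, cy, Math.Sqrt(cx * cx + cy * cy - p[2])); // radius from F
    }

    static double[] Solve3x3(double[,] m, double[] b)
    {
        double det = Det(m);
        var result = new double[3];
        for (int c = 0; c < 3; c++)
        {
            var t = (double[,])m.Clone();
            for (int r = 0; r < 3; r++) t[r, c] = b[r];
            result[c] = Det(t) / det; // Cramer's rule
        }
        return result;
    }

    static double Det(double[,] m) =>
          m[0, 0] * (m[1, 1] * m[2, 2] - m[1, 2] * m[2, 1])
        - m[0, 1] * (m[1, 0] * m[2, 2] - m[1, 2] * m[2, 0])
        + m[0, 2] * (m[1, 0] * m[2, 1] - m[1, 1] * m[2, 0]);
}
```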


Monday, April 27, 2009

A brief users guide to the ITU Gaze Tracker

Today we release a short user's guide for the open-source eye tracker we presented some weeks ago. Hopefully it will help first-time users configure the software and understand the limitations of the initial version. Comments and suggestions are appreciated.


Friday, April 17, 2009

IDG Interview with Javier San Agustin

During CHI09 in Boston last week, Nick Barber from the IDG Network stopped by to record an interview with Javier San Agustin, member of the ITU GazeGroup. The video has now surfaced on several IDG sites around the world; clearly there is an interest in easy-to-use, low-cost eye tracking. After the initial release of the ITU Gaze Tracker we have set up a community forum at forum.gazegroup.org, with the ambition of connecting users of open-source eye tracking. If you would like to be part of the project, please join in promoting and developing an alternative. It's open and accessible to all (platform documentation to be released next week).

Hopefully, community ideas and contributions will make the platform take off. The initial release should be considered a beta version, so there are of course additional improvements to make: more cameras need to be verified and bugs in the code need to be handled.

If you experience any issues or have ideas for improvements please post at http://forum.gazegroup.org



Computerworld.com.au

WebWereld.nl

PCAdvisor.co.uk

TechWorld.nl

IDG.no/ComputerWorld

ComputerWorld.dk

ComputerWorld.hu

ARNnet.com.au

Sunday, April 5, 2009

Introducing the ITU GazeTracker

The ITU Gaze Tracker is an open-source eye gaze tracking application that aims to provide a low-cost alternative to commercial gaze tracking systems, thereby making the technology more accessible. It is being developed by the Gaze Group at the IT University of Copenhagen, supported by the Communication by Gaze Interaction Association (COGAIN). The eye tracking software is video-based, and any camera equipped with infrared night vision can be used, such as a video camera or a webcam. The cameras that have been tested with the system can be found in our forum.

Features:
  • Supports head mounted and remote setups
  • Tracks both pupil and glints
  • Supports a wide variety of camera devices
  • Configurable calibration
  • Eye-mouse capabilities
  • UDP server broadcasting gaze data
  • Full source code provided


We encourage users and developers to test our software with their cameras and provide feedback so we can continue development. The ITU Gaze Tracker is released under the GPLv3 open-source license and the full source code is hosted on SourceForge. It is written in C# and uses the Emgu CV wrapper for OpenCV image processing (Microsoft .NET 3.5 is needed). Once the tracker has been started, it can be configured to broadcast gaze data via the UDP protocol, which makes it easy to pick up in your own applications. We provide a sample implementation of a client in C#.
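A minimal listener in C# might look as follows. The port number and the "x y" ASCII datagram layout are placeholders; the actual message format is defined in the tracker's source and the bundled sample client.

```csharp
using System;
using System.Globalization;
using System.Net;
using System.Net.Sockets;
using System.Text;

// Minimal UDP listener for gaze data. Port and datagram format are assumed
// for illustration; consult the ITU Gaze Tracker source for the real ones.
class GazeClient
{
    static void Main()
    {
        using (var client = new UdpClient(6666)) // port: assumed
        {
            var remote = new IPEndPoint(IPAddress.Any, 0);
            while (true)
            {
                byte[] datagram = client.Receive(ref remote);
                string[] parts = Encoding.ASCII.GetString(datagram).Split(' ');
                double x = double.Parse(parts[0], CultureInfo.InvariantCulture);
                double y = double.Parse(parts[1], CultureInfo.InvariantCulture);
                Console.WriteLine($"Gaze at ({x:F0}, {y:F0})");
            }
        }
    }
}
```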

Open-source eye tracking has never been easier: download the binaries, plug in the camera and launch the application. Adjust the sliders to match your camera and start the calibration.

Visit the ITU GazeGroup to download the software package. Please get in touch with us at http://forum.gazegroup.org

Thursday, March 12, 2009

The Argentinian myEye released

I have been following an interesting project taking place in Argentina during the last half year. Marcelo Laginestra has, through his blog, described the development of a low-cost webcam-based eye tracker. It has now been released for download, free of charge.

The system requirements are modest:
  • CPU: 1.5 GHz or higher
  • RAM: 256 MB DDR RAM or more (512 MB recommended)
  • Space: at least 100 MB of hard disk space
  • Camera: 640x480 capture resolution or higher (at least 30 fps)
  • O.S.: Microsoft Windows XP SP2
Go to the myEye website to download the software.

I am happy to see that the project came through; kudos for releasing it under Creative Commons.

Keep an eye open for the ITU gaze interaction platform that will be released in conjunction with CHI09 in early April.

Wednesday, January 21, 2009

Wearable EOG Goggles: Eye-Based Interaction in Everyday Environments

Andreas Bulling in the Wearable Computing Group at the Swiss Federal Institute of Technology (ETH) is working on a new Electrooculography (EOG) based eye tracking system. This technology relies on the small but measurable electrical potentials created by the eye musculature. A set of electrodes is attached to the skin, and after signal processing this data can be used for controlling computer interfaces or other devices. The obvious advantage of this method compared to the more traditional corneal-reflection video-based methods is that it is not sensitive to sunlight and may therefore be used outdoors. However, to my knowledge, it provides lower accuracy, which is why most EOG interfaces rely on eye gestures rather than gaze fixations.
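A toy example makes the gesture-based approach concrete. Because the EOG baseline drifts, large and fast changes in potential (saccades) are much easier to detect reliably than absolute gaze positions. The C# sketch below flags left/right gestures on a horizontal channel using an assumed slope threshold and a short refractory period to avoid counting one saccade twice.

```csharp
using System;
using System.Collections.Generic;

// Toy detector for left/right eye "gestures" from a horizontal EOG channel.
// Threshold, refractory period and units are assumed values.
static class EogGestures
{
    public static IEnumerable<string> Detect(double[] eog, double sampleRate,
                                             double thresholdUvPerSec = 400)
    {
        int refractory = (int)(0.2 * sampleRate); // suppress double counts
        for (int i = 1; i < eog.Length; i++)
        {
            double slope = (eog[i] - eog[i - 1]) * sampleRate; // µV/s
            if (Math.Abs(slope) < thresholdUvPerSec) continue;
            yield return slope > 0 ? "right" : "left";
            i += refractory; // skip the rest of this saccade
        }
    }
}
```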

"We want to introduce the paradigm of visual perception and investigations on eye movements as new methods to implement novel and complement current context-aware systems. Therefore, we will investigate the potential but also possible limitations of using eye movements to perform context and activity recognition in wearable settings. Besides recognizing individual activities another focus will be put on long-term eye movement analysis." More information.

Recently Andreas got a paper accepted for the CHI 2009 conference in Boston (April 4-9th) where the system will be demonstrated during the interactivity session. Andreas and the team at ETH are planning to investigate attentive user interfaces (AUI) in mobile settings using wearable systems, such as the prototype demonstrated in the video below.

View on YouTube

SMI gets the International Forum Design Award

Congratulations to the guys at SensoMotoric Instruments (SMI) for winning the International Forum 2009 Product Design Award with their iView X™ RED eye tracker.

"The unobtrusive yet elegant design for the stand-alone as well as for the monitor-attached configuration of the eye tracking system convinced the jury. "

The award will be presented on the first day of CeBIT (March 3rd) in Hanover. The system will also be on display for those of you who are attending CeBIT. More information on the International Forum Award.

Saturday, January 3, 2009

The Argentinian Eye Mouse software released (Amaro & Ponieman)

Nicolás Amaro and Nicolás Ponieman at ORT Argentina recently received the Argentine-German Chamber of Industry and Trade's Award for Innovation 2008 for their work on a low-cost (webcam) head-mounted corneal-reflection based solution. Best of all, the software can be downloaded, which will directly benefit those who are in need but cannot afford the state-of-the-art systems currently on the market. As demonstrated by the video below, it is capable of running grid-based interfaces, and thus should be adequate for GazeTalk and similar applications.

View on YouTube