Friday, April 17, 2009

IDG Interview with Javier San Agustin

During CHI09 in Boston last week, Nick Barber from the IDG Network stopped by to record an interview with Javier San Agustin, a member of the ITU GazeGroup. The video has now surfaced on several IDG sites around the world; clearly there is interest in easy-to-use, low-cost eye tracking. After the initial release of the ITU Gaze Tracker we have set up a community forum at forum.gazegroup.org, with the ambition of connecting users of open-source eye tracking. If you would like to be part of the project, please join in promoting and developing an alternative. It's open and accessible to all (platform documentation will be released next week).

Hopefully, ideas and contributions to the platform from the community will make it take off. The initial release should be considered a beta version, so there are of course improvements still to make: additional cameras need to be verified and bugs in the code handled.

If you experience any issues or have ideas for improvements, please post at http://forum.gazegroup.org



Computerworld.com.au

WebWereld.nl

PCAdvisor.co.uk

TechWorld.nl

IDG.no/ComputerWorld

ComputerWorld.dk

ComputerWorld.hu

ARNnet.com.au

Sunday, April 5, 2009

Introducing the ITU GazeTracker

The ITU Gaze Tracker is an open-source eye gaze tracking application that aims to provide a low-cost alternative to commercial gaze tracking systems, thereby making the technology more accessible. It is being developed by the Gaze Group at the IT University of Copenhagen, supported by the Communication by Gaze Interaction Association (COGAIN). The eye tracking software is video-based, and any camera equipped with infrared night vision can be used, such as a video camera or a webcam. The cameras that have been tested with the system are listed in our forum.

Features:
  • Supports head mounted and remote setups
  • Tracks both pupil and glints
  • Supports a wide variety of camera devices
  • Configurable calibration
  • Eye-mouse capabilities
  • UDPServer broadcasting gaze data
  • Full source code provided


We encourage users and developers to test our software with their cameras and provide feedback so we can continue development. The ITU Gaze Tracker is released under the GPLv3 open-source license, and the full source code is hosted on SourceForge. It's written in C#, using the Emgu CV wrapper for OpenCV image processing (Microsoft .NET 3.5 required). Once the tracker has been started, it can be configured to broadcast gaze data via the UDP protocol, which makes the data easy to pick up in your own applications. We provide a sample implementation of a client in C#.
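To give an idea of how simple it is to pick up the broadcast from another language, here is a minimal sketch of a UDP listener in Python. The port number and the "x y" text message format are assumptions for illustration only; check the tracker's settings and the bundled C# sample client for the actual values.

```python
import socket

# Assumed values for illustration -- not taken from the tracker's documentation.
HOST, PORT = "127.0.0.1", 6666

def parse_gaze(datagram: bytes):
    """Parse a datagram assumed to hold 'x y' screen coordinates as text."""
    x, y = datagram.decode("ascii").split()[:2]
    return float(x), float(y)

def listen():
    """Receive and print gaze coordinates until interrupted."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))
    while True:
        data, _addr = sock.recvfrom(1024)
        x, y = parse_gaze(data)
        print(f"gaze at ({x:.0f}, {y:.0f})")

# Call listen() to start receiving once the tracker is broadcasting.
```

Because UDP is connectionless, the client only needs to bind the right port; no handshake with the tracker is required.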

Open source eye tracking has never been easier: download the binaries, plug in the camera, and launch the application. Adjust the sliders to match your camera and start the calibration.

Visit the ITU GazeGroup site to download the software package. Please get in touch with us at http://forum.gazegroup.org

Tuesday, March 31, 2009

Radio interview with DR1

Thomas Behrndtz from Danish Radio (DR1) came by the other day to do an interview on the upcoming ITU Gaze Interaction platform. It resulted in a five-minute segment on "Videnskaben kort", a radio program on interesting progress in science. Lately we have been working hard on the software package, which is to be released at CHI09 in Boston next week. It includes a number of applications and tools that will be released for free download, including source code, under the GPL license. In short, these are exciting times for low-cost eye tracking and gaze interaction. Stay tuned...

Click on the image to hear the radio interview (in Danish/Swedish)

Thursday, March 12, 2009

The Argentinian myEye released

I have been following an interesting project taking place in Argentina over the last half year. Marcelo Laginestra has, through his blog, described the development of a low-cost webcam-based eye tracker. It has now been released for download, free of charge.

The system requirements are modest:
  • CPU: 1.5 GHz or higher
  • RAM: 256 MB DDR RAM or higher (512 MB recommended)
  • Disk: at least 100 MB of hard disk space
  • Camera: 640x480 capture resolution or higher (at least 30 fps)
  • OS: Microsoft Windows XP SP2
Go to the myEye website to download the software.

I am happy to see that the project came through; kudos for releasing it under a Creative Commons license.

Keep an eye open for the ITU gaze interaction platform, which will be released in conjunction with CHI09 in early April.

Monday, February 16, 2009

ID-U Biometrics: Eye movement based access control

Daphna Palti-Wasserman and Yoram Wasserman at ID-U Biometrics have developed a system that provides secure signatures for access control based on individual eye movement patterns. The subject's response to a dynamic stimulus provides a unique characteristic. As the stimulus changes, the subject's responses will differ each time, but the pattern of eye movements and the user's eye characteristics remain the same. This results in a "code" that is not entered and not consciously controlled by the user, which reduces the risk of spoofing. Currently the system is at a proof-of-concept stage; the 100% accurate and stable eye tracking that would be required for identification has yet to be achieved (by any eye tracking platform, that is). However, this method of user identification could be applied in situations other than the ATM (I guess that's why they won the GLOBES start-up competition).


Tuesday, February 10, 2009

COGAIN 2009 (26th May) "Gaze interaction for those who want it most".

"The 5th international COGAIN conference on eye gaze interaction emphasises user needs and future applications of eye tracking technology. Robust gaze interaction methods have been available for some years, with substantial amounts of applications to support communication, learning and entertainment already being used. However, there are still some uncertainties about this new technology among communication specialists and funding institutions. The 5th COGAIN conference will focus on spreading the experiences of people using gaze interaction in their daily life to potential users and specialists who have yet to benefit from it. Case studies from researchers and manufacturers working on new ways of making gaze interaction available for all, as well as integrating eye gaze with other forms of communication technology are also particularly welcome. We also encourage papers and posters which reach beyond the special case of eye control for people with disabilities into mainstream human-computer interaction development, for instance using eye tracking technology to enhance gaming experience and strategic play."

Themes:

  • Gaze-based access to computer applications
  • Gaze and environmental control
  • Gaze and personal mobility control
  • User experience studies
  • Innovations in eyetracking systems
  • Low cost gaze tracking systems
  • Attentive interfaces and inferring user intent from gaze
  • Gaze-based interaction with virtual worlds
  • Gaze and creativity
  • Gaming using gaze as an input modality
  • Gaze interaction with wearable displays
  • Using gaze with other modalities including BCI

"Papers which deal with the use of eye gaze to study the usability of mainstream applications and websites are not normally considered for inclusion in the conference". For more information see the COGAIN 2009 Call for Papers

Important dates:

Paper submission: 28th February. Notification of acceptance: 15th April. The conference will be held on the 26th of May at the Technical University of Denmark, in connection with the Visionday event.

Friday, January 30, 2009

SWAET 2009 Announced

The Scandinavian Workshop on Applied Eye-Tracking (SWAET) aims to be a meeting place for graduate students, researchers, and others using eye tracking as a measuring tool. It will be held at the University of Stavanger (May 6-7th). Keynote speakers at SWAET 2009 are Dr Benjamin Tatler (University of Dundee) and Prof Jukka Hyönä (University of Turku).

Suggested topics for workshop presentations:
  • Reading in various contexts
  • Psycholinguistics
  • Integration of pictures and language
  • Face-to-face interaction and other social contexts
  • Attention (such as top-down/bottom-up factors)
  • Controlling interfaces with eye-tracking
  • Viewer behaviour towards images and video
  • Vehicle and traffic research
  • Human factors; such as air traffic control, ship navigation and pilots
  • Evaluation of user interfaces
  • Cognitive processes such as navigation, planning, problem solving, mental imagery, memory etc.
If you wish to present your research, submit an abstract no later than March 15th, 2009. Decisions on acceptance will be given on April 1st.

Registration for the conference is €50 for all delegates except graduate and undergraduate students, who participate free of charge. After April 10th, expect to pay €80 (students €30).

Wednesday, January 21, 2009

Wearable EOG Goggles: Eye-Based Interaction in Everyday Environments

Andreas Bulling in the Wearable Computing Group at the Swiss Federal Institute of Technology (ETH) is working on a new Electrooculography-based (EOG) eye tracking system. This technology relies on the small but measurable electrical potentials created by the eye musculature. A set of electrodes is attached to the skin, and after signal processing this data can be used for controlling computer interfaces or other devices. The obvious advantage of this method of eye tracking, compared to the more traditional corneal-reflection video-based methods, is that it is not sensitive to sunlight and may therefore be used outdoors. However, to my knowledge, it provides lower accuracy, which is why most EOG interfaces rely on eye gestures rather than gaze fixations.
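The gesture idea can be illustrated with a toy sketch: because a saccade shows up as a sharp jump in the EOG potential, simple thresholding of sample-to-sample differences can flag left/right eye movements. This is my own crude illustration of the general principle, not the actual signal processing used in the ETH system; the signal values and threshold below are fabricated.

```python
def detect_saccades(eog, threshold=50.0):
    """Flag samples where the (hypothetical) horizontal EOG channel jumps
    sharply between consecutive samples -- a crude stand-in for real
    saccade detection. The threshold is an arbitrary illustrative value."""
    events = []
    for i in range(1, len(eog)):
        delta = eog[i] - eog[i - 1]
        if abs(delta) > threshold:
            events.append((i, "right" if delta > 0 else "left"))
    return events

# A fabricated trace: steady, a jump right, steady, then a jump back left.
signal = [0, 1, 2, 120, 121, 122, 3, 2, 1]
print(detect_saccades(signal))  # → [(3, 'right'), (6, 'left')]
```

A real system would of course need filtering to remove drift and blink artifacts before anything like this thresholding step, which is one reason gesture-based EOG interfaces favour large, deliberate eye movements.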

"We want to introduce the paradigm of visual perception and investigations on eye movements as new methods to implement novel and complement current context-aware systems. Therefore, we will investigate the potential but also possible limitations of using eye movements to perform context and activity recognition in wearable settings. Besides recognizing individual activities another focus will be put on long-term eye movement analysis." More information.

Recently Andreas had a paper accepted for the CHI 2009 conference in Boston (April 4-9th), where the system will be demonstrated during the interactivity session. Andreas and the team at ETH are planning to investigate attentive user interfaces (AUI) in mobile settings using wearable systems, such as the prototype demonstrated in the video below.

View on YouTube

SMI gets the International Forum Design Award

Congratulations to the guys at SensoMotoric Instruments (SMI) for winning the International Forum 2009 Product Design Award with their iView X™ RED eye tracker.

"The unobtrusive yet elegant design for the stand-alone as well as for the monitor-attached configuration of the eye tracking system convinced the jury. "

The award will be presented on the first day of CeBIT (3rd of March) in Hanover, and the system will be on display for those of you who are attending. More information on the International Forum Award.

Saturday, January 3, 2009

The Argentinian Eye Mouse software released (Amaro & Ponieman)

Nicolás Amaro and Nicolás Ponieman at ORT Argentina recently received the Argentine-German Chamber of Industry and Trade Award for Innovation 2008 for their work on a low-cost, webcam-based, head-mounted corneal reflection solution. Best of all, the software can be downloaded, which will directly benefit those who are in need but cannot afford the state-of-the-art systems currently on the market. As demonstrated by the video below, it is capable of running grid-based interfaces, so it should be adequate for GazeTalk and similar applications.

View on YouTube