Monday, May 12, 2008

SR Research: EyeLink Remote

"The EyeLink Remote system is designed for areas of eye tracking research where a head rest or head mount is not desirable but high accuracy and resolution are still important.

Head distance is accurately measured at 500 Hz using a small target sticker placed on the participants’ forehead. The use of a distance measure that is independent of the eye allows for head position to be tracked even during blinks, providing an extremely fast 2 msec blink recovery delay. The fast head position sampling rate also allows for head tracking during very quick head movements." (website)

Click image for a video demonstration

A rather impressive 500 Hz sampling rate, which enables saccade detection algorithms and tracking during fast head movements. The recovery time from blinks is specified at only 2 milliseconds. However, the target sticker on the forehead is a workaround for actual face tracking: fine in the lab, but not feasible in everyday, real-world gaze interaction scenarios. (Besides, my guess is that this tracker does not come cheap, due to the high-speed cameras used.)
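As a rough illustration of why a high sampling rate matters for saccade detection, here is a minimal velocity-threshold (I-VT style) sketch in Python. The 30°/s threshold and the sample data are my own illustrative choices, not anything taken from the EyeLink system.

```python
# Illustrative velocity-threshold saccade detection on 500 Hz gaze samples.
# Threshold and data are hypothetical, not from the EyeLink Remote.

SAMPLE_RATE_HZ = 500          # one sample every 2 ms
VELOCITY_THRESHOLD = 30.0     # deg/s; a common textbook value

def detect_saccades(gaze_deg):
    """gaze_deg: gaze positions in degrees, one per sample.
    Returns a per-sample list: True where velocity exceeds the threshold."""
    dt = 1.0 / SAMPLE_RATE_HZ
    flags = [False]           # no velocity estimate for the first sample
    for prev, cur in zip(gaze_deg, gaze_deg[1:]):
        velocity = abs(cur - prev) / dt   # deg/s between consecutive samples
        flags.append(velocity > VELOCITY_THRESHOLD)
    return flags

# A fixation (slow drift), a rapid jump, then another fixation:
samples = [10.0, 10.01, 10.02, 12.0, 14.0, 14.01, 14.02]
print(detect_saccades(samples))
# → [False, False, False, True, True, False, False]
```

At 500 Hz the velocity estimate is updated every 2 ms, which is what makes this kind of sample-to-sample thresholding feasible in the first place.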

Gaze beats mouse: a case study.

Michael Dorr (publications) at the Institute of Neuro- and Bioinformatics at the University of Lübeck modified the open source version of LBreakout to be driven by gaze input using an SMI eye tracker. This was then used in a multiplayer setup where gaze tracking took on the mouse. Source code and more information can be found here.

Click on images for video demonstration.


Related paper:

  • Michael Dorr, Martin Böhme, Thomas Martinetz, and Erhardt Barth. Gaze beats mouse: a case study. In The 3rd Conference on Communication by Gaze Interaction - COGAIN 2007, Leicester, UK, pages 16-19, 2007. [ bib .pdf ]

Sunday, May 11, 2008

Inspiration: Gaze controlled web browser

Craig Hennesy, a Ph.D. candidate at the Electrical and Computer Engineering department at the University of British Columbia, has done some work on gaze interaction. This includes using gaze position to scroll documents as the reader approaches the bottom of the window, a really useful feature that appears in other applications as well.

I'm currently working on my own implementation, which will provide this functionality for a wide range of documents (web, PDF, Word etc.) in combination with some new approaches to navigating the web using gaze alone.




"This video illustrates the use of eye-gaze tracking integrated with web browsing. The goal of this application is to reduce the use of the mouse when reading by removing the need to scroll up or down with the mouse. The simple scrolling application allows you to scroll down by looking below the article, scroll up by looking above the article, and go Back and Forward in the browser by looking to the left and right respectively.

In this demo the eye is tied to the mouse cursor so you can see where the user is looking, in the real application the motion of the eye stays behind the scenes and the mouse functions as a normal computer mouse."
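The region-based navigation described in the quote can be sketched as a simple mapping from gaze position to browser action. The coordinates, margins and action names below are my own illustrative assumptions, not the actual implementation.

```python
# Sketch of region-based gaze navigation: look below the article to
# scroll down, above to scroll up, left/right for back/forward.
# All coordinates and action names are illustrative assumptions.

def gaze_to_action(gaze_x, gaze_y, article_left, article_top,
                   article_right, article_bottom):
    """Map a gaze point to a browser action based on where it falls
    relative to the article area; None means just keep reading."""
    if gaze_y > article_bottom:
        return "scroll_down"   # looking below the article
    if gaze_y < article_top:
        return "scroll_up"     # looking above the article
    if gaze_x < article_left:
        return "back"          # looking left of the article
    if gaze_x > article_right:
        return "forward"       # looking right of the article
    return None                # gaze is inside the article area

# Article occupies x in [100, 900], y in [50, 700] on screen:
print(gaze_to_action(500, 750, 100, 50, 900, 700))  # → scroll_down
print(gaze_to_action(50, 400, 100, 50, 900, 700))   # → back
```

In practice one would also want a dwell time or hysteresis on the region boundaries, so that a stray glance past the edge does not immediately trigger navigation.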

Tuesday, May 6, 2008

Gaze interaction hits mainstream news

The New Scientist technology section posted an article on Stephen Vickers' work at De Montfort University on the eye-controlled version of World of Warcraft, which I wrote about two months ago (see post).
Update: The New Scientist article sparked rather extensive discussion on Slashdot, with more than 140 comments.

Great to see mainstream interest in gaze-driven interaction. Gaming is truly one area with huge potential, but it also depends on more accessible eye trackers. There is a movement toward open source eye tracking, but the robustness needed for everyday usage remains elusive. The system Stephen Vickers has developed uses the Tobii X120 eye tracker, which is clearly out of range for all but the small group of users who are granted financial support for their much-needed assistive technology.

Have faith
In general, all new technology initially comes at a high cost due to intensive research and development, but over time it becomes accessible to the larger population. As an example, a decade or two ago not many could have imagined that satellite GPS navigation would become commonplace and really cheap. Today, mass collaboration on the net is really happening, making the rate of technological development exponential. Make sure to watch the Google TechTalk with Don Tapscott on Wikinomics.

Saturday, May 3, 2008

Interface evaluation procedure

The first evaluation of the prototype is now completed. The procedure was designed to test the individual components as well as the prototype as a whole (for playing music etc.). Raw data on selection times, error rates etc. was collected by custom software developed on the Microsoft .NET platform. Additional forms were displayed on-screen between the steps of the procedure, combined with standardized form-based questionnaires, to gather a rich set of data.
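As a sketch of the kind of raw data collection involved, a per-trial log of selection times and errors could look like the following (written in Python here for brevity; the actual prototype is .NET, and the trial format is my own assumption).

```python
# Minimal sketch of per-trial raw data collection: selection times and
# error rates. The trial format is an illustrative assumption, not the
# actual .NET implementation used in the evaluation.

class TrialLogger:
    def __init__(self):
        self.trials = []          # list of (duration_seconds, was_error)

    def record(self, start, end, was_error):
        """Log one selection trial from its start/end timestamps."""
        self.trials.append((end - start, was_error))

    def mean_selection_time(self):
        return sum(d for d, _ in self.trials) / len(self.trials)

    def error_rate(self):
        return sum(1 for _, e in self.trials if e) / len(self.trials)

log = TrialLogger()
log.record(0.0, 1.2, False)   # correct selection in 1.2 s
log.record(0.0, 0.8, True)    # erroneous selection in 0.8 s
print(log.mean_selection_time())  # → 1.0
print(log.error_rate())           # → 0.5
```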

A more subjective approach was taken during the usage of the prototype as a whole, to capture the aspects that could not be captured by automatic data collection or forms: simply observing participants and asking simple questions in normal spoken language. While perhaps less scientifically valid, this type of information is very valuable for understanding how users think and react, and it is crucial for improving the design in the next iteration. There is sometimes a difference between the results gained from verbal utterances, questionnaires and measured performance. For example, an interface can be very efficient and fast but at the same time extremely demanding and stressful; measuring just one performance factor would not tell the whole story.

This is why I chose to use several methods, combining raw performance data, form-based questionnaires and unconstrained verbal interviewing, hoping that together they can capture multiple aspects and provide a rich source of data.

For the evaluation I used a basic on-screen demographic form gathering age, sex, computer experience, conditions of vision etc. In between the evaluations of the individual components I used the NASA Task Load Index as a quick reflection form, and at the end of the session I handed out both an IBM Computer Usability Satisfaction Questionnaire and a QUIS Generic User Interface Questionnaire. The only modification I made was to remove a few questions that would not apply to my prototype (why ask about help pages when the prototype contains none?).

I've found the 230 Tips and Tricks for Better Usability Testing guide to be really useful; it should be read by anyone conducting HCI evaluations.

Questionnaires used in my evaluation, in PDF format:

Monday, April 28, 2008

SWAET Conference

The first day of the SWAET 2008 conference at Lund University was filled with interesting presentations and inspiring conversations. Finally I had a chance to meet some of the researchers whom I've known mainly by their publications and last names.

From the schedule of day one, three talks stood out in the field of gaze interaction. These were recorded on video with the intention of future online distribution; for now, enjoy the papers.
The conference was attended by three manufacturers, SMI, SmartEye and Tobii, all of which had systems on display.

The SMI system demonstrated was brought up from the dark dungeons of the lab to host the prototype I have been working on for the last couple of months. In general, it was well received and served its purpose as an eye-catching demonstration of my "next-gen" gaze interface. To sum up, it was great to get out there and gather feedback confirming that I'm on the right track (doubts sometimes arise when working solo). The day ended with a lovely dinner at the university's finest dining hall.

Big thank you goes out to Kenneth Holmqvist, Jana Holsanova, Philip Diderichsen, Nils Holmberg, Richard Andersson and Janna Spanne for hosting this event.

Monday, April 21, 2008

Open invitation to participate in the evaluation

The invitation to participate in the evaluation of the prototype is out. If you have the possibility to participate, I would be most thankful for your time. The entire test takes roughly 30 minutes.


The invitation can be downloaded as a PDF.

If you wish to participate send me a message. Thank you.

More information on the procedure and structure of the evaluation as well as gained experience will be posted once the testing is completed. The final results will be published in my master thesis by the end of May. Stay tuned.

Wednesday, April 16, 2008

Alea Technologies IntelliGaze demo

A demonstration of the IG30Pro remote eye tracker being used to play a first-person shooter game. Notice the algorithms used to stabilize the pointer and filter out the jitter usually associated with eye trackers.
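One common way to stabilize a gaze pointer is an exponential moving average that damps high-frequency jitter at the cost of a little lag. Alea's actual algorithms are not public, so the sketch below, including the smoothing factor, is purely illustrative.

```python
# Illustrative gaze-pointer smoothing via an exponential moving average.
# Alea's real stabilization algorithms are unknown; alpha is a guess.

def smooth_gaze(samples, alpha=0.2):
    """Smooth a stream of 1-D gaze coordinates.
    Lower alpha = heavier smoothing (less jitter, more lag)."""
    smoothed = [samples[0]]
    for x in samples[1:]:
        # Blend the new sample with the previous smoothed value.
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

# Noisy fixation samples around x = 100 settle toward a steady value:
noisy = [100, 104, 97, 102, 99]
print([round(v, 1) for v in smooth_gaze(noisy)])
```

A real implementation would typically disable or reset the filter when a saccade is detected, so the pointer can jump quickly to the new fixation instead of lagging behind it.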


Tuesday, April 15, 2008

Alea Technologies

The German firm Alea Technologies offers a solution called IntelliGaze, consisting of a remote eye tracking system as well as a software suite. The system is designed to be flexible in terms of both hardware and software: the eye tracker is stand-alone and can be combined with various displays to create a customizable setup. Additionally, they provide APIs for application development.

A clear use for their technology is users with disabilities such as ALS. The software contains a "desktop" system which acts as a launcher for other applications (Windows natives, Grid, Cogain). In general, they seem to target Tobii Technologies, who have been very successful with their MyTobii application running on the P10 eye tracker. The game is on.


Quote:
"The ease of setup and intuitive operation bring a completely new degree of freedom to patients who had to rely on older technologies like manual scanning or cumbersome pointing devices. By using the latest camera technology and very sophisticated image processing and calibration methods, the IG-30 system is far superior to alternative gaze input systems at a similar price point. The modular structure of the system allows the patients with a degenerative disease the continued use of their touch-screen or scanning software package. Using existing computers and monitors, the IG-30 system can also be easily integrated into an existing PC setup. The economic pricing of the IntelliGaze™ systems opens this high-tech solution to a much wider range of users."

  • Tracking technology: hybrid infrared video eye- & head-tracking; binocular & monocular tracking
  • Working volume: 300 x 200 x 200 mm³ [W x H x D], centered at 600 mm distance
  • Accuracy, static: 0.5° typical
  • Accuracy, over full working volume: 1° typical
  • Sampling rate: 50 Hz
  • Max. head-movement velocity: 15 cm/s
  • Recovery time after tracking loss (head too fast or moved out of range): 40 ms
  • System dimensions: ca. 300 x 45 x 80 mm³ [W x H x D]
  • Mounting options: on monitor via VESA adapter; on Tablet PC via customized interfaces
  • System weight: ca. 1.2 kg

Gaze Interaction Demo (Powerwall@Konstanz Uni.)

During the last few years quite a few wall-sized displays have been used for novel interaction methods, not seldom with multi-touch, such as Jeff Han's FTIR technology. This is the first demonstration I have seen where eye tracking is used for a similar purpose. A German Ph.D. candidate, Jo Bieg, is working on this at the HCI department of the University of Konstanz. The Powerwall is 5.20 x 2.15 m and has a resolution of 4640 x 1920.



The demonstration can be viewed at better quality (10 MB).

Also make sure to check out the 360° Globorama display demonstration. It does not use eye tracking for interaction but a laser pointer. Nevertheless, it is a really cool immersive experience, especially the Google Earth zoom into 360° panoramas.