Thursday, June 5, 2008

MedioVis at University of Konstanz

German student Simon Fäh uses speech recognition and eye tracking to achieve more fluid control of the visual information-seeking system MedioVis. (Video in German.)



Tuesday, June 3, 2008

Eye typing at the Bauhaus University of Weimar

The Psychophysiology and Perception group, part of the Faculty of Media at the Bauhaus University of Weimar, is conducting research on gaze-based text entry. Their past research projects include the dwell-based on-screen QWERTY keyboard, IWrite, pEYEWrite, and StarWrite. Thanks to Mario Urbina for the notification.

QWERTY
"Qwerty is based on dwell time selection. Here the user has to stare for 500 ms a determinate character to select it. QWERTY served us, as comparison base line for the new eye typing systems. It was implemented in C++ using QT libraries."

IWrite
"A simple way to perform a selection based on saccadic movement is to select an item by looking at it and confirm its selection by gazing towards a defined place or item. Iwrite is based on screen buttons. We implemented an outer frame as screen button. That is to say, characters are selected by gazing towards the outer frame of the application. This lets the text window in the middle of the screen for comfortable and safe text review. The order of the characters, parallel to the display borders, should reduce errors like the unintentional selection of items situated in the path as one moves across to the screen button.The strength of this interface lies on its simplicity of use. Additionally, it takes full advantage of the velocity of short saccade selection. Number and symbol entry mode was implemented for this editor in the lower frame. Iwrite was implemented in C++ using QT libraries."
pEYEWrite
"Pie menus have already been shown to be powerful menus for mouse or stylus control. They are two-dimensional, circular menus, containing menu items displayed as pie-formed slices. Finding a trade-off between user interfaces for novice and expert users is one of the main challenges in the design of an interface, especially in gaze control, as it is less conventional and utilized than input controlled by hand. One of the main advantages of pie menus is that interaction is very easy to learn. A pie menu presents items always in the same position, so users can match predetermined gestures with their corresponding actions. We therefore decided to transfer pie menus to gaze control and try it out for an eye typing approach. We designed the Pie menu for six items and two depth layers. With this configuration we can present (6 x 6) 36 items. The first layer contains groups of five letters ordered in pie slices.."

StarWrite
"In StarWrite, selection is also based on saccadic movements, to avoid dwell times. The idea of StarWrite is to combine eye typing movements with feedback. Users, mostly novices, tend to look at the text field after each selection to check what has been written. Here, letters are typed by dragging them into the text field. This provides instantaneous visual feedback and should spare the checking saccades towards the text field. When a character is fixated, both it and its neighbors are highlighted and enlarged in order to facilitate character selection. In order to use both x- and y-coordinates for target selection, the letters were arranged alphabetically on a half-circle in the upper part of the monitor. The text window appears in the lower field. StarWrite provides lower-case, upper-case, and numerical entry modes, which can be switched by fixating the corresponding buttons, situated in the lower part of the application, for 500 milliseconds. The space, delete, and enter keys are also placed there and are likewise driven by a 500 ms dwell time. StarWrite was implemented in C++ using the OpenGL libraries for the visualization."
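
Since the letters sit alphabetically on a half-circle, each letter's screen position follows directly from its index. A small sketch of such a layout (dimensions and radius are made up, not StarWrite's actual values):

```python
# Sketch of a StarWrite-like layout (dimensions are made up): place the
# 26 letters alphabetically on a half-circle in the upper screen half.
import math
import string

def star_layout(width=1024, height=768, radius=350):
    cx, cy = width / 2, height / 2           # centre of the half-circle
    letters = string.ascii_uppercase
    positions = {}
    for i, ch in enumerate(letters):
        # sweep from pi (left end of the arc) to 0 (right end)
        theta = math.pi * (1 - i / (len(letters) - 1))
        x = cx + radius * math.cos(theta)
        y = cy - radius * math.sin(theta)    # minus: screen y grows downward
        positions[ch] = (x, y)
    return positions

print(star_layout()['A'])   # leftmost point of the arc, roughly (162, 384)
```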

Associated publications
  • Huckauf, A. and Urbina, M. H. 2008. Gazing with pEYEs: towards a universal input for various applications. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (Savannah, Georgia, March 26 - 28, 2008). ETRA '08. ACM, New York, NY, 51-54. [URL] [PDF] [BIB]
  • Urbina, M. H. and Huckauf, A. 2007. Dwell time free eye typing approaches. In Proceedings of the 3rd Conference on Communication by Gaze Interaction - COGAIN 2007, September 2007, Leicester, UK, 65-70. Available online at http://www.cogain.org/cogain2007/COGAIN2007Proceedings.pdf [PDF] [BIB]
  • Huckauf, A. and Urbina, M. 2007. Gazing with pEYE: new concepts in eye typing. In Proceedings of the 4th Symposium on Applied Perception in Graphics and Visualization (Tübingen, Germany, July 25 - 27, 2007). APGV '07, vol. 253. ACM, New York, NY, 141-141. [URL] [PDF] [BIB]
  • Urbina, M. H. and Huckauf, A. 2007. pEYEdit: Gaze-based text entry via pie menus. In Conference Abstracts. 14th European Conference on Eye Movements ECEM2007. Kliegl, R. & Brenstein, R. (Eds.) (2007), 165-165.

Sunday, June 1, 2008

Project finalization and thesis defence

Last Thursday I held the defence of my master's thesis at the Cognitive Science department at Lund University. I wish to thank Dr. Björn Samuelsson for his excellent, and well beyond expected, review of the thesis. Björn comes from a background in theoretical physics and complex systems, with international recognition in random Boolean networks. I greatly appreciate his attention to detail, which raises the overall quality of the project, hopefully to the level of international scientific publication.

I've put together a video demonstration that will be released together with the thesis this week. It demonstrates the UI components and the prototype interface and will be posted to the major video sharing networks (YouTube, etc.).

Thursday, May 22, 2008

GaCIT 2008 - Summer School on Gaze, Communication, and Interaction Technology

August 18-22, 2008, Tampere, Finland

"Vision and visual communication are central in HCI. Researchers need to understand visual information processing and be equipped with appropriate research tools and methods for understanding users' visual processes.

The GaCIT summer school offers an intensive one-week camp where doctoral students and researchers can learn and refresh vision related skills and knowledge under the tutelage of leading experts in the area. There will be an emphasis on the use of eye tracking technology as a research tool. The program will include theoretical lectures and hands-on exercises, an opportunity for participants to present their own work, and a social program enabling participants to exchange their experiences in a relaxing and inspiring atmosphere."

The following themes and speakers will be included:

  • Active Vision and Visual Cognition (Boris Velichkovsky)
    Topics will include issues in visual perception with relevant aspects of attention, memory, and communication.

  • Hands-on Eye Tracking: Working with Images and Video (Andrew Duchowski)
    Topics will include: Eye tracking methodology and experimental design review; setting up an image study: feedforward visual inspection training; visualization and analysis; exercise: between-subjects static fixation analysis; and analysis of eye tracked video.

  • Designing Eye-Gaze Interaction: Supporting Tasks and Interaction Techniques (Howell Istance)
    This part of the summer school will examine how user needs and tasks can be mapped on particular ways of designing gaze-based interaction. This will cover traditional desktop applications as well as 3D virtual communities and on-line games.

  • Eye tracking in Web Search Studies (Edward Cutrell)
    Topics will include: Eye tracking research on web search and more general web-based analysis with a brief hands-on session to try out the techniques.

  • Participant Presentations
    Time has been reserved for participants who are doing, or planning to do, work related to the theme of the summer school to present and discuss their work or plans with the other participants.


Tuesday, May 20, 2008

SR Labs and I-MED Medical Console

From SR Labs in Italy comes the I-MED Medical Console. The interface contains some really nice interaction methods. The system is intended for the medical field, where surgeons and other medical staff can benefit from hands-free interaction (e.g., viewing X-rays during surgery).


SR Labs have also developed the iABLE software suite, which enables web browsing and sending email. Unfortunately, not much information is available in English. It seems to be the Italian alternative to MyTobii.

Sunday, May 18, 2008

Lund Eye-Tracking Academy (LETA), June 11-13, 2008


SMI and the Lund University Humanities Lab in Lund, Sweden, will hold the first Lund Eye-Tracking Academy (LETA). Hosted by Kenneth Holmqvist and his team, the course will take place on June 11-13, 2008.

"We have decided to start our "eye-tracking academy" to help students, researchers and labs who start with eye-tracking to get up and running, to give them a flying start. Although eye-tracking research can be both fascinating and extremely useful, doing good eye-tracking research requires a certain minimum knowledge. Without having a basic understanding of what can be done with an eye-tracker, how to design an experiment and how to analyse the data, the whole study runs of risk of just producing a lot data which cannot really answer any questions.

Currently, to our knowledge, there exists no recurring, open course in eye-tracking in Europe outside Lund; everybody is trained person-to-person in the existing larger research labs. In Lund, we already hold a 10-week, 7.5 ECTS Master's-level course in eye-tracking. Now we offer another course: an intensive 3-day course, open to all interested in eye-tracking, who want a flying start and some basic understanding of how to run a scientifically sound eye-tracking experiment and get high-quality data that they know how to analyze.

This training course is open to all researchers and investigators who are just about to start using eye-tracking or are in its early phases, and to users wanting to refresh their knowledge of their system. It is open to attendees from universities as well as from industry. As part of the course, attendees will work hands-on with sample experiments, from experimental design to data analysis. Participants will train on state-of-the-art SMI eye-tracking systems, but the course is largely hardware independent and open to users of other systems."

Scheduled dates
11-13 June, 2008. Starting at 09:00 in the morning and ending the last day at around 16:00.

Course contents
• Pros and cons of head-mounted, remote, and contact eye-trackers.
• High sampling speed and detailed precision – who needs it?
• Gaze-overlaid videos vs. data files – what can you do with them?
• How to set up and calibrate a variety of subjects on different eye-trackers?
• Glasses, lenses, mascara, and drooping eyelids – what to do?
• How to work with stimulus programs and synchronize them with the eye-tracking recording?
• How to deal with consent forms and ethical issues?
• Short introduction to experimental design: potentials and pitfalls.
• Visualisation of data vs. number crunching.
• Fast data analysis of multi-user experiments.
• Fixation durations, saccadic amplitudes, transition diagrams, group similarity measures, and all the other measures – what do they tell us? What are the pitfalls?

Teaching methods
Lectures on selected topics (8h)
Hands-on work in our lab on prespecified experiments: receiving and recording a subject (9h). Hands-on initial data analysis (3h).

Eye-tracking systems available for this training
2 × SMI HED, 50 Hz, with Polhemus head tracking
3 × SMI HiSpeed, 240/1250 Hz
1 × SMI RED-X remote, 50 Hz
2 × SMI HED-mobile, 50/200 Hz

Attendance fee
€750, including course material, diploma, and lunches, if you register before June 5th.
We will only run the course if we get enough participants. Register online.

The course content is equivalent to 1 ECTS credit at Master's level or above, although we cannot currently provide official registration at Lund University for this credit.

SWAET 2008 videos online

The recorded videos of the presentations from the SWAET 2008 conference in Lund a couple of weeks ago are now online.
Check out the complete list of presentations at the Lund University Humanities lab website.

Wednesday, May 14, 2008

Modify your webcam to capture infrared light

Found this a while back; it might be useful if you are building a home-brew webcam eye tracker.
The most common eye tracking method today uses IR light to create corneal reflections. To do this you need a pair of IR LEDs (perhaps a couple of these) and a camera capable of receiving that spectrum of light. More or less all cameras can do this, but manufacturers usually place a filter in front of the sensor so that it only picks up the visible spectrum of light. By removing this filter and replacing it with one that only passes IR, you get a camera suitable for corneal-reflection-based eye tracking.

The next step would probably be to locate the head and the eyes in the video sequence. A good starting point for this is the OpenCV library. Martin Wimmer has created the EyeFinder system, which locates the head and eyes in real time. Once the eyes are located, a region of interest is defined and passed on to the image processing algorithms, which typically locate the pupil by ellipse fitting, along with the glints/reflections that the IR lights create, and somewhere along those lines it becomes "easier said than done" =) (I have yet to try building my own =)
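
To make the pipeline concrete, here is a rough sketch of the eye-region-then-pupil idea using OpenCV's Python bindings (my own illustration, not EyeFinder; the cascade file, threshold value, and OpenCV 4 return signatures are assumptions, and a real tracker needs far more robustness):

```python
# Rough sketch (untested, OpenCV 4 assumed): find an eye region with a
# Haar cascade, then estimate the pupil by thresholding + ellipse fit.
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)                      # the modified IR webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in eye_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        # Dark-pupil assumption: in IR the pupil is the darkest blob;
        # the threshold value 40 is a guess, not a tuned parameter.
        _, binary = cv2.threshold(roi, 40, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if len(c) >= 5:                    # fitEllipse needs 5+ points
                (ex, ey), axes, angle = cv2.fitEllipse(c)
                cv2.ellipse(frame, ((x + ex, y + ey), axes, angle),
                            (0, 255, 0), 1)
    cv2.imshow("pupil", frame)
    if cv2.waitKey(1) == 27:                   # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```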

Tuesday, May 13, 2008

IUI Gaze interaction tutorial (2005)

Back in 2005, Kari-Jouko Räihä, Aulikki Hyrskykari, and Päivi Majaranta at the Unit for Human-Computer Interaction of the University of Tampere held a tutorial session on gaze-based interaction at the Intelligent User Interfaces (IUI) conference. Luckily, they posted the material online with a nice reference list. It is a good starting point for gaze interaction, especially in combination with the COGAIN bibliography. I will follow their example and publish a list of all the papers I've been reading while writing my thesis (which, by the way, is soon to be finalized).

Inspiration: Gaze-Based Interaction for Semi-Automatic Photo Cropping

From the Visualization Lab, part of the Computer Science Division at UC Berkeley, comes an application that records gaze position while the user views images and then uses that data to perform cropping in a more intelligent way.

Abstract
"We present an interactive method for cropping photographs given minimal information about the location of important content, provided by eye tracking. Cropping is formulated in a general optimization framework that facilitates adding new composition rules, as well as adapting the system to particular applications. Our system uses fixation data to identify important content and compute the best crop for any given aspect ratio or size, enabling applications such as automatic snapshot recomposition, adaptive documents, and thumbnailing. We validate our approach with studies in which users compare our crops to ones produced by hand and by a completely automatic approach. Experiments show that viewers prefer our gaze-based crops to uncropped images and fully automatic crops."

Original well-composed images (left), adapted to two different aspect ratios using our gaze-based approach. An ADL document (right) using our crops. If eye movements are collected passively during document construction, our approach allows adaptation of images to arbitrary aspect ratios with no explicit user effort.
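
To give a feel for the core idea, here is a toy sketch that picks the fixed-aspect crop capturing the most fixation time (my own simplification with a made-up scoring function, not the authors' optimization framework, which also incorporates composition rules):

```python
# Toy sketch, not the paper's method: grid-search the crop of a given
# aspect ratio that captures the most total fixation time.
def best_crop(fixations, img_w, img_h, aspect, crop_w, step=10):
    """fixations: list of (x, y, duration); aspect = width / height."""
    crop_h = int(crop_w / aspect)
    best, best_score = None, -1.0
    for x0 in range(0, img_w - crop_w + 1, step):
        for y0 in range(0, img_h - crop_h + 1, step):
            score = sum(d for (x, y, d) in fixations
                        if x0 <= x < x0 + crop_w and y0 <= y < y0 + crop_h)
            if score > best_score:
                best, best_score = (x0, y0, crop_w, crop_h), score
    return best

# Example: three fixations; crop to a square 300 px thumbnail.
fix = [(320, 240, 0.6), (350, 260, 0.4), (900, 100, 0.1)]
print(best_crop(fix, img_w=1024, img_h=768, aspect=1.0, crop_w=300))
```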

Associated paper:
  • Santella, A., Agrawala, M., DeCarlo, D., Salesin, D., and Cohen, M. 2006. Gaze-based interaction for semi-automatic photo cropping. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '06). ACM, New York, NY, 771-780.