Thursday, February 16, 2012

Eyewriter & Not Impossible Foundation

The EyeWriter project, which helped Tony 'TemptOne' Quan to draw again, was originally documented by Mick Ebeling. The material has been incorporated into a documentary called "Getting Up", which recently won the audience award at Slamdance. Movie buff Christopher Campbell wrote a short review on his blog. Great job on raising awareness; hope you guys find funding to further develop the software.



Getting Up: The Tempt One Story Trailer




How to build an EyeWriter

Wednesday, February 15, 2012

Prelude for ETRA2012

The program for the Eye Tracking Research & Applications symposium (ETRA'12) is out and contains several really interesting papers this year.

Two supplementary videos surfaced the other day; they come from the User Interface & Software Engineering group at the Otto-von-Guericke-Universität in Germany. In addition, the authors, Sophie Stellmach and Raimund Dachselt, have a paper submitted for the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI'12). Abstracts and videos below.

Abstract I (ETRA)
Remote pan-and-zoom control for the exploration of large information spaces is of interest for various application areas, such as browsing through medical data in sterile environments or investigating geographic information systems on a distant display. In this context, considering a user's visual attention for pan-and-zoom operations could be of interest. In this paper, we investigate the potential of gaze-supported panning in combination with different zooming modalities: (1) a mouse scroll wheel, (2) tilting a handheld device, and (3) touch gestures on a smartphone. Thereby, it is possible to zoom in at a location a user currently looks at (i.e., gaze-directed pivot zoom). These techniques have been tested with Google Earth by ten participants in a user study. While participants were fastest with the already familiar mouse-only base condition, the user feedback indicates a particularly high potential of the gaze-supported pivot zooming in combination with a scroll wheel or touch gesture.


 
To be presented at ETRA'12.
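The gaze-directed pivot zoom described in the abstract can be sketched in a few lines: zooming scales the viewport while keeping the point under the user's gaze fixed on screen. This is just an illustrative reconstruction of the idea, not the authors' implementation; the function name and coordinate conventions are my own.

```python
def pivot_zoom(viewport, gaze, factor):
    """Zoom the viewport by `factor`, keeping the gazed-at point fixed on screen.

    viewport: (x, y, width, height) in world coordinates.
    gaze: (gx, gy) in normalized screen coordinates, each in 0..1.
    factor > 1 zooms in, factor < 1 zooms out.
    """
    x, y, w, h = viewport
    # World coordinate currently under the gaze point
    px = x + gaze[0] * w
    py = y + gaze[1] * h
    # Scale the viewport around that pivot so it stays put on screen
    nw, nh = w / factor, h / factor
    nx = px - gaze[0] * nw
    ny = py - gaze[1] * nh
    return (nx, ny, nw, nh)
```

Looking at the screen center and zooming in by 2x, for example, halves the viewport around its midpoint; looking at a corner keeps that corner's content in place instead.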


Abstract II
Since eye gaze may serve as an efficient and natural input for steering in virtual 3D scenes, we investigate the design of eye gaze steering user interfaces (UIs) in this paper. We discuss design considerations and propose design alternatives based on two selected steering approaches differing in input condition (discrete vs. continuous) and velocity selection (constant vs. gradient-based). The proposed UIs have been iteratively advanced based on two user studies with twelve participants each. In particular, the combination of continuous and gradient-based input shows a high potential, because it allows for gradually changing the moving speed and direction depending on a user's point-of-regard. This has the advantage of reducing overshooting problems and dwell-time activations. We also investigate discrete constant input for which virtual buttons are toggled using gaze dwelling. As an alternative, we propose the Sticky Gaze Pointer as a more flexible way of discrete input.


To be presented at ETRA'12.
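The continuous, gradient-based input the authors describe can be approximated as a velocity that grows with the distance between the point-of-regard and the screen center, with a small dead zone so the scene stops moving when you look near the center. A hypothetical sketch; the function name, dead zone, and speed parameters are my assumptions, not from the paper:

```python
import math

def gaze_velocity(por, center, dead_zone=0.1, max_speed=1.0):
    """Gradient-based steering: speed grows with gaze offset from screen center.

    por and center are (x, y) in normalized screen coordinates (0..1).
    Returns an (vx, vy) steering velocity.
    """
    dx = por[0] - center[0]
    dy = por[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist < dead_zone:
        # Looking near the center: stand still (avoids drift and overshooting)
        return (0.0, 0.0)
    # Speed ramps up gradually from the dead-zone edge to max_speed
    speed = min(max_speed, (dist - dead_zone) / (1.0 - dead_zone) * max_speed)
    return (dx / dist * speed, dy / dist * speed)
```

The gradual ramp is what reduces the overshooting problems the abstract mentions: small glances produce small corrections rather than full-speed motion.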


Abstract III (CHI)
While eye tracking has a high potential for fast selection tasks, it is often regarded as error-prone and unnatural, especially for gaze-only interaction. To improve on that, we propose gaze-supported interaction as a more natural and effective way combining a user's gaze with touch input from a handheld device. In particular, we contribute a set of novel and practical gaze-supported selection techniques for distant displays. Designed according to the principle "gaze suggests, touch confirms", they include an enhanced gaze-directed cursor, local zoom lenses and more elaborated techniques utilizing manual fine positioning of the cursor via touch. In a comprehensive user study with 24 participants, we investigated the potential of these techniques for different target sizes and distances. All novel techniques outperformed a simple gaze-directed cursor and showed individual advantages. In particular, those techniques using touch for fine cursor adjustments (MAGIC touch) and for cycling through a list of possible close-to-gaze targets (MAGIC tab) demonstrated a high overall performance and usability.


 
To be presented at CHI'12.
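As a rough illustration of the idea behind MAGIC tab, cycling through close-to-gaze targets amounts to collecting the candidates within some radius of the gaze point and ordering them by distance, so a tab-style touch gesture can step through them. This sketch is my own approximation; the function name, target format, and radius parameter are assumptions, not from the paper:

```python
import math

def magic_tab_candidates(gaze, targets, radius):
    """Return the targets within `radius` of the gaze point, nearest first.

    gaze: (x, y) gaze estimate; targets: list of (x, y) target positions.
    A tab gesture would then cycle through the returned list in order.
    """
    by_distance = sorted(
        (math.hypot(t[0] - gaze[0], t[1] - gaze[1]), t) for t in targets
    )
    return [t for d, t in by_distance if d <= radius]
```

Because gaze only *suggests*, the inaccurate gaze estimate merely has to land near the intended target; touch input then confirms which of the nearby candidates was meant.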

Tuesday, January 10, 2012

EyeTech EyeOn

A video from EyeTech that features Michael, who suffers from Thoracic Outlet Syndrome (TOS). Great little clip that shows what computer control without gaze-adapted interfaces comes down to. Luckily Michael can use voice recognition software for typing, as text input using eye movements alone is a cumbersome process (source).


Monday, November 14, 2011

EyeDrone - Eye gaze controlled navigation



Demonstration of eye-gaze controlled navigation. The goal is to move the point of gaze to the center of the screen; simply put, "where you look is where it goes". Here the popular AR.Drone quadricopter is controlled with this policy using an EyeLink 2000 eye tracker. The drone pivots left and right when the operator looks in those directions. It tilts up and down (making it fly backwards and forwards) when the operator looks up or down. EyeDrone was implemented by Lucas C Parra in C++ with much help from Michael Quintian. The chin rest here is not really needed, as the basic control algorithm is inherently stable and miscalibrations are of little concern.
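The control policy can be sketched as mapping the gaze offset from the screen center to yaw and pitch commands, so the drone turns until the gazed-at point ends up centered. This is my own illustrative reconstruction, not Parra's C++ implementation; the function name, units, and sign conventions are assumptions:

```python
def drone_command(gaze_x, gaze_y, screen_w, screen_h, gain=1.0):
    """'Where you look is where it goes': steer so the gaze point drifts to screen center.

    gaze_x, gaze_y: gaze position in screen pixels.
    Returns (yaw, pitch) commands in the range [-gain, gain].
    """
    # Normalized offset from the screen center, each in [-1, 1]
    off_x = (gaze_x - screen_w / 2) / (screen_w / 2)
    off_y = (gaze_y - screen_h / 2) / (screen_h / 2)
    yaw = gain * off_x    # look left/right -> pivot left/right
    pitch = -gain * off_y  # look up/down -> tilt (fly backwards/forwards)
    return yaw, pitch
```

A policy like this is a simple proportional controller, which is why it is inherently stable: a calibration offset just shifts the equilibrium point slightly rather than destabilizing the loop.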

Learn more at: http://bme.ccny.cuny.edu/faculty/lparra/eyeNavigate/index.html
Explore the Neural Engineering Group at CCNY - http://neuralengr.com

Wednesday, November 9, 2011

Low Cost Eye Tracking for Commercial Gaming: eyeAsteroids and eyeShoot in the Dark!

Latest update from Stephen Vickers and Howell Istance at the Centre for Computational Intelligence, De Montfort University, who have been doing research and development on gaze-based gaming for several years now. Their latest project has been shortlisted in the consumer category of The Engineer Awards 2011. Congratulations on an awesome job!

"Using your eyes and where you are looking to interact with computer games represents an exciting new direction that game play can take, following the success of whole body interaction enabled by the Kinect and the Wii. The Innovation Fellowship has supported the development and demonstration of a low-cost eye tracker by De Montfort University, in collaboration with Sleepy Dog, the East Midlands games company that produced the Buzz-it controller and games. The low-cost eye tracker utilised the ITU Gazetracking library and was produced as a fully working pre-production prototype. In the project, three different games were produced to demonstrate different ways in which eye gaze can be used to make game play more immersive and exciting.

This video demonstrates two of them.
  • eyeAsteroids: The ship flies towards where you are looking, and the space bar is used to fire.
  • eyeShoot in the Dark!: The torch shines where you are looking, and the mouse is used to move the cross-hair and fire."

Monday, September 19, 2011

SMI Glasses now available

The SMI Glasses, which we got a sneak peek at before last Christmas and specs for in April, are now available for sale. SMI's engineers have managed to squeeze two 30Hz eye tracking cameras and a high definition scene camera into a frame weighing only 75 grams. The advantage of tracking both eyes is that it enables parallax compensation, whereby accuracy is maintained regardless of whether you look near or far in the scene. In addition, binocular tracking usually yields better accuracy, as it provides the opportunity to average both samples or to discard one with low validity. The HD scene camera captures 24 frames per second at 1280x960 pixels, a clear advantage. All in all, better late than never: SMI is back with a highly competitive product on the market.
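Combining the two eye samples as described can be sketched roughly as follows; the sample format and validity threshold are my own assumptions for illustration, not SMI's actual API:

```python
def combine_binocular(left, right, threshold=0.5):
    """Combine left/right eye samples: average if both are valid, else fall back to one.

    Each sample is (x, y, validity), with validity in [0, 1] (an assumed format).
    Returns a (x, y) gaze estimate, or None if neither eye is usable.
    """
    lx, ly, lv = left
    rx, ry, rv = right
    if lv >= threshold and rv >= threshold:
        # Averaging both eyes reduces noise in the estimate
        return ((lx + rx) / 2, (ly + ry) / 2)
    if lv >= threshold:
        return (lx, ly)   # discard the low-validity right-eye sample
    if rv >= threshold:
        return (rx, ry)   # discard the low-validity left-eye sample
    return None           # no usable sample (e.g. blink)
```

This is also where the robustness benefit comes from: a blink or reflection that corrupts one eye's sample degrades the estimate gracefully instead of losing tracking entirely.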

Wednesday, August 31, 2011

ETRA 2012 - Call for papers

"The Seventh ACM Symposium on Eye Tracking Research & Applications (ETRA 2012) will be held in Santa Barbara, California on March 28th-30th, 2012. The ETRA conference series focuses on all aspects of eye movement research and applications across a wide range of disciplines.  The symposium presents research that advances the state-of-the-art in these areas, leading to new capabilities in gaze tracking systems, gaze aware applications, gaze based interaction, eye movement data analysis, etc. For ETRA 2012, we invite papers in all areas of eye tracking research and applications."

ETRA 2012 THEME: MOBILE EYE TRACKING
     Mobile devices are becoming more powerful every day. Embedding eye tracking and gaze-based applications in mobile devices raises new challenges and opportunities to many aspects of eye tracking research. ETRA 2012 invites papers tackling the challenges, or exploring new research opportunities of mobile eye tracking.

GENERAL AREAS OF INTEREST
Eye Tracking Technology
    Advances in eye tracking hardware, software and algorithms such as: 2D and 3D eye tracking systems, calibration, low cost eye tracking, natural light eye tracking, predictive models, etc.

Eye Tracking Data Analysis
    Methods, procedures and analysis tools for processing raw gaze data as well as fixations and gaze patterns. Example topics are: scan path analysis, fixation detection algorithms, and visualization techniques of gaze data.

Visual Attention and Eye Movement Control
       Applied and experimental studies investigating visual attention and eye movements to gain insight in eye movement control, cognition and attention, or for design evaluation of visual stimuli. Examples are: usability and web studies using eye tracking, and eye movement behavior in everyday activities such as driving and reading.

Eye Tracking Applications
    Eye tracking as a human-computer input method, either as a replacement to traditional input methods or as a complement. Examples are: assistive technologies, gaze enhanced interaction and interfaces, multimodal interaction, gaze in augmented and mixed reality systems, and gaze-contingent displays.

SUBMISSIONS
Authors are invited to submit original work in the formats of Full paper (8 pages) and Short paper (4 pages). The papers will undergo a rigorous review process assessing the originality and quality of the work as well as the relevance for eye tracking research and applications.  Papers presented at ETRA 2012 will be available in the ACM digital library.  Submission formats and instructions are available at the conference web site.
IMPORTANT DATES
  Oct. 07th, 2011 Full Paper abstracts submission due
  Oct. 14th, 2011 Full Papers submission due 
  Nov. 21st, 2011  Full Papers acceptance notification 
  Dec. 07th, 2011 Short Papers submission due
  Jan. 16th, 2012 Short Papers acceptance notification
  Jan. 23rd, 2012 Camera ready papers due
 
CONFERENCE VENUE
ETRA 2012 will be held at the gorgeous Doubletree Resort in Santa Barbara, California, a 24-acre, mission-style resort hotel facing the Pacific Ocean, located on one of Southern California’s most beautiful coastlines.

SPONSORSHIP
ETRA 2012 is co-sponsored by the ACM Special Interest Group in Computer-Human Interaction (SIGCHI), and the ACM Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH). 

CONFERENCE CHAIRS
 Carlos Hitoshi Morimoto - University of São Paulo, Brazil
 Howell Istance - De Montfort University, UK

PROGRAM CHAIRS
Jeffrey B. Mulligan - NASA, USA
Pernilla Qvarfordt - FX Palo Alto, USA

PROGRAM AREA CHAIRS
       Andrew Duchowski, Clemson University, USA
       Päivi Majaranta, University of Tampere, Finland
       Joe Goldberg, Oracle, USA
       Shu-Chieh Wu, NASA Ames Research Center, USA
       Qiang Ji, Rensselaer Polytechnic Institute, USA
       Jeff Pelz, Rochester Institute of Technology, USA
       Moshe Eizenman, University of Toronto, USA

Tuesday, August 9, 2011

Job opening at Duke University: Software developer with research focus


Under the direction of the Chair of the Department of Radiology at Duke University and an international research group, the Software Engineer is responsible for developing and maintaining a research platform used to study visual perception. The research project utilizes eye trackers to capture gaze paths as radiologists search through medical image data sets. The role involves rapid software development; as such, you should be comfortable with swiftly assembling software in short iterations without formal requirements.

Tasks and Activities:

  • Rapid software development to support research activities without formal requirements.
  • Analyze design and architectural issues, and adjust existing system design and procedures to solve problems in a dynamic environment.
  • Solving a wide range of problems ranging from user interface design to more complex architectural design without supervision.
  • Readily accept responsibility and demonstrate ability to work independently.
  • Responsible for designing, developing, implementing, testing and maintaining software.
  • Regularly communicate project progress, issues, and risks to project manager.
  • Organizing data collection with human subjects using eye tracking devices and the developed software platform.
Qualifications:

Required:
  • Strong technical knowledge and experience in the development, implementation and maintenance of an information system.
  • Aptitude to learn and understand change in software development process, procedures and methodologies.
  • Demonstrated experience with scientific methodology, academic writing and basic statistical analysis.
  • 5+ years of software development experience in object-oriented programming technologies.
  • 2+ years of experience with Microsoft .Net C# and Windows Presentation Foundation.
  • Detailed technical knowledge and experience in use of data structures and network programming using TCP/IP and UDP.
Desired:
  • Experience with eye tracking systems or other forms of video-based tracking systems.
  • Experience from software engineering in a research setting.
  • Experience with the DICOM Medical Image format and visualization.
  • Experience designing or implementing algorithms for data analysis and image processing.
Education
MS+ in Computer Science or equivalent

Please visit this page to apply for the opening.

Wednesday, July 13, 2011

LG introduces the world's first glasses-free 3D monitor with eye-tracking technology

Today LG announced a 20" LCD display with built-in "eye tracking" technology that enables glasses-free 3D imaging, moving this technology closer to the consumer market. The image below does not, as far as I can tell, reveal any infrared illuminators, a requirement for all known high-accuracy systems, so it is probably more of a rough estimation system than a full-blown remote eye tracker. The best accuracy known from published research under natural light is about 3-4 degrees of visual angle; with their financial resources they could potentially achieve better results.
Left: The "special" eye tracking camera sensor. It looks like a rather typical webcam CMOS sensor to me; unless they are doing some magic it will not allow accurate gaze estimation. Regardless, it makes me wonder whether 3D displays are the path by which eye tracking goes mainstream. Is this related to the collaboration between Seeing Machines and SuperD announced earlier this year, or just a competing solution? Details are sparse; I'll keep you posted as more information becomes available.


Official press release:


SEOUL, July, 13, 2011 – LG Electronics (LG) today unveiled the world’s first glasses-free monitor utilizing eye-tracking technology to maintain an optimal 3D image from a range of viewing angles. The 20-inch D2000 (Korean model: DX2000) monitor was developed as a fully functional entertainment display capable of reproducing games, movies and images in all their realistic glory.

“With a full line-up of 3D TVs, laptops, projectors and smartphones, LG Electronics is by far and away the industry leader in all things 3D.” said Si-hwan Park, Vice President of the Monitor Division at LG’s Home Entertainment Company. “LG’s position has always been that 3D will and must eventually function without glasses. The D2000 is a look at what the future has in store.”

The D2000’s 3D effect comes courtesy of glasses-free parallax barrier 3D technology, and the application of the world’s first eye-tracking feature to the monitor. The combination of parallax barrier and eye-tracking in a single unit promises to open up new horizons for glasses-free 3D products.


Existing glasses-free 3D technologies generally require viewers to stay within a tightly restricted angle and distance to perceive the 3D images. However, the D2000 has done much to resolve this issue, allowing viewers much freer movement and more comfortable viewing. Eye tracking in the D2000 works via a special camera sensor attached to the monitor which detects changes in the user's eye position in real-time. With this information, the monitor calculates the angle and position of the viewer and adjusts the displayed image for the optimal 3D effect.

In addition to playing back existing 3D content, the D2000 has a highly refined 2D to 3D conversion feature which adds a new dimension to existing movies and game playing.

The D2000, available in Korea this month, will be introduced in other markets around the world in the latter part of 2011.