Wednesday, November 9, 2011

Low Cost Eye Tracking for Commercial Gaming: eyeAsteroids and eyeShoot in the Dark!

The latest update from Stephen Vickers and Howell Istance at the Centre for Computational Intelligence, De Montfort University, who have been researching and developing gaze-based gaming for several years now. Their latest project has been shortlisted in the consumer category of The Engineer Awards 2011. Congratulations on an awesome job!

"Using your eyes and where you are looking to interact with computer games represents an exciting new direction that game play can take, following the success of whole body interaction enabled by the Kinect and the Wii. The Innovation Fellowship has supported the development and demonstration of a low-cost eye tracker by De Montfort University, in collaboration with Sleepy Dog, the East Midlands games company that produced the Buzz-it controller and games. The low-cost eye tracker utilised the ITU Gazetracking library and was produced as a fully working pre-production prototype. In the project, three different games were produced to demonstrate different ways in which eye gaze can be used to make game play more immersive and exciting.

This video demonstrates two of them:
  • eyeAsteroids: The ship flies towards where you are looking, and the space bar is used to fire.
  • eyeShoot in the Dark!: The torch shines where you are looking, and the mouse is used to move the cross-hair and fire.
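The eyeAsteroids mechanic above, a ship that continuously flies toward the player's gaze point, can be sketched roughly as follows. This is a minimal illustration only; the function and parameter names are hypothetical and not taken from the project's actual code, which is built on the ITU Gazetracking library.

```python
import math

def steer_toward_gaze(ship_x, ship_y, gaze_x, gaze_y, speed):
    """Move the ship one frame toward the current gaze point.

    The step size is clamped to `speed` so the ship glides toward
    the gaze target instead of jumping to it (raw gaze data is noisy,
    so snapping directly to the gaze point would make the ship jitter).
    """
    dx, dy = gaze_x - ship_x, gaze_y - ship_y
    dist = math.hypot(dx, dy)
    if dist <= speed:
        # Close enough: land exactly on the gaze point.
        return gaze_x, gaze_y
    # Step along the unit vector toward the gaze point.
    return ship_x + dx / dist * speed, ship_y + dy / dist * speed

# Each frame: read a gaze sample from the tracker, then update the ship.
x, y = steer_toward_gaze(100.0, 100.0, 400.0, 100.0, 5.0)
```

In a real gaze-driven game the gaze samples would typically also be smoothed (e.g. averaged over recent fixations) before being fed to a routine like this.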


Lizzie Maughan said...

Brilliant PR, but eye control is of little practical utility for most people in most situations, as I explained in some detail in this blog:
A $20 trackball is a better interface for Asteroids than an eye tracker. However, wearable eye tracking integrated with AR will be as revolutionary as the telephone was.

Unknown said...

Lizzie (Robert?)

Funny, but I saw the exact same message on LinkedIn; however, it was related to the Tobii arcade. This post is not about Tobii, it's about a research project. Duplicating your message makes me question your motive.

Eye tracking and gaze-based interaction is obviously very different from trackballs. What is the point of comparing apples to oranges?

How about demonstrating that "wearable eyetracking integrated with AR" that you're talking about?

Big hat, no cattle?