Monday, November 14, 2011
EyeDrone - Eye gaze controlled navigation
Learn more at: http://bme.ccny.cuny.edu/faculty/lparra/eyeNavigate/index.html
Explore the Neural Engineering Group at CCNY - http://neuralengr.com
Wednesday, November 9, 2011
Low Cost Eye Tracking for Commercial Gaming: eyeAsteroids and eyeShoot in the Dark!
Latest update from Stephen Vickers and Howell Istance at the Centre for Computational Intelligence, De Montfort University, who have been researching and developing gaze-based gaming for several years now. Their latest project has been shortlisted in the consumer category of The Engineer Awards 2011. Congratulations on an awesome job!
"Using your eyes and where you are looking to interact with computer games represents an exciting new direction that game play can take, following the success of whole body interaction enabled by the Kinect and the Wii. The Innovation Fellowship has supported the development and demonstration of a low-cost eye tracker by De Montfort University, in collaboration with Sleepy Dog, the East Midlands games company that produced the Buzz-it controller and games. The low-cost eye tracker utilised the ITU Gazetracking library and was produced as a fully working pre-production prototype. In the project, three different games were produced to demonstrate different ways in which eye gaze can be used to make game play more immersive and exciting.
This video demostrates two of them.
"Using your eyes and where you are looking to interact with computer games represents an exciting new direction that game play can take, following the success of whole body interaction enabled by the Kinect and the Wii. The Innovation Fellowship has supported the development and demonstration of a low-cost eye tracker by De Montfort University, in collaboration with Sleepy Dog, the East Midlands games company that produced the Buzz-it controller and games. The low-cost eye tracker utilised the ITU Gazetracking library and was produced as a fully working pre-production prototype. In the project, three different games were produced to demonstrate different ways in which eye gaze can be used to make game play more immersive and exciting.
This video demostrates two of them.
- eyeAsteroids: The ship flies towards where you are looking and the space bar is used to fire.
- eyeShoot in the Dark!: The torch shines where you are looking and the mouse is used to move the cross-hair and fire.
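The gaze-steering mechanic in eyeAsteroids is simple enough to sketch. The snippet below is only a guess at the general idea, moving the ship toward the latest gaze sample each frame; the names and steering model are illustrative assumptions, not the project's actual code.

```python
import math

def steer_toward_gaze(ship_x, ship_y, gaze_x, gaze_y, speed_px_s, dt_s):
    """Move the ship one frame-step toward the current gaze point.

    Simple proportional steering at a capped speed; assumed for
    illustration, the actual game logic is unpublished.
    """
    dx, dy = gaze_x - ship_x, gaze_y - ship_y
    dist = math.hypot(dx, dy)
    if dist < 1e-6:
        return ship_x, ship_y            # already at the gaze point
    step = min(speed_px_s * dt_s, dist)  # don't overshoot the target
    return ship_x + dx / dist * step, ship_y + dy / dist * step
```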
Monday, September 19, 2011
SMI Glasses now available
The SMI Glasses, which we got a sneak peek at before last Christmas and whose specs were released in April, are now available for sale. SMI engineers have managed to squeeze two 30Hz eye tracking cameras and a high definition scene camera into a frame weighing only 75 grams. The advantage of tracking both eyes is that it enables parallax compensation, where accuracy is maintained regardless of whether you look close or far in the scene. In addition, binocular tracking most often yields better accuracy, as it provides the opportunity to average both samples or to discard one with low validity. The HD scene camera captures 24 frames per second at 1280x960 pixels, a clear advantage. All in all, better late than never: SMI is back with a highly competitive product on the market.
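As an aside, the benefit of per-eye validity flags is easy to illustrate. Below is a minimal sketch (my own illustration, not SMI's firmware or data format) of merging two monocular samples into one gaze estimate:

```python
def combine_binocular(left, right):
    """Merge per-eye gaze samples into one estimate.

    left, right: dicts like {"x": float, "y": float, "valid": bool}
    (a hypothetical sample format, not SMI's actual data structures).
    Returns an (x, y) tuple, or None if neither eye was tracked.
    """
    if left["valid"] and right["valid"]:
        return ((left["x"] + right["x"]) / 2.0,
                (left["y"] + right["y"]) / 2.0)
    if left["valid"]:
        return (left["x"], left["y"])
    if right["valid"]:
        return (right["x"], right["y"])
    return None  # both samples invalid, e.g. during a blink
```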
Wednesday, August 31, 2011
ETRA 2012 - Call for papers
"The Seventh
ACM Symposium on Eye Tracking Research & Applications (ETRA 2012)
will be held in Santa Barbara, California on March 28th-30th, 2012. The
ETRA conference series focuses on all aspects of eye movement research
and applications across a wide range of disciplines. The symposium
presents research that advances the state-of-the-art in these areas,
leading to new capabilities in gaze tracking systems, gaze aware
applications, gaze based interaction, eye movement data analysis, etc.
For ETRA 2012, we invite papers in all areas of eye tracking research
and applications."
ETRA 2012 THEME: MOBILE EYE TRACKING
Mobile devices are becoming more powerful every day. Embedding eye tracking and gaze-based applications in mobile devices raises new challenges and opportunities for many aspects of eye tracking research. ETRA 2012 invites papers tackling the challenges, or exploring the new research opportunities, of mobile eye tracking.
GENERAL AREAS OF INTEREST
Eye Tracking Technology
Advances in eye tracking hardware, software and algorithms such as: 2D and 3D eye tracking systems, calibration, low cost eye tracking, natural light eye tracking, predictive models, etc.
Eye Tracking Data Analysis
Methods, procedures and analysis tools for processing raw gaze data as well as fixations and gaze patterns. Example topics are: scan path analysis, fixation detection algorithms, and visualization techniques for gaze data.
Visual Attention and Eye Movement Control
Applied and experimental studies investigating visual attention and eye movements to gain insight into eye movement control, cognition and attention, or for design evaluation of visual stimuli. Examples are: usability and web studies using eye tracking, and eye movement behavior in everyday activities such as driving and reading.
Eye Tracking Applications
Eye tracking as a human-computer input method, either as a replacement for traditional input methods or as a complement. Examples are: assistive technologies, gaze enhanced interaction and interfaces, multimodal interaction, gaze in augmented and mixed reality systems, and gaze-contingent displays.
SUBMISSIONS
Authors are invited to submit original work in the formats of Full paper (8 pages) and Short paper (4 pages). The papers will undergo a rigorous review process assessing the originality and quality of the work as well as its relevance to eye tracking research and applications. Papers presented at ETRA 2012 will be available in the ACM Digital Library. Submission formats and instructions are available at the conference web site.
IMPORTANT DATES
Oct. 7, 2011 Full Paper abstracts submission due
Oct. 14, 2011 Full Papers submission due
Nov. 21, 2011 Full Papers acceptance notification
Dec. 7, 2011 Short Papers submission due
Jan. 16, 2012 Short Papers acceptance notification
Jan. 23, 2012 Camera-ready papers due
CONFERENCE VENUE
ETRA 2012 will be held at the gorgeous Doubletree Resort in Santa Barbara, California, a 24-acre, mission-style resort hotel facing the Pacific Ocean, located on one of Southern California’s most beautiful coastlines.
SPONSORSHIP
ETRA 2012 is co-sponsored by the ACM Special Interest Group on Computer-Human Interaction (SIGCHI) and the ACM Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH).
CONFERENCE CHAIRS
Carlos Hitoshi Morimoto - University of São Paulo, Brazil
Howell Istance - De Montfort University, UK
PROGRAM CHAIRS
Jeffrey B. Mulligan - NASA, USA
Pernilla Qvarfordt - FX Palo Alto, USA
PROGRAM AREA CHAIRS
Andrew Duchowski, Clemson University, USA
Päivi Majaranta, University of Tampere, Finland
Joe Goldberg, Oracle, USA
Shu-Chieh Wu, NASA Ames Research Center, USA
Qiang Ji, Rensselaer Polytechnic Institute, USA
Jeff Pelz, Rochester Institute of Technology, USA
Moshe Eizenman, University of Toronto, USA
Wednesday, July 13, 2011
LG introduces the world's first glasses-free 3D monitor with eye-tracking technology
Today LG announced a 20" LCD display with built-in "eye tracking" technology that enables glasses-free 3D imaging, moving this technology closer to the consumer market. As far as I can tell, the image below does not reveal any infrared illuminators, a requirement for all known high-accuracy systems, so it is probably more of a rough eye position estimator than a full-blown remote eye tracker. The best accuracy known from published research under natural light is about 3-4 degrees of visual angle; with LG's financial resources they could potentially achieve better results.
Official press release:
SEOUL, July, 13, 2011 – LG Electronics (LG) today unveiled the world’s first glasses-free monitor utilizing eye-tracking technology to maintain an optimal 3D image from a range of viewing angles. The 20-inch D2000 (Korean model: DX2000) monitor was developed as a fully functional entertainment display capable of reproducing games, movies and images in all their realistic glory.
“With a full line-up of 3D TVs, laptops, projectors and smartphones, LG Electronics is by far and away the industry leader in all things 3D.” said Si-hwan Park, Vice President of the Monitor Division at LG’s Home Entertainment Company. “LG’s position has always been that 3D will and must eventually function without glasses. The D2000 is a look at what the future has in store.”
The D2000’s 3D effect comes courtesy of glasses-free parallax barrier 3D technology, and the application of the world’s first eye-tracking feature to the monitor. The combination of parallax barrier and eye-tracking in a single unit promises to open up new horizons for glasses-free 3D products.
Existing glasses-free 3D technologies generally require viewers to stay within a tightly restricted angle and distance to perceive the 3D images. However, the D2000 has done much to resolve this issue, allowing the viewer much freer movement and more comfortable viewing. Eye tracking in the D2000 works via a special camera sensor attached to the monitor which detects changes in the user’s eye position in real-time. With this information, the monitor calculates the angle and position of the viewer and adjusts the displayed image for the optimal 3D effect.
In addition to playing back existing 3D content, the D2000 has a highly refined 2D to 3D conversion feature which adds a new dimension to existing movies and game playing.
The D2000, available in Korea this month, will be introduced in other markets around the world in the latter part of 2011.
Left: the "special" eye tracking camera sensor. It looks like a rather typical webcam CMOS sensor to me; unless they are doing some magic it will not allow accurate gaze estimation. Regardless, it makes me wonder whether 3D displays are the path by which eye tracking goes mainstream. Is this related to the collaboration between Seeing Machines and SuperD announced earlier this year, or just a competing solution? Details are sparse; I'll keep you posted as more information becomes available.
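How the detected eye position feeds back into the display is not disclosed. In principle, parallax barrier steering can work roughly like the sketch below: estimate the viewer's horizontal eye position from the camera and shift the left/right column interleaving so the stereo sweet spot follows the viewer. The linear shift model and all names here are my own illustrative assumptions, not LG's implementation.

```python
import numpy as np

def barrier_shift(eye_x_px, cam_width_px, max_shift_px=4.0):
    """Map the viewer's horizontal eye position (in camera pixels) to a
    shift of the column interleaving. Linear model, for illustration only."""
    offset = (eye_x_px - cam_width_px / 2.0) / (cam_width_px / 2.0)
    return offset * max_shift_px            # in [-max_shift_px, +max_shift_px]

def interleave_views(left_img, right_img, shift_px):
    """Column-interleave two equal-shape numpy images, shifted so each eye
    keeps receiving its intended view as the viewer moves."""
    out = left_img.copy()
    shift = int(round(shift_px))
    for col in range(out.shape[1]):
        if (col + shift) % 2 == 1:          # odd columns carry the right view
            out[:, col] = right_img[:, col]
    return out

# Example: steer the sweet spot for eyes detected at x=420 in a
# 640-pixel-wide camera frame.
left = np.zeros((480, 640, 3), dtype=np.uint8)
right = np.full((480, 640, 3), 255, dtype=np.uint8)
frame = interleave_views(left, right, barrier_shift(420, 640))
```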
Friday, July 8, 2011
This video demonstrates the Ergoneers Dikablis Eye-Control Module being used to interact with a standard LCD TV. The project was carried out in collaboration with the Technical University of Munich and reminds me a great deal of an ongoing research project at ITU Copenhagen.
Gliding and Saccadic Gaze Gesture Recognition in Real Time (Rozado, 2011)
David Rozado of the Department of Neural Computation at the Universidad Autonoma de Madrid has developed a neural network approach for detecting gaze gestures in real time. I met David at ITU Copenhagen last summer when he was visiting and we discussed this research; I'm happy to see that it came out with such great results. The work is part of David's Ph.D. thesis, which focuses on the Hierarchical Temporal Memory (HTM) neural network, a bio-inspired pattern recognition algorithm. Using a low-cost webcam and the ITU Gaze Tracker he is able to recognize ten different gestures with 90% accuracy on raw data. When a fixation detection algorithm and dwell time triggers are employed it is possible to achieve 100% detection rates (at the expense of longer activation times).
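To make the dwell-time trigger concrete, here is a minimal sketch of the idea (my own illustration; the gesture recognizer itself is the HTM network described above): an activation fires once gaze has stayed within a small radius for a set duration.

```python
import math
import time

class DwellDetector:
    """Fire an activation once gaze stays within a small radius for a
    given duration (illustrative sketch, not the paper's implementation)."""

    def __init__(self, radius_px=40.0, dwell_s=0.5):
        self.radius_px = radius_px
        self.dwell_s = dwell_s
        self.anchor = None       # (x, y) where the current dwell started
        self.start_t = None

    def update(self, x, y, t=None):
        """Feed one gaze sample; returns the dwell point when triggered."""
        t = time.monotonic() if t is None else t
        if self.anchor is None or math.dist((x, y), self.anchor) > self.radius_px:
            self.anchor, self.start_t = (x, y), t   # gaze moved: restart dwell
            return None
        if t - self.start_t >= self.dwell_s:
            trigger, self.anchor, self.start_t = self.anchor, None, None
            return trigger                          # dwell completed here
        return None
```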
Friday, July 1, 2011
Eyetrax webcam eye tracker from Carnegie Mellon
"Eyetrax is dynamic eye tracking software that uses a simple stationary web camera to detect eye movement. It can be used as a motionless computer interface and is especially useful when working with ALS patients. Additionally, the non-obtrusive nature of the program allows it to work perfectly to discretely generate hotspot maps for marketing purposes". The system is developed by Joseph Fernandez, Skylar Roebuck and Jonathon Smereka and was demonstrated at the Multimedia Computing Demos on May 3rd at Carnegie Mellon.
Utechzone demos
Recently the Taiwanese company Utechzone demonstrated a little game at Computex 2011 in Taipei.
Utechzone also demonstrated a driver fatigue detection system which is housed in a smaller form factor. This system tracks the eye (open/closed) but doesn't perform gaze estimation. The video also shows the underlying gaze tracking system used in their Spring system, which appears to have some issues with glasses.
Fast forward to 1 minute in
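Fatigue monitors of this kind typically reduce the open/closed signal to a drowsiness score such as PERCLOS, the percentage of eye closure over a time window. Whether Utechzone uses exactly this metric is an assumption on my part; a minimal version looks like this:

```python
from collections import deque

class Perclos:
    """PERCLOS-style drowsiness score: the fraction of recent frames in
    which the eye was closed. A common fatigue indicator; whether the
    Utechzone system uses this exact metric is an assumption."""

    def __init__(self, window_frames=900):  # e.g. 30 s of video at 30 fps
        self.window = deque(maxlen=window_frames)

    def update(self, eye_closed):
        """Feed one per-frame open/closed decision; returns score in [0, 1]."""
        self.window.append(1 if eye_closed else 0)
        return sum(self.window) / len(self.window)
```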
Wednesday, June 29, 2011
Mobile gaze-based screen interaction in 3D environments (D. Mardanbegi, 2011)
Diako Mardanbegi presents a method that enables the user to interact with any planar digital display in a 3D environment using a head-mounted eye tracker. An effective method for identifying the screens in the user's field of view is also presented, which can be applied to a general scenario in which multiple users interact with multiple screens. Diako's PhD project at ITU Copenhagen concerns mobile gaze-based interaction.
Related publication: Diako Mardanbegi and Dan Witzner Hansen, "Mobile gaze-based screen interaction in 3D environments", in the Novel Gaze-Controlled Applications 2011 proceedings (see the proceedings post below).
Monday, June 27, 2011
Setscan EyeLock - Law enforcement training system
Setscan, a Canadian supplier of training equipment for law enforcement and the military, has partnered with Arrington Research to develop a binocular head-mounted system with associated software, called EyeLock. The system aims at evaluating and optimizing officers' allocation of visual attention. Looking at the right thing is obviously important, as milliseconds count when guns are drawn. The eye tracking system is the same as those used for natural-scene perception research, but the adaptation and focus on the needs of a specific market domain is interesting.
UCSF using eye tracking to detect early stages of neurodegeneration
The Sabes Lab at the University of California, San Francisco is using high speed eye tracking systems to study eye movements as a tool for detecting neurodegenerative diseases. The data collected include response time, fixation accuracy and saccade velocity, important parameters that could identify approaching or existing neurodegenerative conditions such as Alzheimer's. This area holds great market potential and is feasible in the near future as remote systems come closer to meeting the requirements for tracker speed and accuracy.
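As a rough illustration of how such parameters fall out of a raw gaze stream, here is a minimal sketch; the sampling rate and velocity threshold are illustrative assumptions, not the lab's protocol.

```python
import numpy as np

def saccade_metrics(x_deg, y_deg, fs_hz=500.0, vel_thresh=30.0):
    """Peak saccade velocity and onset latency from a gaze trace.

    x_deg, y_deg: gaze position in degrees of visual angle.
    fs_hz: sampling rate; high-speed research trackers run at 500-1000 Hz.
    vel_thresh: velocity threshold (deg/s) marking saccade onset.
    All values are illustrative defaults, and the trace is assumed to
    contain at least one saccade.
    """
    vx = np.gradient(np.asarray(x_deg, dtype=float)) * fs_hz
    vy = np.gradient(np.asarray(y_deg, dtype=float)) * fs_hz
    speed = np.hypot(vx, vy)                    # instantaneous speed, deg/s
    onset = int(np.argmax(speed > vel_thresh))  # first supra-threshold sample
    return {
        "peak_velocity_deg_s": float(speed.max()),
        "latency_ms": 1000.0 * onset / fs_hz,
    }
```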
Tuesday, June 21, 2011
The EyeHarp: An Eye Tracking Based Musical Instrument
The main goal of Zacharias Vamvakousis' EyeHarp project is to allow people with paralysis resulting from Amyotrophic Lateral Sclerosis to play music using only their eyes. Zacharias was inspired by the EyeWriter open source initiative: "...a low-cost eye-tracking apparatus & custom software that allows graffiti writers and artists with paralysis resulting from Amyotrophic lateral sclerosis to draw using only their eyes". Zacharias spent only 50 euros to build his eye tracker, using a modified version of the Sony PS3 Eye camera. The application is implemented in openFrameworks v0.6.
Alternatively, the instrument can be controlled using the mouse pointer (the MouseHarp version). The free Camera Mouse software can then be used to control the instrument with head movements; in fact, any technology that can take control of the mouse pointer can be used to control the instrument. That way the MouseHarp could be an appropriate instrument for many people with physical disabilities. The MouseHarp version is completely independent of the EyeWriter project. Combining the MouseHarp source with the source of the EyeWriter project, we get the EyeHarp: a low-cost gaze controlled musical instrument! Both versions are free and open source.
The EyeHarp project is part of Zacharias' master's thesis in Sound and Music Computing at UPF, Barcelona. His supervisor is Rafael Ramirez.
A paper on the application has been published:
- Vamvakousis Z., Ramirez R. (2011) The EyeHarp: An Eye-Tracking-Based Musical Instrument. SMC Conference 2011, Padova, Italy (PDF)
Tuesday, June 7, 2011
Grinbath's EyeGuide
Texas based Grinbath recently announced the EyeGuide head mounted tracker. Its main competitive advantage is the low cost: $1,495, with academic discounts available ($1,179). The device captures eye images using a wireless camera running on three AAA batteries and streams these to a computer for processing. The package includes basic software for analysis and visualization. See the whitepaper for more information.
Monday, June 6, 2011
Proceedings from Novel Gaze-Controlled Applications 2011 online
The proceedings from the Novel Gaze-Controlled Applications 2011 conference are now available online. The conference, which took place at the Blekinge Institute of Technology in Sweden during May 26-27, presented 11 talks covering a wide range of topics, from gaming and gaze interaction to eye tracking solutions. Unfortunately I was unable to attend, but luckily I'll have a couple of days of interesting reading ahead. Kudos to Veronica Sundstedt and Charlotte Sennersten for organizing the event.
- Hyakunin-Eyesshu: a tabletop Hyakunin-Isshu game with computer opponent by the action prediction based on gaze detection
Michiya Yamamoto, Munehiro Komeda, Takashi Nagamatsu, Tomio Watanabe
Full text: PDF Online
- Gaze and voice controlled drawing
Jan van der Kamp, Veronica Sundstedt
Full text: PDF Online
- Eye tracking within the packaging design workflow: interaction with physical and virtual shelves
Chip Tonkin, Andrew D. Ouzts, Andrew T. Duchowski
Full text: PDF Online
- Designing gaze-supported multimodal interactions for the exploration of large image collections
Sophie Stellmach, Sebastian Stober, Andreas Nürnberger, Raimund Dachselt
Full text: PDF Online
- Comparison of gaze-to-objects mapping algorithms
Oleg Špakov
Full text: PDF Online
- Evaluation of a remote webcam-based eye tracker
Henrik Skovsgaard, Javier San Agustin, Sune Alstrup Johansen, John Paulin Hansen, Martin Tall
Full text: PDF Online
- An open-source low-cost eye-tracking system for portable real-time and offline tracking
Nicolas Schneider, Peter Bex, Erhardt Barth, Michael Dorr
Full text: PDF Online
- Gaze interaction from bed
John Paulin Hansen, Javier San Agustin, Henrik Skovsgaard
Full text: PDF Online
- Mobile gaze-based screen interaction in 3D environments
Diako Mardanbegi, Dan Witzner Hansen
Full text: PDF Online
- Towards intelligent user interfaces: anticipating actions in computer games
Hendrik Koesling, Alan Kenny, Andrea Finke, Helge Ritter, Seamus McLoone, Tomas Ward
Full text: PDF Online
- Exploring interaction modes for image retrieval
Corey Engelman, Rui Li, Jeff Pelz, Pengcheng Shi, Anne Haake
Full text: PDF Online
Monday, May 9, 2011
"Read my Eyes" - A presentation of the ITU Gaze Tracker
Over the last month the guys at the IT University of Copenhagen have been involved in the making of a video intended to introduce the ITU Gaze Tracker, an open source eye tracker, to a wider audience. The production was carried out in collaboration with the Communication Department at the university and features members of the group, students of the HCI class, and Birger Bergmann Jeppesen, who has had ALS since 1996. Many thanks to all involved, especially Birger & co for taking an interest and participating in the evaluation of the system.
Monday, May 2, 2011
1st International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction
During UbiComp 2011 in Beijing in September, the 1st International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI) will be held. The keynote speaker is Jeff B. Pelz, who has considerable experience with eye tracking during natural tasks. The call for papers is out; see details below.
"Recent developments in mobile eye tracking equipment and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that are pervasively usable in everyday life. We call this new paradigm pervasive eye tracking – continuous eye monitoring and analysis 24/7. The potential applications for the ability to track and analyze eye movements anywhere and anytime call for new research to further develop and understand visual behaviour and eye-based interaction in daily life settings. PETMEI 2011 will focus on pervasive eye tracking as a trailblazer for mobile eye-based interaction and eye-based context-awareness. We provide a forum for researchers from human-computer interaction, context-aware computing, and eye tracking to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking."
Important Dates
- Paper Submission: May 30, 2011
- Notification of Acceptance: June 27, 2011
- Camera-ready due: July 11, 2011
- Workshop: September 18, 2011
Topics
Topics of interest cover computational methods, new applications and use cases, as well as eye tracking technology for pervasive eye tracking and mobile eye-based interaction. Topics of interest include, but are not limited to:
Methods
- Computer vision tools for face, eye detection and tracking
- Pattern recognition/machine learning for gaze and eye movement analysis
- Integration of pervasive eye tracking and context-aware computing
- Real-time multi-modality sensor fusion
- Techniques for eye tracking on portable devices
- Methods for long-term gaze and eye movement monitoring and analysis
- Gaze modeling for development of conversational agents
- Evaluation of context-aware systems and interfaces
- User studies on impact of and user experience with pervasive eye tracking
- Visual and non-visual feedback for eye-based interfaces
- Interaction techniques including multimodal approaches
- Analysis and interpretation of attention in HCI
- Dual and group eye tracking
Applications
- Mobile eye-based interaction with public displays, tabletops, and smart environments
- Eye-based activity and context recognition
- Pervasive healthcare, e.g. mental health monitoring or rehabilitation
- Autism research
- Daily life usability studies and market research
- Mobile attentive user interfaces
- Security and privacy for pervasive eye tracking systems
- Eye tracking in automotive research
- Eye tracking in multimedia research
- Assistive systems, e.g. mobile eye-based text entry
- Mobile eye tracking and interaction for augmented and virtual reality
- Eye-based human-robot and human-agent interaction
- Cognition-aware systems and user interfaces
- Human factors in mobile eye-based interaction
- Eye movement measures in affective computing
Technologies
- New devices for portable and wearable eye tracking
- Extension of existing systems for mobile interaction
See the submission details for more information.
Friday, April 29, 2011
GazeGroup's Henrik Skovsgaard wins "Stars with brains" competition
During the Danish Research Day 2011, Henrik Skovsgaard, PhD candidate at ITU Copenhagen, won the competition "Stars with Brains" (Stjerner med hjerner). Several high profile individuals (the stars) were present, including the Minister of Science, Princess Marie and Mayor Frank Jensen. The competition consisted of eight doctoral students (the brains) from universities across Denmark who presented their research in layman's terms. The audience voted for their favorite candidate using SMS messaging, after which a panel of judges evaluated the participants. Later in the day Henrik was invited to an interview on the Aftenshow on national TV. Henrik's research at the IT University of Copenhagen focuses primarily on gaze-based interaction as a communication tool for people with disabilities, and he has participated in the development of the GazeGroup.org software. A big congrats to Henrik for the award, the excellent public outreach and the associated stardom!
PhD student Henrik Skovsgaard won the "Stars with brains". Photo: Tariq Mikkel Khan (source)
From right: Mayor Frank Jensen, HRH Princess Marie and Minister of Science Charlotte Sahl-Madsen. Photo: Tariq Mikkel Khan (source)
Wednesday, April 27, 2011
Specs for SMI GazeWear released
The specifications for the SMI GazeWear have just been announced. The head mounted tracker takes the shape of a pair of glasses and has an impressive set of features. It offers 30Hz binocular tracking (both eyes) at 0.5 deg. accuracy, with automatic parallax compensation for accurate gaze estimation at distances above 40cm. The dark pupil, corneal reflection based system has a tracking range of 70° horizontal / 55° vertical angle. SMI has managed to squeeze an HD scene camera into the center of the frame, offering 1280x960 resolution at 30 frames per second. However, the scene camera's viewing angle is slightly smaller than the tracking range, at 63° horizontal and 41° vertical. The weight of the device is specified as 75 grams with dimensions of 173x58x168mm (w/h/d), and it is estimated to fit subjects above age 7.
SMI GazeWear
A mobile recording unit is offered which stores data on an SD card, weighs 420 grams, and provides a minimum of 40 minutes of recording time. A subnotebook can be used to extend recording time towards two hours.
With the new tracker SMI seriously improves its offering in the head mounted segment, with a form factor that certainly appears more attractive for a wide range of applications. The specs stand up well against the Tobii Glasses, which have a similar form but are limited to monocular tracking and a lower resolution scene camera. No details on availability are provided other than "coming soon", something we have heard since late December. Once they are out, the game is on.
The flyer may be downloaded as a PDF.
Tuesday, April 26, 2011
Development of a head-mounted, eye-tracking system for dogs (Williams et al, 2011)
Fiona Williams, Daniel Mills and Kun Guo at the University of Lincoln have developed a head mounted eye tracking system for our four-legged friends. Using a special construct based on a head strap and a muzzle, the device was mounted on the head of the dog, where a dichroic mirror placed in front of one of the eyes reflects the IR image back to the camera.
The device was adapted from a VisionTrack system by ISCAN/Polhemus and contains two miniature cameras, one for the eye and one for the scene, which are connected to a host workstation. When used with human subjects such a setup provides 0.3 deg. of accuracy according to the manufacturer. Williams et al. obtained an accuracy of 2-3 deg. from a single dog using a special calibration method with five points located on a cross mounted at the tip of the muzzle. Using positive reinforcement the dog was gradually trained to wear the device and fixate the targets, which I'm sure wasn't an easy task.
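The calibration step boils down to fitting a mapping from pupil position in the eye camera to target position in the scene camera. With a handful of known targets, a least-squares polynomial fit is the standard approach in video-oculography; the sketch below is generic, not ISCAN's exact procedure.

```python
import numpy as np

def fit_calibration(pupil_xy, target_xy):
    """Least-squares fit of a first-order polynomial mapping from pupil
    coordinates to scene coordinates, using known calibration targets.
    Generic video-oculography practice, not ISCAN's exact method.

    pupil_xy, target_xy: (N, 2) arrays, N >= 4 (five points here).
    Returns a function mapping pupil (x, y) -> scene (x, y).
    """
    p = np.asarray(pupil_xy, dtype=float)
    t = np.asarray(target_xy, dtype=float)
    # Design matrix with a bilinear term: [1, x, y, x*y]
    A = np.column_stack([np.ones(len(p)), p[:, 0], p[:, 1], p[:, 0] * p[:, 1]])
    coeffs, *_ = np.linalg.lstsq(A, t, rcond=None)  # shape (4, 2)

    def gaze(x, y):
        row = np.array([1.0, x, y, x * y])
        return row @ coeffs

    return gaze
```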
Abstract:
Growing interest in canine cognition and visual perception has promoted research into the allocation of visual attention during free-viewing tasks in the dog. The techniques currently available to study this (i.e. preferential looking) have, however, lacked spatial accuracy, permitting only gross judgements of the location of the dog’s point of gaze and are limited to a laboratory setting. Here we describe a mobile, head-mounted, video-based, eye-tracking system and a procedure for achieving standardised calibration allowing an output with accuracy of 2–3◦. The setup allows free movement of dogs; in addition the procedure does not involve extensive training skills, and is completely non-invasive. This apparatus has the potential to allow the study of gaze patterns in a variety of research applications and could enhance the study of areas such as canine vision, cognition and social interactions.
- Fiona J. Williams, Daniel S. Mills, Kun Guo, Development of a head-mounted, eye-tracking system for dogs, Journal of Neuroscience Methods, Volume 194, Issue 2, 15 January 2011, Pages 259-265, ISSN 0165-0270, DOI: 10.1016/j.jneumeth.2010.10.022. (available from ScienceDirect)
Wednesday, April 20, 2011
Fraunhofer CMOS-OLED Headmounted display with integrated eye tracker
"The Fraunhofer IPMS works on the integration of sensors and microdisplays on CMOS backplane for several years now. For example the researchers have developed a bidirectional microdisplay, which could be used in Head-Mounted Displays (HMD) for gaze triggered augmented-reality (AR) aplications. The chips contain both an active OLED matrix and therein integrated photodetectors. The combination of both matrixes in one chip is an essential possibility for system integrators to design smaller, lightweight and portable systems with both functionalities." (Press release)
"Rigo Herold, PhD student at Fraunhofer IPMS and participant of the development team, declares: This unique device enables the design of a new generation of small AR-HMDs with advanced functionality. The OLED microdisplay based Eyetracking HMD enables the user on the one hand to overlay the view of the real world with virtual contents, for example to watch videos at jog. And on the other hand the user can select the next video triggered only by his gaze without using his hands." (Press release)Sensor integrates both OLED display and CMOS imaging sensor.
Rigo Herold will present the system at the SID 2011 exhibitor forum on May 17, 2011 at 4:00 p.m. ("Eyecatcher: The Bi-Directional OLED Microdisplay") with the following specs:
- Monochrome
- Special Eyetracking-Algorithm for HMDs based on bidirectional microdisplays
- Front brightness: > 1500 cd/m²
The poster was presented at the ISSCC 2011 Industry Demonstration Session (IDS).
In addition, there is a paper titled "Bidirectional OLED microdisplay: Combining display and image sensor functionality into a monolithic CMOS chip" with the following abstract:
"Microdisplays based on organic light-emitting diodes (OLEDs) achieve high optical performance with excellent contrast ratio and large dynamic range at low power consumption. The direct light emission from the OLED enables small devices without additional backlight, making them suitable for mobile near-to-eye (NTE) applications such as viewfinders or head-mounted displays (HMD). In these applications the microdisplay acts typically as a purely unidirectional output device [1–3]. With the integration of an additional image sensor, the functionality of the microdisplay can be extended to a bidirectional optical input/output device. The major aim is the implementation of eye-tracking capabilities in see-through HMD applications to achieve gaze-based human-display-interaction." Available at IEEE Xplore.