"Clemson University architecture students are working with the packaging science department in designing an eye tracking lab to be a fully immersive grocery store shopping experience. This concept explores the entrance into the lab through a vestibule space created by two sliding glass doors, mimicking the space found in many grocery stores."
Thursday, March 3, 2011
Wednesday, March 2, 2011
Accurate eye center localisation for low-cost eye tracking
Fabian Timm from the Lübeck University Institute for Neuro- and Bioinformatics demonstrates a "novel approach for accurate localisation of the eye centres (pupil) in real time. In contrast to other approaches, we neither employ any kind of machine learning nor a model scheme - we just compute dot products! Our method computes very accurate estimations and can therefore be used in real world applications such as eye (gaze) tracking." Sounds great; any ideas on gaze estimation and accuracy?
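The trick, as I understand it, is that at the true eye centre the displacement vector to every strong-gradient pixel on the iris boundary aligns with the image gradient at that pixel, so the centre can be found by maximising the summed squared dot products. Below is a minimal NumPy sketch of that objective, assuming an 8-bit dark-pupil eye patch; the function name, threshold and intensity weighting are my choices, not taken from the paper.

```python
import numpy as np

def eye_center_by_gradients(patch, grad_thresh=0.3):
    """Estimate the eye centre in a grayscale (8-bit) eye patch by maximising
    agreement between image gradients and displacement vectors; a naive,
    exhaustive version of the dot-product objective."""
    patch = patch.astype(np.float64)
    gy, gx = np.gradient(patch)              # image gradients
    mag = np.hypot(gx, gy)
    mask = mag > grad_thresh * mag.max()     # keep only strong gradients
    ys, xs = np.nonzero(mask)
    gxn = gx[mask] / mag[mask]               # unit gradient vectors
    gyn = gy[mask] / mag[mask]

    h, w = patch.shape
    best_score, best_c = -1.0, (0, 0)
    for cy in range(h):
        for cx in range(w):
            dx, dy = xs - cx, ys - cy
            norm = np.hypot(dx, dy)
            norm[norm == 0] = 1.0            # skip the candidate pixel itself
            dots = (dx / norm) * gxn + (dy / norm) * gyn
            # weight candidates by darkness, since the pupil is dark
            score = (255.0 - patch[cy, cx]) * np.mean(np.maximum(dots, 0) ** 2)
            if score > best_score:
                best_score, best_c = score, (cx, cy)
    return best_c                            # (x, y) of the estimated centre
```

The exhaustive search over every candidate pixel is slow; the point here is the objective, not the speed.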
Labels: eye tracker, low cost
Head-mounted eye-tracking application for driving
Nicolas Schneider has, for his master's thesis, modified the ITU Gaze Tracker for eye tracking in an automotive setting. His version incorporates a scene camera, along with software that calibrates it and integrates it into the platform. The project was carried out at the Schepens Eye Research Institute at Harvard, and there is a good chance it will be released open source. A fine piece of work and an awesome addition to the framework; we're impressed by the results. More info to follow; for now, enjoy the video and the rough sketch after the reference below.
- Nicolas Schneider, Peter Bex, Erhardt Barth, and Michael Dorr. 2011. An open-source low-cost eye-tracking system for portable real-time and offline tracking. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 8, 4 pages. (Full text: PDF online)
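How a head-mounted tracker maps pupil positions into scene-camera coordinates isn't detailed above; a common first-order approach is to fit a planar homography to a handful of calibration fixations. Here is a hedged OpenCV sketch of that idea — all point values are invented, and this is not necessarily how Schneider's calibration works:

```python
import numpy as np
import cv2

# Each calibration fixation pairs a pupil position (eye camera) with the
# fixated target's position in the scene camera image.
pupil_pts = np.array([[312, 240], [402, 238], [310, 330],
                      [405, 333], [358, 285]], dtype=np.float32)
scene_pts = np.array([[100, 80], [540, 85], [95, 400],
                      [545, 405], [320, 240]], dtype=np.float32)

# Fit a planar homography from pupil space to scene space.
H, _ = cv2.findHomography(pupil_pts, scene_pts, cv2.RANSAC)

def gaze_in_scene(pupil_xy):
    """Map a pupil position into scene-camera pixel coordinates."""
    p = np.array([[pupil_xy]], dtype=np.float32)   # shape (1, 1, 2)
    return cv2.perspectiveTransform(p, H)[0, 0]

print(gaze_in_scene((360, 290)))   # approximate gaze point in the scene video
```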
Labels: eye tracker, gazetracker, ITU, low cost, prototype
Wednesday, February 16, 2011
A self-calibrating, camera-based eye tracker for the recording of rodent eye movements (Zoccolan et al, 2010)
Came across an interesting methods article in "Frontiers in Neuroscience", published in late November last year, describing the development of a fully automated eye tracking system that is calibrated without requiring cooperation from the subject. This is done by fixing the location of the eye and moving the camera to establish a geometric model (see also Stahl et al., 2000, 2004). Apparently the authors first attempted to use a commercial EyeLink II device but found it unsuitable for rodent eye tracking due to its thresholding implementation, the illumination conditions, and corneal reflection tracking that failed when the rodent was chewing. So they built their own solution using a Prosilica camera and a set of algorithms (depicted below). Read the paper for implementation details. I find it to be a wonderful piece of work; different from human eye tracking for sure, but still relevant and fascinating.
Abstract:
"Much of neurophysiology and vision science relies on careful measurement of a human or animal subject’s gaze direction. Video-based eye trackers have emerged as an especially popular option for gaze tracking, because they are easy to use and are completely non-invasive. However, video eye trackers typically require a calibration procedure in which the subject must look at a series of points at known gaze angles. While it is possible to rely on innate orienting behaviors for calibration in some non-human species, other species, such as rodents, do not reliably saccade to visual targets, making this form of calibration impossible. To overcome this problem, we developed a fully automated infrared video eye-tracking system that is able to quickly and accurately calibrate itself without requiring co-operation from the subject. This technique relies on the optical geometry of the cornea and uses computer-controlled motorized stages to rapidly estimate the geometry of the eye relative to the camera. The accuracy and precision of our system was carefully measured using an artificial eye, and its capability to monitor the gaze of rodents was verified by tracking spontaneous saccades and evoked oculomotor reflexes in head-fixed rats (in both cases, we obtained measurements that are consistent with those found in the literature). Overall, given its fully automated nature and its intrinsic robustness against operator errors, we believe that our eye-tracking system enhances the utility of existing approaches to gaze-tracking in rodents and represents a valid tool for rodent vision studies."
Figures from the paper: schematic diagram of the eye-tracking system; illustration of the algorithm to track the eye's pupil and corneal reflection spot; eye coordinate system and measurements; horizontal and vertical alignment of the eye with the center of the camera's sensor.
- Zoccolan DF, Graham BJ, Cox DD (2010) A self-calibrating, camera-based eye tracker for the recording of rodent eye movements. Frontiers in Neuroscience 4:193. doi:10.3389/fnins.2010.00193 [link]
Labels: 3D, eye tracker, inspiration, technology
Thursday, February 3, 2011
EyeTech Digital Systems
Arizona-based EyeTech Digital Systems offers several interesting eye trackers; the new V1 caught my attention with its extended track-box of 25 x 18 x 50 cm. The unusually large depth range is provided by a custom auto-focus mechanism developed in cooperation with the Brigham Young University Dept. of Mechanical Engineering. This makes the device particularly suitable for larger displays such as public displays and digital signage. Still, I'd imagine the calibration procedure remains; ideally you'd want to walk up and interact or have data collected automatically, without any wizards or intervention. In any case, a larger track-box is always welcome, and it certainly opens up new opportunities. EyeTech's V1 offers 20 cm more than most.
Labels: assistive technology, eye tracker, technology
Wednesday, February 2, 2011
Thursday, January 13, 2011
Taiwanese Utechzone, the Spring gaze interaction system
Utechzone, a Taiwanese company, has launched the Spring gaze interaction system for individuals with ALS or similar conditions. It provides the basic functionality, including text entry, email, web browsing and media playback, in a format much reminiscent of the MyTobii software. With the accessories, the tracker can be mounted in various ways, including on wheelchairs and desks. A nice feature is the built-in TV tuner, which is accessible through the gaze interface. The performance of the actual tracking system and the accuracy of its gaze estimation are unknown; the specifications only mention a 7x4 grid. The track-box is specified at 17 cm x 10 cm x 15 cm, with a working range of 55-70 cm.
The system runs on Windows XP, on a computer equipped with an Intel dual-core CPU, 2 GB RAM and a 500 GB hard drive, combined with a 17" monitor.
Supported languages are Traditional Chinese, Simplified Chinese, English and Japanese, all of them sizable markets. Price unknown, but probably less than a Tobii. Get the product brochure (pdf).
Call for papers: UBICOMM 2011
"The goal of the International Conference on Mobile Ubiquitous Computing, Systems, Services and Technologies, UBICOMM 2011, is to bring together researchers from the academia and practitioners from the industry in order to address fundamentals of ubiquitous systems and the new applications related to them. The conference will provide a forum where researchers shall be able to present recent research results and new research problems and directions related to them. The conference seeks contributions presenting novel research in all aspects of ubiquitous techniques and technologies applied to advanced mobile applications." All tracks/topics are open to both research and industry contributions. More info.
Tracks:
- Fundamentals
- Mobility
- Information Ubiquity
- Ubiquitous Multimedia Systems and Processing
- Wireless Technologies
- Web Services
- Ubiquitous networks
- Ubiquitous devices and operative systems
- Ubiquitous mobile services and protocols
- Ubiquitous software and security
- Collaborative ubiquitous systems
- User and applications
Important dates:
- Submission (full paper) June 20, 2011
- Notification July 31, 2011
- Registration August 15, 2011
- Camera ready August 20, 2011
Labels: conference
Face tracking for 3D displays without glasses.
A number of manufacturers and research institutes have presented 3D display systems that utilize real-time face and eye-region tracking to adjust the stereoscopic display on the fly. This means that viewers don't have to wear any funky glasses to see the 3D content, which has been a limiting factor for these displays. Some prototypes and OEM solutions were introduced at CeBIT last year. At CES 2011 Toshiba presented a 3D-equipped laptop that uses the built-in webcam to track the position of the user's face (it appears to be built around Seeing Machines' faceAPI). It's an interesting development; we're seeing more and more computer vision applications in the consumer space. Recently Microsoft announced that they've sold 8 million Kinect devices in the first 60 days, while Sony shipped 4.1 million PlayStation Move units in the first two months.
3D displays sans glasses at CeBIT 2010
Toshiba's 3D laptop sans glasses at CES 2011.
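To get a feel for what the display side does with a tracked head, here is a minimal sketch of head-coupled perspective: an asymmetric (off-axis) view frustum computed from the viewer's eye position relative to the screen. This is my own illustration of the general technique, not Toshiba's implementation; actual autostereoscopic panels instead steer the left and right images toward each tracked eye, but the geometry is closely related.

```python
import numpy as np

def off_axis_frustum(eye, screen_w, screen_h, near, far):
    """Asymmetric view frustum for a viewer at `eye` (metres, screen-centred
    coordinates: x right, y up, z towards the viewer). Returns the six
    OpenGL-style frustum parameters (left, right, bottom, top, near, far)."""
    ex, ey, ez = eye
    s = near / ez                            # project screen edges to near plane
    left = (-screen_w / 2 - ex) * s
    right = (screen_w / 2 - ex) * s
    bottom = (-screen_h / 2 - ey) * s
    top = (screen_h / 2 - ey) * s
    return left, right, bottom, top, near, far

# e.g. a viewer 60 cm away, 5 cm right of centre, on a 34 x 19 cm laptop screen
print(off_axis_frustum((0.05, 0.0, 0.60), 0.34, 0.19, 0.1, 100.0))
```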
Obviously, these systems differ from eye tracking systems, but they still share many concepts. So what's the limiting factor for consumer eye tracking then? 1) Lack of applications: there isn't a clear, compelling reason for most consumers to get an eye tracker. It has to provide a new experience with a clear advantage and value, doing something faster, easier or in a way that couldn't be done before. 2) Expensive hardware: these are professional devices manufactured in low volume using high-quality, expensive components. 3) No guarantees: they don't work for all customers in all environments. How do you sell something that only works under specific conditions for, say, 90% of the customers?
Labels: 3D, modalities
Eye HDR: gaze-adaptive system for displaying high-dynamic-range images (Rahardja et al.)
"How can high dynamic range (HDR) images like those captured by human vision be most effectively reproduced? Susanto Rahardja, head of the Signal Processing Department at the A*STAR Institute for Infocomm Research (I2R), hit upon the idea of simulating the human brain’s mechanism for HDR vision. “We thought about developing a dynamic display system that could naturally and interactively adapt as the user’s eyes move around a scene, just as the human visual system changes as our eyes move around a real scene,” he says.
Two years ago, Rahardja initiated a program on HDR display bringing together researchers with a variety of backgrounds. “We held a lot of brainstorming sessions to discuss how the human visual system perceives various scenes with different levels of brightness,” says Farzam Farbiz, a senior research fellow of the Signal Processing Department. They also read many books on cerebral physiology to understand how receptors in the retina respond to light and convert the data into electric signals, which are then transmitted to retinal ganglion cells and other neural cells through complex pathways in the visual cortex.
The EyeHDR system employs a commercial eye-tracker device that follows the viewer’s eyes and records the eyes’ reflection patterns. Using this data, the system calculates and determines the exact point of the viewer’s gaze on the screen using special ‘neural network’ algorithms the team has developed.
“On top of that, we also had to simulate the transitional latency of human eyes,” says Corey Manders, a senior research fellow of the Signal Processing Department. “When you move your gaze from a dark part of the room to a bright window, our eyes take a few moments to adjust before we can see clearly what’s outside,” adds Zhiyong Huang, head of the Computer Graphics and Interface Department. “This is our real natural experience, and our work is to reproduce this on-screen.”
The EyeHDR system calculates the average luminance of the region where the observer is gazing, and adjusts the intensity and contrast to optimal levels with a certain delay, giving the viewer the impression of a real scene. The system also automatically tone-maps the HDR images to low dynamic range (LDR) images in regions outside of the viewer's gaze. Ultimately, the EyeHDR system generates multiple images in response to the viewer's gaze, which contrasts with previous attempts to achieve HDR through the generation of a single, perfect HDR display image.
The researchers say development of the fundamental technologies for the system is close to complete, and the EyeHDR system’s ability to display HDR images on large LDR screens has been confirmed. But before the system can become commercially available, the eye-tracking devices will need to be made more accurate, robust and easier to use. As the first step toward commercialization, the team demonstrated the EyeHDR system at SIGGRAPH Asia 2009, an annual international conference and exhibition on digital content, held in Yokohama, Japan in December last year.
Although the team’s work is currently focused on static images, they have plans for video. “We would like to apply our technologies for computer gaming and other moving images in the future. We are also looking to reduce the realism gap between real and virtual scenes in emergency response simulation, architecture and science,” Farbiz says". (source)
- Susanto Rahardja, Farzam Farbiz, Corey Manders, Huang Zhiyong, Jamie Ng Suat Ling, Ishtiaq Rasool Khan, Ong Ee Ping, and Song Peng. 2009. Eye HDR: gaze-adaptive system for displaying high-dynamic-range images. In ACM SIGGRAPH ASIA 2009 Art Gallery & Emerging Technologies: Adaptation (SIGGRAPH ASIA '09). ACM, New York, NY, USA, 68. DOI=10.1145/1665137.1665187 (pdf; it's a one-page poster)
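The mechanism described above (average luminance around the gaze point driving a delayed exposure adjustment, with tone mapping elsewhere) is easy to prototype. Here is a minimal sketch assuming a linear float RGB HDR image; the Reinhard-style global operator, the window size and the lag constant are my choices, not the EyeHDR team's algorithm:

```python
import numpy as np

def gaze_adaptive_tonemap(hdr, gaze_xy, radius=64, key=0.18,
                          prev_exposure=None, latency=0.9):
    """Tone-map a linear float RGB HDR image keyed to the region around the
    gaze point, with an exponential lag mimicking the eye's adaptation
    latency. Returns the 8-bit frame and the exposure to carry forward."""
    x, y = gaze_xy
    h, w = hdr.shape[:2]
    y0, y1 = max(0, y - radius), min(h, y + radius)
    x0, x1 = max(0, x - radius), min(w, x + radius)
    region = hdr[y0:y1, x0:x1]
    # log-average luminance of the gazed region drives the exposure
    lum = (0.2126 * region[..., 0] + 0.7152 * region[..., 1]
           + 0.0722 * region[..., 2])
    log_avg = np.exp(np.mean(np.log(lum + 1e-6)))
    exposure = key / log_avg
    # adaptation latency: blend towards the new exposure across frames
    if prev_exposure is not None:
        exposure = latency * prev_exposure + (1 - latency) * exposure
    scaled = hdr * exposure
    ldr = scaled / (1.0 + scaled)            # Reinhard-style global operator
    return (255 * np.clip(ldr, 0, 1)).astype(np.uint8), exposure
```

Called once per frame with the current gaze sample, the carried-over exposure makes the image brighten or darken gradually as the gaze moves between dark and bright regions, which is the effect the quoted passage describes.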
Labels: attentive interface, hci, inspiration, prototype