Tuesday, April 26, 2011

Development of a head-mounted, eye-tracking system for dogs (Williams et al., 2011)

Fiona Williams, Daniel Mills and Kun Guo at the University of Lincoln have developed a head-mounted eye-tracking system for our four-legged friends. Using a special mount based on a head strap and a muzzle, the device sits on the dog's head, where a dichroic mirror placed in front of one of the eyes reflects the IR image of the eye back to the camera.


The device was adapted from a VisionTrack system by IScan/Polhemus and contains two miniature cameras, one for the eye and one for the scene, connected to a host workstation. When used with human subjects, such a setup provides 0.3° of accuracy according to the manufacturer. Williams et al. obtained an accuracy of 2–3° from a single dog using a special calibration method consisting of five points arranged on a cross mounted at the tip of the muzzle. Using positive reinforcement, the dog was gradually trained to wear the device and to fixate the targets, which I'm sure wasn't an easy task.
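
As background on what those five fixation targets are for: video-based trackers typically use the calibration points to fit a regression from pupil position in the eye-camera image to gaze position in the scene-camera image. Here is a minimal numpy sketch of that generic step, with made-up coordinates for a five-point cross layout; it illustrates the common technique, not the authors' actual implementation.

import numpy as np

def fit_gaze_mapping(pupil_xy, scene_xy):
    """Least-squares affine map from eye-camera to scene-camera coordinates."""
    A = np.column_stack([np.ones(len(pupil_xy)), pupil_xy])  # rows: [1, x, y]
    coeffs, *_ = np.linalg.lstsq(A, scene_xy, rcond=None)
    return coeffs  # shape (3, 2): one column per scene axis

def map_gaze(coeffs, pupil_xy):
    """Map measured pupil coordinates to estimated gaze points in the scene."""
    return np.column_stack([np.ones(len(pupil_xy)), pupil_xy]) @ coeffs

# Made-up pupil positions recorded while fixating five targets on a cross
# (centre, left, right, up, down), and their scene-camera positions:
pupil = np.array([[320, 240], [280, 238], [360, 242], [318, 200], [322, 280]], float)
scene = np.array([[512, 384], [312, 384], [712, 384], [512, 184], [512, 584]], float)
coeffs = fit_gaze_mapping(pupil, scene)
print(map_gaze(coeffs, np.array([[330.0, 250.0]])))  # estimated gaze point

With only five points, an affine fit keeps the problem comfortably overdetermined; higher-order polynomial terms are usually added only when more calibration targets are available.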


Abstract:
Growing interest in canine cognition and visual perception has promoted research into the allocation of visual attention during free-viewing tasks in the dog. The techniques currently available to study this (i.e. preferential looking) have, however, lacked spatial accuracy, permitting only gross judgements of the location of the dog’s point of gaze and are limited to a laboratory setting. Here we describe a mobile, head-mounted, video-based, eye-tracking system and a procedure for achieving standardised calibration allowing an output with accuracy of 2–3°. The setup allows free movement of dogs; in addition the procedure does not involve extensive training skills, and is completely non-invasive. This apparatus has the potential to allow the study of gaze patterns in a variety of research applications and could enhance the study of areas such as canine vision, cognition and social interactions.

  • Fiona J. Williams, Daniel S. Mills, Kun Guo, Development of a head-mounted, eye-tracking system for dogs, Journal of Neuroscience Methods, Volume 194, Issue 2, 15 January 2011, Pages 259-265, ISSN 0165-0270, DOI: 10.1016/j.jneumeth.2010.10.022. (available from ScienceDirect)

Wednesday, April 20, 2011

Fraunhofer CMOS-OLED head-mounted display with integrated eye tracker

"The Fraunhofer IPMS works on the integration of sensors and microdisplays on CMOS backplane for several years now. For example the researchers have developed a bidirectional microdisplay, which could be used in Head-Mounted Displays (HMD) for gaze triggered augmented-reality (AR) aplications. The chips contain both an active OLED matrix and therein integrated photodetectors. The combination of both matrixes in one chip is an essential possibility for system integrators to design smaller, lightweight and portable systems with both functionalities." (Press release)
"Rigo Herold, PhD student at Fraunhofer IPMS and participant of the development team, declares: This unique device enables the design of a new generation of small AR-HMDs with advanced functionality. The OLED microdisplay based Eyetracking HMD enables the user on the one hand to overlay the view of the real world with virtual contents, for example to watch videos at jog. And on the other hand the user can select the next video triggered only by his gaze without using his hands." (Press release)

The sensor integrates both an OLED display and a CMOS image sensor.
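
Neither press release says how the gaze-based selection is implemented. A common interaction technique for this kind of hands-free trigger is dwell-time selection, where an item activates once gaze rests on it long enough; the toy Python sketch below shows that idea, with the class name and the 0.8 s threshold being my own assumptions.

import time

class DwellSelector:
    """Toy dwell-time trigger: fires when gaze stays on one item long enough."""
    def __init__(self, dwell_seconds=0.8):  # hypothetical threshold
        self.dwell = dwell_seconds
        self.current = None  # item currently under gaze
        self.since = 0.0     # time the gaze arrived on it

    def update(self, item, now=None):
        """Feed the item under gaze each frame; returns the item on selection."""
        now = time.monotonic() if now is None else now
        if item != self.current:
            self.current, self.since = item, now  # gaze moved: restart timer
            return None
        if item is not None and now - self.since >= self.dwell:
            self.since = now  # re-arm so a long stare doesn't fire repeatedly
            return item
        return None

# Simulated gaze samples with explicit timestamps (seconds):
selector = DwellSelector()
for t, item in [(0.0, "video2"), (0.5, "video2"), (0.9, "video2")]:
    if (chosen := selector.update(item, now=t)) is not None:
        print("play", chosen)  # fires at t=0.9 after 0.9 s on the same item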

Rigo Herold will present the system at the SID 2011 exhibitor forum on May 17, 2011, at 4:00 p.m.: "Eyecatcher: The Bi-Directional OLED Microdisplay", with the following specs:
  • Monochrome 
  • Special Eyetracking-Algorithm for HMDs based on bidirectional microdisplays
  • Front brightness: > 1500 cd/m²

The poster was presented at the ISSCC 2011 Industry Demonstration Session (IDS).

In addition, there is a paper titled "Bidirectional OLED microdisplay: Combining display and image sensor functionality into a monolithic CMOS chip", published with the following abstract:

"Microdisplays based on organic light-emitting diodes (OLEDs) achieve high optical performance with excellent contrast ratio and large dynamic range at low power consumption. The direct light emission from the OLED enables small devices without additional backlight, making them suitable for mobile near-to-eye (NTE) applications such as viewfinders or head-mounted displays (HMD). In these applications the microdisplay acts typically as a purely unidirectional output device [1–3]. With the integration of an additional image sensor, the functionality of the microdisplay can be extended to a bidirectional optical input/output device. The major aim is the implementation of eye-tracking capabilities in see-through HMD applications to achieve gaze-based human-display-interaction." Available at IEEE Xplore

Monday, April 18, 2011

AutomotiveUI'11 - 3rd International Conference On Automotive User Interfaces and Interactive Vehicular Applications

"In-car interactive technology is becoming ubiquitous and cars are increasingly connected to the outside world. Drivers and passengers use this technology because it provides valuable services. Some technology, such as collision warning systems, assists drivers in performing their primary in-vehicle task (driving). Other technology provides information on myriad subjects or offers entertainment to the driver and passengers.

The challenge that arises from the proliferation of in-car devices is that they may distract drivers from the primary task of driving, with possibly disastrous results. Thus, one of the major goals of this conference is to explore ways in which in-car user interfaces can be designed so as to lessen driver distraction while still enabling valuable services. This is challenging, especially given that the design of in-car devices, which was historically the responsibility of car manufacturers and their parts suppliers, is now a responsibility shared among a large and ever-changing group of parties. These parties include car OEMs, Tier 1 and Tier 2 suppliers of factory-installed electronics, as well as the manufacturers of hardware and software that is brought into the car, for example on personal navigation devices, smartphones, and tablets.

As we consider driving safety, our focus in designing in-car user interfaces should not be purely on eliminating distractions. In-car user interfaces also offer the opportunity to improve the driver's performance, for example by increasing her awareness of upcoming hazards. They can also enhance the experience of all kinds of passengers in the car. To this end, a further goal of AutomotiveUI 2011 is the exploration of in-car interfaces that address the varying needs of different types of users (including disabled drivers, elderly drivers or passengers, and the users of rear-seat entertainment systems). Overall our goal is to advance the state of the art in vehicular user experiences, in order to make cars both safer and more enjoyable places to spend time." http://www.auto-ui.org

Topics include, but are not limited to:
* new concepts for in-car user interfaces
* multimodal in-car user interfaces
* in-car speech and audio user interfaces
* text input and output while driving
* multimedia interfaces for in-car entertainment
* evaluation and benchmarking of in-car user interfaces
* assistive technology in the vehicular context
* methods and tools for automotive user interface research
* development methods and tools for automotive user interfaces
* automotive user interface frameworks and toolkits
* detecting and estimating user intentions
* detecting/measuring driver distraction and estimating cognitive load
* biometrics and physiological sensors as a user interface component
* sensors and context for interactive experiences in the car
* user interfaces for information access (search, browsing, etc.) while driving
* user interfaces for navigation or route guidance
* applications and user interfaces for inter-vehicle communication
* in-car gaming and entertainment
* different user groups and user group characteristics
* in-situ studies of automotive user interface approaches
* general automotive user experience research
* driving safety research using real vehicles and simulators
* subliminal techniques for workload reduction

SUBMISSIONS
AutomotiveUI 2011 invites submissions in the following categories:

* Papers (Submission Deadline: July 11th, 2011)
* Workshops (Submission Deadline: July 25th, 2011)
* Posters & Interactive Demos (Submission Deadline: Oct. 10th, 2011)
* Industrial Showcase (Submission Deadline: Oct. 10th, 2011)

For more information on the submission categories please check http://www.auto-ui.org/11/submit.php

Thursday, April 7, 2011

FaceAPI signs licence deal with Chinese SuperD

Remember the glasses-free 3D displays demonstrated earlier this year at CES 2011? Seeing Machines recently announced a production licence deal with Shenzhen Super Perfect Optics Limited (SuperD) of China. The two companies have been working together for the last 12 months, and the first consumer products are expected to be available during the summer. The ambition is big: millions of devices, including laptops, monitors and all-in-one PCs from big-name manufacturers. An interesting development, as they know eye tracking too; please make that happen. Press release available here.

SMI iView X SDK 3.0 released

SMI just released version 3.0 of their Software Development Kit (SDK), which contains low- and high-level functions, documentation and sample code (Matlab, E-Prime, C/C++, Python and C#). The SDK supports Windows XP, Vista and 7 (both 32- and 64-bit) and is available for free to existing customers. Good news for developers, especially the 64-bit support for Windows 7. Releasing extensive, well-documented SDKs for free is a trend that most manufacturers have adopted by now; it just makes perfect sense.

Monday, March 14, 2011

Mirametrix acquired by TandemLaunch Technologies

MONTREAL (Quebec), February 18, 2011 – TandemLaunch Technologies today announced that it has completed the acquisition of all assets and staff of Vancouver-based Mirametrix Research Inc., a privately held provider of gaze tracking technology. Mirametrix is a technology company offering affordable gaze tracking systems for application in vision research and content analytics. The technology acquired through Mirametrix complements TandemLaunch’s consumer gaze tracking portfolio. Terms of the transaction were not disclosed.
“Mirametrix is an innovative small company that has successfully introduced gaze tracking solutions for cost-competitive applications. TandemLaunch offers the resources to scale the Mirametrix business and ultimately bring gaze tracking into the consumer market” said Helge Seetzen, CEO of TandemLaunch.
The website has been updated, revealing the new executive team; the product offering appears to remain the same for the time being. Helge Seetzen (blog) is an entrepreneur who sold his previous company, BrightSide, to Dolby for ~$30 million, which he invested into TandemLaunch, an incubator focused on the early stages of technology development with the aim of bringing in industry partners to acquire the technology for further commercialization (interview).

Congrats to Craig Hennessey, founder of Mirametrix, who is now well on his way to commercializing his PhD research (1, 2) on remote eye tracking based on a single-camera setup using the bright pupil and corneal reflections. It will be interesting to see how the additional resources backing the operation will affect the industry, and what role an affordable but perhaps less accurate system has to play. Accuracy can be improved upon, but what about the market: will an affordable system expand existing segments or create new ones? Off the top of my head, yes. Time will tell.
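
For readers who haven't met the technique: in pupil-centre/corneal-reflection (PCCR) tracking, gaze is estimated from the vector between the detected pupil centre and the glint that an IR illuminator produces on the cornea, which makes the estimate fairly robust to small head translations. A minimal sketch of that feature extraction step, as a generic illustration rather than Hennessey's published method:

import numpy as np

def pccr_vector(pupil_xy, glint_xy):
    """Pupil-centre minus corneal-reflection vector, in image pixels."""
    return np.asarray(pupil_xy, float) - np.asarray(glint_xy, float)

# Made-up feature positions from one camera frame:
v = pccr_vector(pupil_xy=(412.0, 300.5), glint_xy=(405.2, 296.1))
print(v)  # [6.8 4.4]; this vector, rather than the raw pupil position,
          # is then fed through a calibrated mapping such as the affine
          # fit sketched in the dog eye-tracker post above.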

Thursday, March 3, 2011

CUShop concept @ Clemson University

"Clemson University architecture students are working with the packaging science department in designing an eye tracking lab to be a fully immersive grocery store shopping experience. This concept explores the entrance into the lab through a vestibule space created by two sliding glass doors, mimicking the space found in many grocery stores."

Wednesday, March 2, 2011

Accurate eye center localisation for low-cost eye tracking

Fabian Timm from the Institute for Neuro- and Bioinformatics at the University of Lübeck demonstrates a "novel approach for accurate localisation of the eye centres (pupil) in real time. In contrast to other approaches, we neither employ any kind of machine learning nor a model scheme - we just compute dot products! Our method computes very accurate estimations and can therefore be used in real world applications such as eye (gaze) tracking." Sounds great, any ideas on gaze estimation and accuracy?
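
From the description, this sounds like a gradient-based objective: the eye centre is the point whose displacement vectors to the surrounding pixels best align with the image gradients at those pixels, with alignment measured by squared dot products (strong agreement occurs all along the circular pupil/iris boundary). Below is a brute-force numpy sketch of my reading of that objective, not the author's optimised real-time code.

import numpy as np

def eye_center(gray):
    """Estimate (row, col) of the eye centre in a small grayscale eye patch."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    mask = mag > mag.mean()  # keep only strong gradients (pupil/iris edges)
    ys, xs = np.nonzero(mask)
    gxn, gyn = gx[mask] / mag[mask], gy[mask] / mag[mask]
    best, best_score = (0, 0), -1.0
    for cy in range(gray.shape[0]):
        for cx in range(gray.shape[1]):
            dy, dx = ys - cy, xs - cx
            norm = np.hypot(dx, dy)
            norm[norm == 0] = 1.0  # skip the candidate pixel itself
            # Mean squared dot product of unit displacements and gradients:
            score = np.mean((dx / norm * gxn + dy / norm * gyn) ** 2)
            if score > best_score:
                best_score, best = score, (cy, cx)
    return best

If I remember the paper correctly, it adds refinements on top of this core, such as weighting candidate centres by how dark they are (pupils are dark) and postprocessing to suppress responses from eyebrows and glasses; the sketch keeps only the dot-product objective the quote alludes to. Note that pupil localisation alone is not yet gaze estimation, which is presumably why the question of overall accuracy remains open.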