"In-car interactive technology is becoming ubiquitous and cars are increasingly connected to the outside world. Drivers and passengers use this technology because it provides valuable services. Some technology, such as collision warning systems, assists drivers in performing their primary in-vehicle task (driving). Other technology provides information on myriad subjects or offers entertainment to the driver and passengers.
The challenge that arises from the proliferation of in-car devices is that they may distract drivers from the primary task of driving, with possibly disastrous results. Thus, one of the major goals of this conference is to explore ways in which in-car user interfaces can be designed so as to lessen driver distraction while still enabling valuable services. This is challenging, especially given that the design of in-car devices, which was historically the responsibility of car manufacturers and their parts suppliers, is now a responsibility shared among a large and ever-changing group of parties. These parties include car OEMs, Tier 1 and Tier 2 suppliers of factory-installed electronics, as well as the manufacturers of hardware and software that is brought into the car, for example on personal navigation devices, smartphones, and tablets.
As we consider driving safety, our focus in designing in-car user interfaces should not be purely on eliminating distractions. In-car user interfaces also offer the opportunity to improve the driver's performance, for example by increasing her awareness of upcoming hazards. They can also enhance the experience of all kinds of passengers in the car. To this end, a further goal of AutomotiveUI 2011 is the exploration of in-car interfaces that address the varying needs of different types of users (including disabled drivers, elderly drivers or passengers, and the users of rear-seat entertainment systems). Overall our goal is to advance the state of the art in vehicular user experiences, in order to make cars both safer and more enjoyable places to spend time." http://www.auto-ui.org
Topics include, but are not limited to:
* new concepts for in-car user interfaces
* multimodal in-car user interfaces
* in-car speech and audio user interfaces
* text input and output while driving
* multimedia interfaces for in-car entertainment
* evaluation and benchmarking of in-car user interfaces
* assistive technology in the vehicular context
* methods and tools for automotive user interface research
* development methods and tools for automotive user interfaces
* automotive user interface frameworks and toolkits
* detecting and estimating user intentions
* detecting/measuring driver distraction and estimating cognitive load
* biometrics and physiological sensors as a user interface component
* sensors and context for interactive experiences in the car
* user interfaces for information access (search, browsing, etc.) while driving
* user interfaces for navigation or route guidance
* applications and user interfaces for inter-vehicle communication
* in-car gaming and entertainment
* different user groups and user group characteristics
* in-situ studies of automotive user interface approaches
* general automotive user experience research
* driving safety research using real vehicles and simulators
* subliminal techniques for workload reduction
SUBMISSIONS
AutomotiveUI 2011 invites submissions in the following categories:
* Papers (Submission Deadline: July 11th, 2011)
* Workshops (Submission Deadline: July 25th, 2011)
* Posters & Interactive Demos (Submission Deadline: Oct. 10th, 2011)
* Industrial Showcase (Submission Deadline: Oct. 10th, 2011)
For more information on the submission categories please check http://www.auto-ui.org/11/submit.php
Thursday, April 7, 2011
FaceAPI signs licence deal with Chinese SuperD
Remember the glasses-free 3D displays demonstrated earlier this year at CES 2011? Seeing Machines recently announced a production licence deal with Shenzhen Super Perfect Optics Limited (SuperD) of China. The two companies have been working together for the last 12 months, and the first consumer products are expected to be available during the summer. The ambition is big: millions of devices, including laptops, monitors, and all-in-one PCs from big-name manufacturers. An interesting development, as they know eye tracking too. Please make that happen. Press release available here.
SMI iView X SDK 3.0 released
SMI just released version 3.0 of their Software Development Kit (SDK), which contains low- and high-level functions, documentation, and sample code (MATLAB, E-Prime, C/C++, Python, and C#). The SDK supports Windows XP, Vista, and 7 (both 32- and 64-bit) and is available for free to existing customers. Good news for developers, especially the 64-bit version for Windows 7. Releasing extensive, well-documented SDKs for free is a trend that most manufacturers have adopted by now; it just makes perfect sense.
Monday, March 14, 2011
Mirametrix acquired by TandemLaunch Technologies
MONTREAL (Quebec), February 18, 2011 – TandemLaunch Technologies today announced that it has completed the acquisition of all assets and staff of Vancouver-based Mirametrix Research Inc., a privately held provider of gaze tracking technology. Mirametrix is a technology company offering affordable gaze tracking systems for application in vision research and content analytics. The technology acquired through Mirametrix complements TandemLaunch’s consumer gaze tracking portfolio. Terms of the transaction were not disclosed.
“Mirametrix is an innovative small company that has successfully introduced gaze tracking solutions for cost-competitive applications. TandemLaunch offers the resources to scale the Mirametrix business and ultimately bring gaze tracking into the consumer market,” said Helge Seetzen, CEO of TandemLaunch.
The website has been updated to reveal the new executive team; the product offering appears to remain the same for the time being. Helge Seetzen (blog) is an entrepreneur who sold his previous company, BrightSide, to Dolby for roughly $30 million, which he invested in the TandemLaunch incubator. TandemLaunch focuses on the early stages of technology development, with the aim of bringing in industry partners to acquire the technology for further commercialization (interview).
Congrats to Craig Hennessey, founder of Mirametrix, who is now well on his way to commercializing his PhD research (1, 2) on remote eye tracking based on a single-camera setup, bright pupil, and corneal reflections (a rough sketch of the bright-pupil idea follows below). It will be interesting to see how the additional resources backing the operation will affect the industry, and what role an affordable but perhaps less accurate system has to play. The accuracy can be improved upon, but what about the market: will an affordable system expand existing segments or create new ones? Off the top of my head, yes. Time will tell.
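To unpack the bright-pupil bit: here is a minimal sketch of bright-pupil differencing, assuming the usual two-illuminator arrangement. This is my illustration of the general technique, not Hennessey's or Mirametrix's implementation, and all names and the threshold value are mine. With on-axis infrared illumination the retina retroreflects and the pupil glows; with off-axis illumination it stays dark; subtracting a synchronized pair of frames isolates the pupil from everything else in the image.

```python
import numpy as np

# Sketch of bright-pupil differencing (illustrative, not Mirametrix code):
# subtract the off-axis (dark-pupil) frame from the on-axis (bright-pupil)
# frame and take the centroid of the remaining bright blob as the pupil.

def pupil_from_difference(bright_frame, dark_frame, thresh=40):
    """Return the pupil centroid (row, col), or None if nothing is found."""
    diff = bright_frame.astype(int) - dark_frame.astype(int)
    mask = diff > thresh              # the pupil dominates the difference image
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return ys.mean(), xs.mean()       # blob centroid
```

The vector from this pupil centre to the corneal reflection is what typically drives the actual gaze estimate.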
Thursday, March 3, 2011
CUShop concept @ Clemson University
"Clemson University architecture students are working with the packaging science department in designing an eye tracking lab to be a fully immersive grocery store shopping experience. This concept explores the entrance into the lab through a vestibule space created by two sliding glass doors, mimicking the space found in many grocery stores."
Wednesday, March 2, 2011
Accurate eye center localisation for low-cost eye tracking
Fabian Timm from the Institute for Neuro- and Bioinformatics at the University of Lübeck demonstrates a "novel approach for accurate localisation of the eye centres (pupil) in real time. In contrast to other approaches, we neither employ any kind of machine learning nor a model scheme - we just compute dot products! Our method computes very accurate estimations and can therefore be used in real world applications such as eye (gaze) tracking." Sounds great; any ideas on gaze estimation and accuracy? A rough sketch of the dot-product idea is below.
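As I read the description, the objective is this: the eye centre is the point from which unit displacement vectors to all other pixels best align with the unit image gradients, which point radially at the circular pupil and iris boundaries. The brute-force sketch below is mine, not the authors' implementation, and it omits the gradient thresholding and darkness weighting a practical version would add for speed and robustness.

```python
import numpy as np

def eye_center(gray):
    """Brute-force search for the point maximising the mean squared dot
    product between normalised displacement vectors and the normalised
    image gradients (cf. the dot-product objective described above)."""
    g = gray.astype(float)
    gy, gx = np.gradient(g)
    mag = np.hypot(gx, gy) + 1e-9           # avoid division by zero
    gx, gy = gx / mag, gy / mag             # unit gradient vectors
    rows, cols = np.indices(g.shape)
    best_score, best_c = -1.0, (0, 0)
    for r in range(g.shape[0]):
        for c in range(g.shape[1]):
            dr, dc = rows - r, cols - c     # displacement to every pixel
            norm = np.hypot(dr, dc)
            norm[r, c] = 1.0                # the centre pixel itself
            dot = (dc * gx + dr * gy) / norm
            score = np.mean(dot ** 2)       # mean squared d^T g
            if score > best_score:
                best_score, best_c = score, (r, c)
    return best_c                           # (row, col) of the eye centre
```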
Head-mounted eye-tracking application for driving
For his master's thesis, Nicolas Schneider has modified the ITU Gaze Tracker for eye tracking in an automotive setting. The modification adds a scene camera, along with software that calibrates it and integrates it into the platform. The project was carried out at the Schepens Eye Research Institute at Harvard, and there is a good chance it will be released open source. A fine piece of work and an awesome addition to the framework; we're impressed by the results. More info to follow; for now, enjoy this video.
- Nicolas Schneider, Peter Bex, Erhardt Barth, and Michael Dorr. 2011. An open-source low-cost eye-tracking system for portable real-time and offline tracking. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 8, 4 pages. (Full text: PDF online)
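For a sense of how such a scene-camera integration typically works, here is a hedged sketch of the standard regression approach; it shows the general technique, not Schneider's actual code, and the function names and choice of a second-order polynomial are my assumptions. The wearer fixates a handful of points whose pixel positions are known in the scene video, and a mapping from eye-camera pupil coordinates to scene-camera pixels is fitted by least squares.

```python
import numpy as np

# Illustrative calibration for a head-mounted tracker with a scene camera:
# fit a second-order polynomial from pupil positions (eye camera) to gaze
# points (scene camera) using N >= 6 calibration fixations.

def design_matrix(px, py):
    """Second-order polynomial terms of the pupil position."""
    return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

def fit_mapping(pupil_xy, scene_xy):
    """Least-squares fit; pupil_xy and scene_xy are (N, 2) arrays."""
    A = design_matrix(pupil_xy[:, 0], pupil_xy[:, 1])
    coeffs, *_ = np.linalg.lstsq(A, scene_xy, rcond=None)
    return coeffs                       # shape (6, 2): one column per axis

def map_gaze(pupil_xy, coeffs):
    """Map pupil positions to scene-camera pixel coordinates."""
    A = design_matrix(pupil_xy[:, 0], pupil_xy[:, 1])
    return A @ coeffs
```

After the fit, every pupil detection in the eye camera can be overlaid on the corresponding scene-video frame as a gaze cursor.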
Wednesday, February 16, 2011
A self-calibrating, camera-based eye tracker for the recording of rodent eye movements (Zoccolan et al., 2010)
Came across an interesting methods article in Frontiers in Neuroscience, published in late November last year, describing the development of a fully automated eye-tracking system that is calibrated without requiring co-operation from the subject. This is done by fixing the location of the eye and moving the camera to establish a geometric model (also see Stahl et al., 2000, 2004). Apparently the authors first attempted to use a commercial EyeLink II device but found it unsuitable for rodent eye tracking due to its thresholding implementation, the illumination conditions, and corneal reflection tracking that failed when the rodent was chewing. So they built their own solution using a Prosilica camera and a set of algorithms (see the figure captions below). Read the paper for implementation details. I find it to be a wonderful piece of work; different from human eye tracking for sure, but still relevant and fascinating.
Abstract:
"Much of neurophysiology and vision science relies on careful measurement of a human or animal subject’s gaze direction. Video-based eye trackers have emerged as an especially popular option for gaze tracking, because they are easy to use and are completely non-invasive. However, video eye trackers typically require a calibration procedure in which the subject must look at a series of points at known gaze angles. While it is possible to rely on innate orienting behaviors for calibration in some non-human species, other species, such as rodents, do not reliably saccade to visual targets, making this form of calibration impossible. To overcome this problem, we developed a fully automated infrared video eye-tracking system that is able to quickly and accurately calibrate itself without requiring co-operation from the subject. This technique relies on the optical geometry of the cornea and uses computer-controlled motorized stages to rapidly estimate the geometry of the eye relative to the camera. The accuracy and precision of our system was carefully measured using an artificial eye, and its capability to monitor the gaze of rodents was verified by tracking spontaneous saccades and evoked oculomotor reflexes in head-fixed rats (in both cases, we obtained measurements that are consistent with those found in the literature). Overall, given its fully automated nature and its intrinsic robustness against operator errors, we believe that our eye-tracking system enhances the utility of existing approaches to gaze-tracking in rodents and represents a valid tool for rodent vision studies."
[Figures: schematic diagram of the eye-tracking system; illustration of the algorithm to track the eye's pupil and corneal reflection spot; eye coordinate system and measurements; horizontal and vertical alignment of the eye with the center of the camera's sensor.]
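To make the camera-motion calibration concrete, here is a minimal sketch of the Stahl-style geometry the authors build on, with synthetic numbers and my own variable names rather than the paper's code. Rotating the camera through known angles around the stationary eye makes the pupil image translate as x = x0 + Rp·sin(θ), so the effective rotation radius Rp falls out of a straight-line fit; the horizontal eye-in-head angle then follows from the pupil-to-corneal-reflection offset.

```python
import numpy as np

# Stahl-style calibration sketch (illustrative; synthetic numbers):
# swing the camera through known angles around a stationary eye, record
# the pupil's image position, and recover the effective rotation radius
# Rp as the slope of pupil position against sin(camera angle).

def fit_rotation_radius(camera_angles_deg, pupil_x_px):
    """Fit x = Rp * sin(theta) + x0 and return (Rp, x0) in pixels."""
    s = np.sin(np.radians(np.asarray(camera_angles_deg, dtype=float)))
    rp, x0 = np.polyfit(s, np.asarray(pupil_x_px, dtype=float), 1)
    return rp, x0

def horizontal_eye_angle_deg(pupil_x, cr_x, rp):
    """Eye-in-head angle from the pupil-CR offset, given Rp."""
    return np.degrees(np.arcsin((pupil_x - cr_x) / rp))

# Synthetic example: a 35 px rotation radius recovered from five sweeps.
angles = [-10, -5, 0, 5, 10]
xs = [120.0 + 35.0 * np.sin(np.radians(a)) for a in angles]
rp, _ = fit_rotation_radius(angles, xs)
print(round(rp, 1))                                           # ~35.0
print(round(horizontal_eye_angle_deg(126.0, 120.0, rp), 1))   # ~9.9 deg
```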
- Zoccolan D.F., Graham B.J., Cox D.D. (2010). A self-calibrating, camera-based eye tracker for the recording of rodent eye movements. Frontiers in Neuroscience. doi:10.3389/fnins.2010.00193 [link]