Showing posts with label eye tracker.

Thursday, September 5, 2013

Introducing The Eye Tribe Tracker

It's with great pride that I introduce the Eye Tribe Tracker today. It's the world's smallest remote tracker, the first to use USB 3.0, and the only one priced below $100. It's not targeted at the research community; instead it breaks new ground by aiming at developers of next-generation gaze interaction applications. I will let the academic crowd determine whether it meets their requirements. I'm too biased to claim that it's better than this or that; the only way to properly evaluate eye trackers is through standardized evaluation carried out by independent parties.


On a personal level, today marks an important milestone. I built my first gaze interaction software back in 2008, titled Neovisus, as the outcome of my MSc. at Lund University. During this work I realized that gaze interaction could be a natural interaction element, not just for a specific user group but for everyone. At the time, eye trackers were unfortunately really hard to come by; the one I used cost $25,000 (and still does). Javier San Agustin and I attempted to fix this during our R&D of the ITU GazeTracker, an open source eye tracker. In many ways we succeeded, but it lacked critical features: you had to order components to assemble your own rig, it was difficult to set up, and tracking was far from robust compared to commercial alternatives.

Overall, the ITU GazeTracker was a great learning experience. It evolved to become the most widely distributed open source eye tracking software and gathered an active community. At the same time, we learned what it would take to build something great: it would require us to focus and make a full-time commitment.

Here we are, two years later. With the launch of a truly affordable eye tracker we have taken a big step towards realizing the vision we burn for. No longer is there a prohibitive barrier preventing developers from exploring the many benefits eye tracking can bring to their applications.

Best of all, this is still the beginning. I can't wait to get this into the hands of all the developers who placed a $99 bet on the future.

Tech specs (preliminary)

Sampling rate: 40 Hz and 60 Hz modes
Accuracy: 0.5° (average)
Spatial resolution: 0.1° (RMS)
Latency: <20 ms at 60 Hz
Calibration: 5, 9 or 12 points
Operating range: 45 cm – 75 cm
Tracking area: 40 cm x 40 cm at 65 cm distance
Screen sizes: up to 24”
API/SDK: C++, C# and Java included
Data output: binocular gaze data
Dimensions (W/H/D): 20 x 1.9 x 1.6 cm (7.9 x 0.75 x 0.66 in)
Weight: 130 g
Connection: USB 3.0 SuperSpeed
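To put the accuracy figure into on-screen terms: a visual angle maps to a physical offset of distance times tan(angle), which the panel's pixel pitch converts to pixels. A back-of-envelope sketch (the ~0.277 mm pixel pitch of a 24" 1080p panel is my assumption, not part of the spec):

```python
import math

def angle_to_pixels(angle_deg, distance_cm, pixel_pitch_mm=0.277):
    """Convert a visual-angle error to on-screen pixels at a viewing distance."""
    offset_mm = distance_cm * 10 * math.tan(math.radians(angle_deg))
    return offset_mm / pixel_pitch_mm

# 0.5 deg average accuracy at 65 cm is roughly 20 px on a 24" 1080p screen
print(round(angle_to_pixels(0.5, 65)))
```

So a 0.5° average error keeps gaze targets of about 40 px diameter comfortably selectable, a useful rule of thumb when sizing elements in a gaze UI.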


Wednesday, July 13, 2011

LG introduces the world's first glasses-free 3D monitor with eye-tracking technology

Today LG announced a 20" LCD display with built-in "eye tracking" technology that enables glasses-free 3D imaging, moving this technology closer to the consumer market. As far as I can tell, the image below does not reveal any infrared illuminators, a requirement for all known high-accuracy systems, so it's probably more of a rough estimation system than a full-blown remote tracker. The best accuracy known from published research under natural light is about 3-4 degrees of visual angle; with their financial resources LG could potentially achieve better results.
Left: the "special" eye tracking camera sensor. It looks like a rather typical webcam CMOS sensor to me; unless they are doing some magic it will not allow accurate gaze estimation. Regardless, it makes me wonder whether 3D displays are the path by which eye tracking goes mainstream. Is this related to the collaboration between Seeing Machines and SuperD announced earlier this year, or is it a competing solution? Details are sparse; I'll keep you posted as more information becomes available.


Official press release:


SEOUL, July, 13, 2011 – LG Electronics (LG) today unveiled the world’s first glasses-free monitor utilizing eye-tracking technology to maintain an optimal 3D image from a range of viewing angles. The 20-inch D2000 (Korean model: DX2000) monitor was developed as a fully functional entertainment display capable of reproducing games, movies and images in all their realistic glory.

“With a full line-up of 3D TVs, laptops, projectors and smartphones, LG Electronics is by far and away the industry leader in all things 3D.” said Si-hwan Park, Vice President of the Monitor Division at LG’s Home Entertainment Company. “LG’s position has always been that 3D will and must eventually function without glasses. The D2000 is a look at what the future has in store.”

The D2000’s 3D effect comes courtesy of glasses-free parallax barrier 3D technology, and the application of the world’s first eye-tracking feature to the monitor. The combination of parallax barrier and eye-tracking in a single unit promises to open up new horizons for glasses-free 3D products.


Existing glasses-free 3D technologies generally require viewers to stay within a tightly restricted angle and distance to perceive the 3D images. However, the D2000 has done much to resolve this issue, allowing viewers much freer movement and more comfortable viewing. Eye tracking in the D2000 works via a special camera sensor attached to the monitor which detects changes in the user’s eye position in real-time. With this information, the monitor calculates the angle and position of the viewer and adjusts the displayed image for the optimal 3D effect.

In addition to playing back existing 3D content, the D2000 has a highly refined 2D to 3D conversion feature which adds a new dimension to existing movies and game playing.

The D2000, available in Korea this month, will be introduced in other markets around the world in the latter part of 2011.

Tuesday, June 7, 2011

Grinbath's EyeGuide

Texas-based Grinbath recently announced the EyeGuide head-mounted tracker. Its main competitive advantage is its low cost of $1,495, with academic discounts available ($1,179). The device captures eye images using a wireless camera, running on three AAA batteries, and streams these to a computer for processing. The package includes basic software for analysis and visualization. See the whitepaper for more information.

Monday, May 9, 2011

"Read my Eyes" - A presentation of the ITU Gaze Tracker

Over the last month the guys at the IT University of Copenhagen have been involved in the making of a video intended to introduce the ITU Gaze Tracker, an open source eye tracker, to a wider audience. The production was carried out in collaboration with the Communication Department at the university and features members of the group, students of the HCI class, and Birger Bergmann Jeppesen, who has had ALS since 1996. Many thanks to all involved, especially Birger & co for taking an interest and participating in the evaluation of the system.

Wednesday, April 27, 2011

Specs for SMI GazeWear released

The specifications for the SMI GazeWear have just been announced. The head-mounted tracker takes the shape of a pair of glasses and has an impressive set of features. It offers 30 Hz binocular tracking (both eyes) at 0.5 deg accuracy, with automatic parallax compensation for accurate gaze estimation over distances above 40 cm. The dark-pupil, corneal-reflection based system has a tracking range of 70° horizontal / 55° vertical angle. SMI has managed to squeeze in an HD scene camera located in the center of the frame which offers 1280x960 resolution at 30 frames per second; however, its viewing angle is slightly smaller than the tracking range at 63° horizontal and 41° vertical. The weight of the device is specified as 75 grams with dimensions of 173x58x168 mm (W/H/D), and it is estimated to fit subjects above age 7.

SMI GazeWear
A mobile recording unit is offered which stores data on an SD card, weighs 420 grams, and provides a minimum of 40 minutes of recording time. However, a subnotebook can be used to extend recording time towards two hours.

With the new tracker SMI seriously improves its offering in the head-mounted segment, with a form factor that certainly appears more attractive for a wide range of applications. The specs stand up well against the Tobii Glasses, which have a similar form but are limited to monocular tracking and a lower-resolution scene camera. No details on availability are provided other than "coming soon", something we have heard since late December. Once they are out, the game is on.

The flyer may be downloaded as pdf.

Tuesday, April 26, 2011

Development of a head-mounted, eye-tracking system for dogs (Williams et al, 2011)

Fiona Williams, Daniel Mills and Kun Guo at the University of Lincoln have developed a head-mounted eye tracking system for our four-legged friends. Using a special construct based on a head strap and a muzzle, the device was mounted on the head of the dog, where a dichroic mirror placed in front of one of the eyes reflects the IR image back to the camera.


The device was adapted from a VisionTrack system by ISCAN/Polhemus and contains two miniature cameras, one for the eye and one for the scene, connected to a host workstation. When used with human subjects, such a setup provides 0.3 deg of accuracy according to the manufacturer. Williams et al. obtained an accuracy of 2-3 deg from a single dog using a special calibration method with five points located on a cross mounted at the tip of the muzzle. Using positive reinforcement, the dog was gradually trained to wear the device and fixate the targets, which I'm sure wasn't an easy task.


Abstract:
Growing interest in canine cognition and visual perception has promoted research into the allocation of visual attention during free-viewing tasks in the dog. The techniques currently available to study this (i.e. preferential looking) have, however, lacked spatial accuracy, permitting only gross judgements of the location of the dog’s point of gaze and are limited to a laboratory setting. Here we describe a mobile, head-mounted, video-based, eye-tracking system and a procedure for achieving standardised calibration allowing an output with accuracy of 2–3°. The setup allows free movement of dogs; in addition the procedure does not involve extensive training skills, and is completely non-invasive. This apparatus has the potential to allow the study of gaze patterns in a variety of research applications and could enhance the study of areas such as canine vision, cognition and social interactions.

  • Fiona J. Williams, Daniel S. Mills, Kun Guo, Development of a head-mounted, eye-tracking system for dogs, Journal of Neuroscience Methods, Volume 194, Issue 2, 15 January 2011, Pages 259-265, ISSN 0165-0270, DOI: 10.1016/j.jneumeth.2010.10.022. (available from ScienceDirect)

Wednesday, April 20, 2011

Fraunhofer CMOS-OLED Headmounted display with integrated eye tracker

"The Fraunhofer IPMS works on the integration of sensors and microdisplays on CMOS backplane for several years now. For example the researchers have developed a bidirectional microdisplay, which could be used in Head-Mounted Displays (HMD) for gaze triggered augmented-reality (AR) applications. The chips contain both an active OLED matrix and therein integrated photodetectors. The combination of both matrixes in one chip is an essential possibility for system integrators to design smaller, lightweight and portable systems with both functionalities." (Press release)
"Rigo Herold, PhD student at Fraunhofer IPMS and participant of the development team, declares: This unique device enables the design of a new generation of small AR-HMDs with advanced functionality. The OLED microdisplay based Eyetracking HMD enables the user on the one hand to overlay the view of the real world with virtual contents, for example to watch videos at jog. And on the other hand the user can select the next video triggered only by his gaze without using his hands." (Press release)

Sensor integrates both OLED display and CMOS imaging sensor. 

Rigo Herold will present the system at the SID 2011 exhibitor forum on May 17, 2011 at 4:00 p.m. ("Eyecatcher: The Bi-Directional OLED Microdisplay") with the following specs:
  • Monochrome 
  • Special Eyetracking-Algorithm for HMDs based on bidirectional microdisplays
  • Front brightness: > 1500 cd/m²

A poster was presented at the ISSCC 2011 Industry Demonstration Session (IDS).

In addition, there is a paper titled "Bidirectional OLED microdisplay: Combining display and image sensor functionality into a monolithic CMOS chip", published with the following abstract:

"Microdisplays based on organic light-emitting diodes (OLEDs) achieve high optical performance with excellent contrast ratio and large dynamic range at low power consumption. The direct light emission from the OLED enables small devices without additional backlight, making them suitable for mobile near-to-eye (NTE) applications such as viewfinders or head-mounted displays (HMD). In these applications the microdisplay acts typically as a purely unidirectional output device [1–3]. With the integration of an additional image sensor, the functionality of the microdisplay can be extended to a bidirectional optical input/output device. The major aim is the implementation of eye-tracking capabilities in see-through HMD applications to achieve gaze-based human-display-interaction." Available at IEEE Xplore

Wednesday, March 2, 2011

Accurate eye center localisation for low-cost eye tracking

Fabian Timm from the Institute for Neuro- and Bioinformatics at the University of Lübeck demonstrates a "novel approach for accurate localisation of the eye centres (pupil) in real time. In contrast to other approaches, we neither employ any kind of machine learning nor a model scheme - we just compute dot products! Our method computes very accurate estimations and can therefore be used in real world applications such as eye (gaze) tracking." Sounds great. Any ideas on gaze estimation and accuracy?
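The core idea, as I read it, is that at the pupil centre the normalized displacement vectors towards strong image gradients align with those gradients, so the centre is the point maximizing the mean squared dot product between the two. A rough numpy sketch of that objective (my own simplification, not the authors' code; the brute-force search over candidate centres is only practical on small eye regions):

```python
import numpy as np

def eye_center(img):
    """Estimate the eye centre as the point maximizing the mean squared
    dot product between unit displacement vectors and unit gradients."""
    gy, gx = np.gradient(img.astype(float))   # per-axis image gradients
    mag = np.hypot(gx, gy)
    mask = mag > 0.3 * mag.max()              # keep only strong edges
    ys, xs = np.nonzero(mask)
    gxn, gyn = gx[mask] / mag[mask], gy[mask] / mag[mask]
    best, best_score = (0, 0), -1.0
    for cy in range(img.shape[0]):            # brute-force candidate centres
        for cx in range(img.shape[1]):
            dx, dy = xs - cx, ys - cy
            norm = np.hypot(dx, dy)
            ok = norm > 0                     # skip the candidate pixel itself
            d = (dx[ok] * gxn[ok] + dy[ok] * gyn[ok]) / norm[ok]
            if np.mean(d ** 2) > best_score:
                best_score, best = np.mean(d ** 2), (cx, cy)
    return best
```

On a synthetic dark disc against a bright background this recovers the disc centre to within a pixel or two; the paper adds intensity weighting and post-processing that this sketch omits.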

Head-mounted eye-tracking application for driving

For his master's thesis, Nicolas Schneider has modified the ITU Gaze Tracker for eye tracking in an automotive setting. The work incorporates a scene camera and software that calibrates and integrates it into the platform. The project was carried out at the Schepens Eye Research Institute at Harvard, and there is a good chance it will be released as open source. A fine piece of work and an awesome addition to the framework; we're impressed by the results. More info to follow; for now, enjoy this video.



  • Nicolas Schneider, Peter Bex, Erhardt Barth, and Michael Dorr. 2011. An open-source low-cost eye-tracking system for portable real-time and offline tracking. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 8, 4 pages. (Full text: PDF Online)


Wednesday, February 16, 2011

A self-calibrating, camera-based eye tracker for the recording of rodent eye movements (Zoccolan et al, 2010)

Came across an interesting methods article in "Frontiers in Neuroscience", published in late November last year, which involves the development of a fully automated eye-tracking system that is calibrated without requiring co-operation from the subject. This is done by fixing the location of the eye and moving the camera to establish a geometric model (also see Stahl et al., 2000, 2004). Apparently the authors first attempted to use a commercial EyeLink II device but found it unsuitable for rodent eye tracking due to its thresholding implementation, the illumination conditions, and corneal reflection tracking failing when the rodent was chewing. So they built their own solution using a Prosilica camera and a set of algorithms (depicted below). Read the paper for implementation details. I find it to be a wonderful piece of work, different from human eye tracking for sure, but still relevant and fascinating.

Figures: schematic diagram of the eye-tracking system; illustration of the algorithm to track the eye's pupil and corneal reflection spot; eye coordinate system and measurements; horizontal and vertical alignment of the eye with the center of the camera's sensor.

Abstract:

"Much of neurophysiology and vision science relies on careful measurement of a human or animal subject’s gaze direction. Video-based eye trackers have emerged as an especially popular option for gaze tracking, because they are easy to use and are completely non-invasive. However, video eye trackers typically require a calibration procedure in which the subject must look at a series of points at known gaze angles. While it is possible to rely on innate orienting behaviors for calibration in some non-human species, other species, such as rodents, do not reliably saccade to visual targets, making this form of calibration impossible. To overcome this problem, we developed a fully automated infrared video eye-tracking system that is able to quickly and accurately calibrate itself without requiring co-operation from the subject. This technique relies on the optical geometry of the cornea and uses computer-controlled motorized stages to rapidly estimate the geometry of the eye relative to the camera. The accuracy and precision of our system was carefully measured using an artificial eye, and its capability to monitor the gaze of rodents was verified by tracking spontaneous saccades and evoked oculomotor reflexes in head-fixed rats (in both cases, we obtained measurements that are consistent with those found in the literature). Overall, given its fully automated nature and its intrinsic robustness against operator errors, we believe that our eye-tracking system enhances the utility of existing approaches to gaze-tracking in rodents and represents a valid tool for rodent vision studies."


  • Zoccolan DF, Graham BJ, Cox DD (2010) A self-calibrating, camera-based eye tracker for the recording of rodent eye movements. Frontiers in Neuroscience. doi:10.3389/fnins.2010.00193 [link]

Thursday, February 3, 2011

EyeTech Digital Systems

Arizona-based EyeTech Digital Systems offers several interesting eye trackers, where the new V1 caught my attention with its extended track-box of 25 x 18 x 50 cm. The rather large depth range is provided by a custom autofocus mechanism developed in cooperation with the Brigham Young University Dept. of Mechanical Engineering. This makes the device particularly suitable for larger displays such as public displays/digital signage. Still, I'd imagine the calibration procedure remains; ideally you'd want to walk up and interact/collect data automatically, without any wizards or intervention. In any case, a larger track-box is always welcome and certainly opens up new opportunities; EyeTech's V1 offers 20 cm more than most.





Wednesday, February 2, 2011

Spring eye tracker in action

More videos of the Spring eye tracker are available at the company website.

Thursday, January 13, 2011

Taiwanese Utechzone, the Spring gaze interaction system

UTechZone, a Taiwanese company, has launched the Spring gaze interaction system for individuals with ALS or similar conditions. It provides the basic functionality, including text entry, email, web, media etc., in a format much reminiscent of the MyTobii software. The tracker can be mounted in various ways, including on wheelchairs and desks, with the accessories. A nice feature is the built-in TV tuner, which is accessible through the gaze interface. The performance of the actual tracking system and the accuracy of gaze estimation are unknown, specified only as a 7x4 grid. The track-box is specified as 17 cm x 10 cm x 15 cm with a working range of 55-70 cm.

The system runs Windows XP on a computer equipped with an Intel Dual Core CPU, 2 GB RAM and a 500 GB HD, combined with a 17" monitor.
Supported languages are Traditional Chinese, Simplified Chinese, English and Japanese, all languages with pretty big markets. Price unknown, but probably less than a Tobii. Get the product brochure (pdf).
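A 7x4 grid specification effectively means gaze is only resolved to one of 28 on-screen cells. A minimal sketch of such a mapping (the function name and the normalized 0..1 coordinate convention are my own assumptions, not UTechZone's API):

```python
def gaze_to_cell(x_norm, y_norm, cols=7, rows=4):
    """Map normalized gaze coordinates (0..1) to a (column, row) grid cell."""
    col = min(int(x_norm * cols), cols - 1)   # clamp x == 1.0 into last column
    row = min(int(y_norm * rows), rows - 1)
    return col, row

print(gaze_to_cell(0.5, 0.5))  # centre of the screen lands in cell (3, 2)
```

Coarse as it is, this kind of cell-level selection tolerates several degrees of gaze error, which is presumably why the vendor quotes the grid rather than an accuracy figure.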



Wednesday, December 22, 2010

Santa's been spotted - Introducing the SMI Glasses

What a year it has been in the commercial eye tracking domain. In June we had the Tobii Glasses, Tobii's entry into the head-mounted market, which created some buzz online. This was followed in November by a high-speed remote system, the Tobii TX300. Both products competed directly with the offering from SMI, which countered with the RED500 remote tracker, surpassing the Tobii system by 200 samples per second. Today it's my pleasure to introduce the SMI Glasses, which bring the competition up a couple of notches. Comparable in their neat, unobtrusive form factor, they provide binocular tracking with a direct view of both eyes.
Rendered image of the upcoming SMI Glasses.
The small scene camera is located in the center of the glasses, which gives minimal parallax. Although the hard specs have yet to be released, they are rumored to include a high-resolution scene camera, long battery life and an advanced IR AOA marker detection system which enables automatic mapping of gaze data to real-world objects. Furthermore, they can be used not only as a black-box system but can also be integrated with SMI's current head-mounted devices, including live view, an open interface for co-registration, etc. Availability is projected for the first half of 2011.

Thanks for all the hard work, inspiration and feedback throughout 2010, it's been an amazing year. By the looks of it 2011 appears to be a really interesting year for eye tracking. I'd like to wish everyone a Merry Christmas and a Happy New Year.

Monday, November 15, 2010

SMI RED500

Just days after the Tobii TX300 was launched, SMI counters with the introduction of the world's first 500 Hz remote binocular eye tracker. SMI seriously ramps up the competition in high-speed remote systems, surpassing the Tobii TX by a hefty 200 Hz. The RED500 has an operating distance of 60-80 cm, a 40x40 cm track-box at 70 cm, and a reported accuracy of <0.4 degrees under typical (optimal?) conditions; a real-world performance evaluation by an independent third party remains to be seen. Not resting on their laurels, SMI regains the king-of-the-hill position with an impressive achievement that demonstrates how competitive the field has become. See the technical specs for more information.

Tuesday, August 17, 2010

How to build low cost eye tracking glasses for head mounted system (M. Kowalik, 2010)

Michał Kowalik of the Faculty of Computer Science and Information Technology at the West Pomeranian University of Technology in Szczecin, Poland, has put together a great DIY instruction for a head-mounted system using the ITU Gaze Tracker. The camera of choice is the Microsoft LifeCam VX-1000, modified by removing the casing and the IR filter. In addition, three IR LEDs illuminate the eye using power from the USB cable. This is then mounted on a pair of safety glasses, just as Jason Babcock & Jeff Pelz have done previously. The total cost of the hardware is less than €50. Neat. Thanks, Michał.

Download instructions as PDF (8.1Mb)

Tuesday, August 10, 2010

Eye control for PTZ cameras in video surveillance

Bartosz Kunka, a PhD student at the Gdańsk University of Technology, has employed a remote gaze-tracking system called Cyber-Eye to control PTZ cameras in video surveillance and video-conference systems. The movie was prepared for the system's presentation at the Research Challenge at SIGGRAPH 2010 in Los Angeles.

Monday, May 24, 2010

EyePhone - Mobile gaze interaction from Dartmouth College

From Emiliano Miluzzo and the group at Sensorlab, part of the Computer Science department at Dartmouth College, comes EyePhone, which enables rudimentary gaze-based interaction for mobile devices. Contemporary devices often utilize touch-based interaction, which creates an occlusion problem: the hands cover large parts of the display. EyePhone could help alleviate this issue. The prototype system demonstrated offers enough accuracy for an interface based on a 3x3 grid layout, but with better hardware and algorithms there is little reason why this couldn't improve. However, a major issue with a mobile system is precisely its mobility; in practice this means that not only the user's head movements have to be compensated for, but also movements of the camera in essentially all degrees of freedom. Not an easy thing to solve, but it's a question of "when" rather than "if". Perhaps something could be done with the angular position sensors many mobile devices already have embedded. This is an excellent first step with thrilling potential. Additional information is available in the MIT Technology Review article.



Abstract
As smartphones evolve researchers are studying new techniques to ease the human-mobile interaction. We propose EyePhone, a novel "hands free" interfacing system capable of driving mobile applications/functions using only the user's eyes movement and actions (e.g., wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to: i) track the eye and infer its position on the mobile phone display as a user views a particular application; and ii) detect eye blinks that emulate mouse clicks to activate the target application under view. We present a prototype implementation of EyePhone on a Nokia N810, which is capable of tracking the position of the eye on the display, mapping this position to a function that is activated by a wink. At no time does the user have to physically touch the phone display.


Figures: camera images, eye regions of interest and reported accuracies.

  • Emiliano Miluzzo, Tianyu Wang, Andrew T. Campbell, EyePhone: Activating Mobile Phones With Your Eyes. To appear in Proc. of The Second ACM SIGCOMM Workshop on Networking, Systems, and Applications on Mobile Handhelds (MobiHeld'10), New Delhi, India, August 30, 2010. [pdf] [video]

Thursday, May 20, 2010

Magnetic Eye Tracking Device from Arizona State University

A group of students at Arizona State University have revisited the scleral search coil to develop a new low-cost Magnetic Eye Tracking Device (METD). The entrepreneurs aim to make this technology available to the public at an affordable $4,000 and are primarily targeting disabled users. More information is available at ASU News.



If you're new to eye tracking, note that the reporter's claim that common video-based systems use infrared lasers is just silly. These are essentially light sources working in the IR spectrum (similar to the LED in your remote control).

Friday, April 30, 2010

GazePad: Low-cost remote webcam eye tracking

Came across the GazeLib low-cost remote eye tracking project today, which uses ordinary webcams without IR illumination. The accuracy is pretty low, but it's really nice to see another low-cost approach for assistive technology.

"GazeLib is a programming library which makes real-time low-cost gaze tracking possible. The library provides functions performing remote gaze tracking under ambient lighting conditions using a single, low-cost, off-the-shelf webcam. Developers can easily build gaze-tracking applications in only a few lines of code. The GazeLib project focuses on promoting gaze tracking technology to consumer-grade human computer interfaces by reducing the price, emphasizing ease-of-use, increasing the extendibility, and enhancing the flexibility and mobility."