Tuesday, October 23, 2012

European Conference on Eye Movements 2013 announced


The European Conference on Eye Movements 2013 will be held in Lund, Sweden, from August 11th to 16th 2013. ECEM is the largest and oldest eye tracking conference in the world. The conference webpage is now public: http://ecem2013.eye-movements.org/

Next year’s conference will include four panel discussions, nine keynote speakers and a large number of sessions of four to six talks. There will also be pre-conference methods workshops, running from the 7th to the 10th of August, taught by top experts in the field on diverse topics related to eye movements and eye tracking; they are open to all researchers at every level, as well as to members of industry. You can see a list of the topics and teachers here: http://ecem2013.eye-movements.org/workshops.
Important dates include the following:
  • Oct 15, 2012: Submission of proposals and abstracts will open.
  •  Jan 15, 2013: Deadline for proposals for symposia.
  • Feb 25, 2013: Notification of acceptance for symposia.
  • March 1, 2013: Deadline for 2-page extended abstracts for talks and 200-word abstracts for posters.
  • April 1, 2013: Registration opens.
  • April 15, 2013: Notification of acceptance for talks and posters.
  • May 1, 2013: Last day for reduced registration fee.
Organising committee
  • Conference Chairs: Kenneth Holmqvist and Arantxa Villanueva
  • Conference Organiser: Fiona Mulvey
  • Scientific Board: Halszka Jarodzka, Ignace Hooge, Rudolf Groner and Päivi Majaranta
  • Exhibition Chairs: John Paulin Hansen and Richard Andersson
  • Method Workshop Organisers: Marcus Nyström and Dan Witzner Hansen
  • Web Masters: Nils Holmberg and Detlev Droege
  • Proceedings Editors: Roger Johansson and Richard Dewhurst
  • Registration Managers: Kerstin Gidlöf and Linnéa Larsson
  • Student Volunteer Managers: Linnéa Larsson, Richard Dewhurst and Kerstin Gidlöf
  • Social Program Organisers: Richard Andersson, Jana Holsanova and Kerstin Gidlöf
Contact
  • Conference chairs and organiser: management at/på/an ecem2013.eye-movements.org
  • Exhibition: exhibition at/på/an ecem2013.eye-movements.org
  • Method workshops: workshops at/på/an ecem2013.eye-movements.org
  • The web page: webmaster at/på/an ecem2013.eye-movements.org

Tuesday, October 2, 2012

Fujitsu tablet and monitor

Today the first public demonstrations of the Fujitsu/Docomo/Tobii tablet came online, all from the CEATEC 2012 expo in Japan. The prototype tablet, called iBeam, is designed by Fujitsu for Docomo and contains an eye tracking module from Swedish Tobii, namely the IS-20 introduced earlier this year. The form factor appears a bit on the large side, with a bump towards the edge where the eye tracking module is placed; it looks rather like a tablet inside another case. On the software side the tablet runs Android with a gaze marker overlaid on the interface. Selection is performed using simple dwell activation, which is known for being both stressful and error-prone. The sample apps contain the usual suspects: panning of photos and maps, a scrolling browser and an image viewer. Pretty neat for a prototype.
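
For readers unfamiliar with dwell activation: the interface fires a selection once the gaze has stayed within a small region for a set time, which is also why it gets stressful, since looking at anything long enough "clicks" it. A minimal sketch of the mechanism in Python (my own illustration; the names and thresholds are made up, not the Fujitsu/Tobii implementation):

    import math
    import time

    DWELL_TIME = 0.8     # seconds the gaze must stay put before a selection fires
    DWELL_RADIUS = 40    # pixels of tolerance for fixation jitter

    class DwellSelector:
        """Fires a callback when gaze dwells within DWELL_RADIUS for DWELL_TIME."""

        def __init__(self, on_select):
            self.on_select = on_select
            self.anchor = None   # (x, y) where the current dwell started
            self.start = None    # timestamp of the dwell start

        def feed(self, x, y):
            now = time.time()
            if self.anchor is None or math.hypot(x - self.anchor[0],
                                                 y - self.anchor[1]) > DWELL_RADIUS:
                # Gaze moved too far: restart the dwell timer at the new location.
                self.anchor, self.start = (x, y), now
            elif now - self.start >= DWELL_TIME:
                self.on_select(*self.anchor)          # select at the dwell point
                self.anchor, self.start = None, None  # require a fresh dwell

The stress factor is visible right in the constants: every gaze position is a potential click, and the only defenses are the radius and the timeout.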




Fujitsu also demonstrated an LCD monitor with an embedded eye tracking camera system, while the actual gaze estimation algorithms run on an embedded Windows computer. This display does not use the Tobii IS-20 but a system developed by Fujitsu themselves, which is stated to be low-cost. The question is why they didn't use this for the tablet. From what I can tell it does not provide the same level of accuracy; it appears to be a rough up/down, left/right type of gaze estimation, which explains why the demo apps only handle panning of maps and images.

Panasonic in-flight eye control demo

From APEX 2012, here's a video where Steve Sizelove from Panasonic demonstrates their eye and gesture control systems for future in-flight entertainment. Even if this is a futuristic concept, it is clear that Panasonic is pushing the envelope on in-flight systems. Their X-series is state-of-the-art; just take a look at the upcoming eX3, a touch-enabled Android platform with an associated app store, support for the Unity 3D engine, fast internet etc. Great stuff for those transatlantic flights that seem to take forever.

Thursday, August 16, 2012

Tough decisions, big plans and a bright future

Browsed through my blog today. Realized I hadn't written much about what I've been up to. There's been a reason for that. One year ago I left my position at Duke University. It wasn't an easy decision. The Radiology eye tracking project I was involved with (and still am) was making good progress. I had been working long days since it started at Stanford in 2009, and we were doing pretty neat stuff with volumetric medical image datasets.

The Stanford/Duke Radiology eye tracking project and our novel approach to volumetric gaze data.

At the same time I spent nights and weekends working on the open source ITU Gaze Tracker together with Javier San Agustin. Somehow I always had the feeling that we should get back together; great things just seemed to happen when we did. So after my grand tour of the US and countless Skype meetings over six months, we had a plan. The four former PhD students from the ITU GazeGroup were to start an eye tracking company. At first we called it Senseye but later changed it to The Eye Tribe due to trademark issues.

The Eye Tribe as of Spring 2012 at the US embassy reception. 

We decided early on not to go for the established market. It's a red ocean with a couple of fairly big players that have been working on their high-tech creations for years; it's a low-volume/high-margin game with intricate and expensive solutions, primarily for the research and disability markets.

The Eye Tribe intends to innovate and disrupt by bringing eye tracking to post-PC devices in the consumer market. That just doesn't happen with devices that cost several thousand dollars.

After twelve months of executing our plan we recently raised funds from a group of European investors to accelerate (as covered by The Next Web). The team has grown and we are looking to make additional hires in the near future. Perhaps you would like to join the tribe and be part of something great? There are some very interesting things happening in the near future, and for the skilled it's always best to get in early.

One year ago I traded a warm North Carolina for a cold Copenhagen, a relationship for loneliness, a big house for a small apartment and a sports car for a bicycle. Time will tell if that was the right thing to do; with big plans, full commitment and funding in place, it is, so far, so good.

Monday, July 30, 2012

What the mind can conceive, it can achieve.

Today marks a historic day as the ITU GazeGroup.org open source Gaze Tracker has been downloaded over 30,000 times. Although the current version was released in October 2010, we're still seeing approximately 1,000 downloads per month. We're really happy to see how widely distributed it has become, reaching all corners of the planet. When we released the first version back in 2009, we had no idea it would reach distant places such as Kyrgyzstan, Suriname or Burkina Faso. The objective was, and still is, to "democratize and provide access to eye tracking technology regardless of means or nationality". This milestone is an achievement for which I'd like to thank everyone involved.


Top 10 Countries
10. Brazil 720  
9. Denmark 803  
8. France 865  
7. China 888  
6. India 1,063  
5. Italy 1,170  
4. Japan 1,226  
3. United Kingdom 1,359  
2. Germany 2,266  
1. United States 4,647

Top 10 total: 15,007 (50%) Full stats.

Thursday, June 28, 2012

Dual scene-camera head-mounted eye tracking rig from Lancaster Uni.

From the Pervasive 2012 conference held last week in Newcastle comes a demo of a dual scene-camera head-mounted eye tracking rig that enables users to move objects between two displays using the gaze position. The larger display acts as the "public" display (digital signage etc.) while the smaller represents the personal handheld tablet/smartphone. Nifty idea from Jayson Turner, Andreas Bulling and Hans Gellersen, all from the Embedded Interactive Systems group at Lancaster University.

Tuesday, June 19, 2012

The Eye Tribe presents world's first eye-controlled Windows 8 tablet

It slices, it dices! The Eye Tribe from Copenhagen introduces the world's first Windows 8 eye tracking tablet. The small, lightweight add-on connects via USB; no additional cables or batteries needed. For the time being the specs are 30 Hz, an accuracy of 0.5 degrees and an exceptionally large tracking range. More info to follow.

 


The Eye Tribe, formerly known as Senseye, have made significant progress in recent months. In January they won the Danish Venture Cup. They then went on to participate in the Rice RBPC, the world's premier business plan competition, made it to the semi-finals and were awarded "Most Disruptive Technology" while being mentioned in Fortune Magazine and the Houston Chronicle. In May the team won the eHealth Innovation Contest, followed by the audience award at the Danish Accelerace, whereby they were selected for the Tech All Stars event, which gives the most promising European startups the opportunity to pitch at the LeWeb conference in London on June 20th.

Friday, June 8, 2012

Eyecatcher - A 3D prototype combining Eyetracking with a Gestural Camera

Eyecatcher is a prototype combining eye tracking with a gestural camera on a dual-screen setup. Created for the oil rig process industry, this project was a collaborative exploration between ABB Corporate Research and Interactive Institute Umeå (blog).


Sunday, June 3, 2012

Copenhagen Business School: PhD position available

Copenhagen Business School invites applications for a vacant PhD scholarship in empirical modeling of eye movements in reading, writing and translation. The PhD position is offered at the Department of International Business Communication at the Copenhagen Business School (CBS). The Department of International Business Communication is a new department at CBS whose fields of interest include the role of language(s) in interlingual and intercultural communication, the role of language and culture competences in organizations, the role of language and culture in communication technology and social technologies, as well as the teaching of language skills. The Department is dedicated to interdisciplinary and problem-oriented research.

Considerable progress has been made in eye-tracking technology over the past decade, making it possible to capture gaze behavior with free head movements. However, the imprecision of the measured signal makes it difficult to analyze eye movements in reading tasks, where a precise local resolution of the gaze samples is required to track the reader's gaze path over a text. The PhD position will investigate methods to cancel out the noise from the gaze signal (a toy filtering sketch follows the requirements list below). The PhD candidate will investigate, design and implement empirically based models of eye movements in reading which take into account physical properties of the visual system in addition to background information, such as the purpose of the reading activity, the structure of the text, the quality of the gaze signal, etc. The PhD candidate should have:
  • an interest in cognitive modeling of human reading, writing and translation processes
  • a basic understanding of browser and eye-tracking technology
  • knowledge of probability theory and statistical modeling
  • advanced programming skills
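
To make the noise problem concrete, here is a toy illustration (my own, not part of the project description) of the simplest kind of cleanup a first attempt might try: a sliding-window median over the gaze coordinates, which suppresses single-sample outliers without smearing saccade onsets as badly as a plain moving average would:

    from collections import deque
    from statistics import median

    class MedianFilter:
        """Sliding-window median over a gaze stream; x and y filtered independently."""

        def __init__(self, window=5):
            self.xs = deque(maxlen=window)
            self.ys = deque(maxlen=window)

        def feed(self, x, y):
            self.xs.append(x)
            self.ys.append(y)
            return median(self.xs), median(self.ys)

    # A single outlier sample (400, 50) barely disturbs the filtered output:
    f = MedianFilter(window=5)
    for sample in [(100, 200), (101, 199), (400, 50), (102, 201), (103, 200)]:
        print(f.feed(*sample))

The research described above goes well beyond this, of course, by bringing in models of the visual system and of the reading task itself.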
More information available here.

Friday, June 1, 2012

Temporal Control In the EyeHarp Gaze-Controlled Musical Interface


The EyeHarp that I wrote about last summer is a gaze-controlled musical instrument built by Zacharias Vamvakousis. In the video below he demonstrates how the interface is driven by the ITU Gaze Tracker and used to compose a loop which he then improvises upon. On the hardware side a modified PS3 camera is used in combination with two infrared light sources. This setup was presented at the New Interfaces for Musical Expression (NIME 2012) conference in Ann Arbor a week ago, and it will be exhibited at Sónar, Barcelona, on 14-16 June 2012. Great to see such an innovative interface being made open source and combined with the ITU tracker.

  • Vamvakousis, Z. and Ramirez, R. (2012) Temporal Control In the EyeHarp Gaze-Controlled Musical Interface. In Proceedings of the 12th International Conference on New Interfaces for Musical Expression, 21-23 May 2012, Ann Arbor, Michigan, USA. (PDF)

Monday, April 23, 2012

Noise Challenges in Monomodal Gaze Interaction (Skovsgaard, 2011)


Henrik Skovsgaard of the ITU Gaze Group successfully defended his PhD thesis “Noise Challenges in Monomodal Gaze Interaction” at the IT University of Copenhagen on the 13th December 2011. The PhD thesis can be downloaded here.  

ABSTRACT
Modern graphical user interfaces (GUIs) are designed with able-bodied users in mind. Operating these interfaces can be impossible for some users who are unable to control the conventional mouse and keyboard. An eye tracking system offers possibilities for independent use and improved quality of life via dedicated interface tools especially tailored to the users’ needs (e.g., interaction, communication, e-mailing, web browsing and entertainment). Much effort has been put towards robustness, accuracy and precision of modern eye-tracking systems and there are many available on the market. Even though gaze tracking technologies have undergone dramatic improvements over the past years, the systems are still very imprecise. This thesis deals with current challenges of monomodal gaze interaction and aims at improving access to technology and interface control for users who are limited to the eyes only. Low-cost equipment in eye tracking contributes toward improved affordability but potentially at the cost of introducing more noise in the system due to the lower quality of hardware. This implies that methods of dealing with noise and creative approaches towards getting the best out of the data stream are most wanted. The work in this thesis presents three contributions that may advance the use of low-cost monomodal gaze tracking and research in the field:
  • An assessment of a low-cost open-source gaze tracker and two eye tracking systems through an accuracy and precision test and a performance evaluation. 
  • Development and evaluation of a novel innovative 3D typing system with high tolerance to noise that is based on continuous panning and zooming.
  • Development and evaluation of novel selection tools that compensate for noisy input during small-target selections in modern GUIs. 
This thesis may be of particular interest for those working on the use of eye trackers for gaze interaction and how to deal with reduced data quality. The work in this thesis is accompanied by several software applications developed for the research projects that can be freely downloaded from the eyeInteract appstore (http://www.eyeinteract.com).


Monday, March 12, 2012

SMI RED-M

Well, well, look here. A constellation of eye tracking manufacturers is joining in on the affordable market, perhaps defined some time ago by Mirametrix, who launched at $5k. Tobii has the PCEye, perfectly fine but at a cool $7k, and is showcasing the new IS-2 chipset but apparently can't do CEBIT12 demos. The new player is SensoMotoric Instruments, known for their high-quality hardware and finely tuned algorithms. Their new contribution is the RED-M (M is for mini?). Even if the price hasn't been announced, I would assume it's less than its high-speed FireWire sibling, perhaps similar to the PCEye pricing?

The M-version is a small device made of plastic that connects via USB 2.0 (presumably two plugs, one for power). It measures 240x25x33 mm, which is pretty small, and weighs only 130 grams. This is a big difference from their prior models, which have been very solid and made out of high-quality materials and professional components. The accuracy is specified at 0.5 degrees over a 50-75 cm operating distance, with a head box of 320x210 mm at 60 cm and a sample rate of 60/120 Hz. In essence it's the low-end version of the RED series, whose top model is the super-fast RED500. Although it has yet to be demonstrated in an operational state, some material has appeared online. Below is the animated setup guide; you can find more information on their website. Looking good!

Monday, March 5, 2012

RealGaze Glasses

Just came across the RealGaze glasses, which are being developed by Devon Greco et al. His father was diagnosed with ALS some years ago, and given that Devon has been tinkering with electronics since early on, he set out to build an eye tracker. For a prototype the result looks good; I guess the form factor feels familiar. There isn't too much meat available at the moment other than big ambitions to manufacture an affordable device. Most of us would love to see that happen!




Thursday, February 16, 2012

Eyewriter & Not Impossible Foundation

The Eyewriter project, which helped Tony 'TemptOne' Quan to draw again, was originally documented by Mick Ebeling. This material has been incorporated into a documentary called "Getting Up", which recently won the audience award at Slamdance. Movie buff Christopher Campbell wrote a short review on his blog. Great job on raising awareness; hope you guys find funding to further develop the software.



Getting Up: The Tempt One Story Trailer




How to build an EyeWriter

Wednesday, February 15, 2012

Prelude for ETRA2012

The program for the Eye Tracking Research & Applications (ETRA'12) is out and contains several really interesting papers this year.

Two supplementary videos surfaced the other day; they come from the User Interface & Software Engineering group at the Otto-von-Guericke-Universität in Germany. In addition, the authors, Sophie Stellmach and Raimund Dachselt, have a paper accepted for the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI'12). Abstracts and videos below.

Abstract I (ETRA)
Remote pan-and-zoom control for the exploration of large information spaces is of interest for various application areas, such as browsing through medical data in sterile environments or investigating geographic information systems on a distant display. In this context, considering a user's visual attention for pan-and-zoom operations could be of interest. In this paper, we investigate the potential of gaze-supported panning in combination with different zooming modalities: (1) a mouse scroll wheel, (2) tilting a handheld device, and (3) touch gestures on a smartphone. Thereby, it is possible to zoom in at a location a user currently looks at (i.e., gaze-directed pivot zoom). These techniques have been tested with Google Earth by ten participants in a user study. While participants were fastest with the already familiar mouse-only base condition, the user feedback indicates a particularly high potential of the gaze-supported pivot zooming in combination with a scroll wheel or touch gesture.


 
To be presented at ETRA 2012.
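
The gaze-directed pivot zoom described in Abstract I boils down to a small coordinate transform: scale the view while keeping the gazed-at point stationary on the screen. A sketch of that transform in Python (my own illustration, not the authors' code):

    def pivot_zoom(offset_x, offset_y, scale, gaze_x, gaze_y, factor):
        """Zoom the view by `factor` around the screen point (gaze_x, gaze_y).

        The view maps a world point w to the screen as s = w * scale + offset.
        Keeping the world point under the gaze stationary on screen yields
        the new offset below.
        """
        new_scale = scale * factor
        # World point currently under the gaze:
        wx = (gaze_x - offset_x) / scale
        wy = (gaze_y - offset_y) / scale
        # Choose the new offset so the same world point stays at (gaze_x, gaze_y):
        new_offset_x = gaze_x - wx * new_scale
        new_offset_y = gaze_y - wy * new_scale
        return new_offset_x, new_offset_y, new_scale

Feed it the current gaze position on every scroll-wheel tick or pinch step, and the view zooms in wherever the user is looking.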


Abstract II
Since eye gaze may serve as an efficient and natural input for steering in virtual 3D scenes, we investigate the design of eye gaze steering user interfaces (UIs) in this paper. We discuss design considerations and propose design alternatives based on two selected steering approaches differing in input condition (discrete vs. continuous) and velocity selection (constant vs. gradient-based). The proposed UIs have been iteratively advanced based on two user studies with twelve participants each. In particular, the combination of continuous and gradient-based input shows a high potential, because it allows for gradually changing the moving speed and direction depending on a user's point-of-regard. This has the advantage of reducing overshooting problems and dwell-time activations. We also investigate discrete constant input for which virtual buttons are toggled using gaze dwelling. As an alternative, we propose the Sticky Gaze Pointer as a more flexible way of discrete input.


To be presented at ETRA 2012.


Abstract III (CHI)
While eye tracking has a high potential for fast selection tasks, it is often regarded as error-prone and unnatural, especially for gaze-only interaction. To improve on that, we propose gaze-supported interaction as a more natural and effective way combining a user's gaze with touch input from a handheld device. In particular, we contribute a set of novel and practical gaze-supported selection techniques for distant displays. Designed according to the principle "gaze suggests, touch confirms", they include an enhanced gaze-directed cursor, local zoom lenses and more elaborated techniques utilizing manual fine positioning of the cursor via touch. In a comprehensive user study with 24 participants, we investigated the potential of these techniques for different target sizes and distances. All novel techniques outperformed a simple gaze-directed cursor and showed individual advantages. In particular those techniques using touch for fine cursor adjustments (MAGIC touch) and for cycling through a list of possible close-to-gaze targets (MAGIC tab) demonstrated a high overall performance and usability.


 
To be presented at CHI'12.
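
The "gaze suggests, touch confirms" principle from Abstract III is easy to state in code: gaze positions the cursor coarsely, touch refines it and commits the selection. A schematic Python sketch (a hypothetical structure of mine, not the authors' implementation):

    class GazeTouchCursor:
        """Coarse positioning by gaze, fine adjustment and confirmation by touch."""

        def __init__(self):
            self.x = self.y = 0.0
            self.touching = False

        def on_gaze(self, gx, gy):
            # While the finger is up, the cursor follows the (noisy) gaze.
            if not self.touching:
                self.x, self.y = gx, gy

        def on_touch_move(self, dx, dy, gain=0.3):
            # During a touch, small finger movements nudge the cursor precisely;
            # gaze is ignored so the cursor cannot jump away from the target.
            self.touching = True
            self.x += dx * gain
            self.y += dy * gain

        def on_touch_release(self):
            # Lifting the finger confirms the selection at the refined position.
            self.touching = False
            return (self.x, self.y)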

Tuesday, January 10, 2012

EyeTech EyeOn

A video from EyeTech featuring Michael, who suffers from Thoracic Outlet Syndrome (TOS). A great little clip that shows what computer control without gaze-adapted interfaces comes down to. Luckily Michael can use voice recognition software for typing; text input using eye movements alone is a cumbersome process (source).


Monday, November 14, 2011

EyeDrone - Eye gaze controlled navigation



Demonstration of eye-gaze controlled navigation. The goal is to move the point of gaze to the center of the screen; simply put, "where you look is where it goes". Here the popular AR.Drone quadricopter is controlled with this policy using an EyeLink 2000 eye tracker. The drone pivots left and right when the operator looks in those directions. It tilts up and down (making it fly backwards and forwards) when the operator looks up or down. EyeDrone was implemented by Lucas C. Parra in C++ with much help from Michael Quintian. The chin rest here is not really needed, as the basic control algorithm is inherently stable and miscalibrations are of little concern.
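
The control policy amounts to a proportional controller on the gaze offset from the screen center, which is also why it tolerates miscalibration: a residual offset merely produces a slow drift that the operator corrects by looking further. A rough Python sketch of the mapping (mine, not Parra's C++ code):

    def gaze_to_drone_command(gaze_x, gaze_y, screen_w, screen_h, gain=1.0):
        """Map a gaze position to (yaw, pitch) commands in [-1, 1].

        Looking right of center yaws the drone right; looking above center
        pitches it forward. Looking at the center commands hover, which is
        what makes the loop inherently stable.
        """
        yaw = gain * (gaze_x - screen_w / 2) / (screen_w / 2)
        pitch = gain * (screen_h / 2 - gaze_y) / (screen_h / 2)  # screen y grows downward
        clamp = lambda v: max(-1.0, min(1.0, v))
        return clamp(yaw), clamp(pitch)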

Learn more at: http://bme.ccny.cuny.edu/faculty/lparra/eyeNavigate/index.html
Explore the Neural Engineering Group at CCNY - http://neuralengr.com

Wednesday, November 9, 2011

Low Cost Eye Tracking for Commercial Gaming: eyeAsteroids and eyeShoot in the Dark!

The latest update from Stephen Vickers and Howell Istance at the Centre for Computational Intelligence, De Montfort University, who have been doing research and development of gaze-based gaming for several years now. Their latest project has been shortlisted in the consumer category of The Engineer Awards 2011. Congratulations on an awesome job!

"Using your eyes and where you are looking to interact with computer games represents an exciting new direction that game play can take, following the success of whole body interaction enabled by the Kinect and the Wii. The Innovation Fellowship has supported the development and demonstration of a low-cost eye tracker by De Montfort University, in collaboration with Sleepy Dog, the East Midlands games company that produced the Buzz-it controller and games. The low-cost eye tracker utilised the ITU Gazetracking library and was produced as a fully working pre-production prototype. In the project, three different games were produced to demonstrate different ways in which eye gaze can be used to make game play more immersive and exciting.

This video demonstrates two of them.
  • eyeAsteroids: The ship flies towards where you are looking and the space bar is used to fire.
  • eyeShoot in the Dark!: The torch shines where you are looking and the mouse is used to move the cross-hair and fire."

Monday, September 19, 2011

SMI Glasses now available

The SMI Glasses, which we got a sneak peek at before last Christmas and specs for in April, are now available for sale. SMI engineers have managed to squeeze two 30 Hz eye tracking cameras and a high-definition scene camera into a frame weighing only 75 grams. The advantage of tracking both eyes is that it enables parallax compensation, where accuracy is maintained regardless of whether you look near or far in the scene. In addition, binocular tracking most often yields better accuracy, as it provides the opportunity to average both samples or to discard one with low validity. The HD scene camera captures 24 frames per second at 1280x960 pixels, a clear advantage. All in all, better late than never: SMI is back with a highly competitive product in the market.
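
The binocular advantage is simple to exploit in software: average the two eyes when both samples are valid, and fall back on the one usable eye otherwise. A hedged Python sketch (the sample format is my assumption, not SMI's API):

    def combine_binocular(left, right):
        """Combine left/right gaze samples into one estimate.

        Each sample is a tuple (x, y, valid); `valid` is False when the
        tracker flags the sample as unreliable (blink, lost pupil, etc.).
        Returns None if neither eye produced a usable sample.
        """
        lx, ly, lv = left
        rx, ry, rv = right
        if lv and rv:
            return ((lx + rx) / 2, (ly + ry) / 2)  # averaging reduces noise
        if lv:
            return (lx, ly)
        if rv:
            return (rx, ry)
        return None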

Wednesday, August 31, 2011

ETRA 2012 - Call for papers

"The Seventh ACM Symposium on Eye Tracking Research & Applications (ETRA 2012) will be held in Santa Barbara, California on March 28th-30th, 2012. The ETRA conference series focuses on all aspects of eye movement research and applications across a wide range of disciplines.  The symposium presents research that advances the state-of-the-art in these areas, leading to new capabilities in gaze tracking systems, gaze aware applications, gaze based interaction, eye movement data analysis, etc. For ETRA 2012, we invite papers in all areas of eye tracking research and applications."

ETRA 2012 THEME: MOBILE EYE TRACKING
     Mobile devices are becoming more powerful every day. Embedding eye tracking and gaze-based applications in mobile devices raises new challenges and opportunities to many aspects of eye tracking research. ETRA 2012 invites papers tackling the challenges, or exploring new research opportunities of mobile eye tracking.

GENERAL AREAS OF INTEREST
Eye Tracking Technology
    Advances in eye tracking hardware, software and algorithms such as: 2D and 3D eye tracking systems, calibration, low cost eye tracking, natural light eye tracking, predictive models, etc.

Eye Tracking Data Analysis
    Methods, procedures and analysis tools for processing raw gaze data as well as fixations and gaze patterns. Example topics are: scan path analysis, fixation detection algorithms, and visualization techniques of gaze data.

Visual Attention and Eye Movement Control
       Applied and experimental studies investigating visual attention and eye movements to gain insight in eye movement control, cognition and attention, or for design evaluation of visual stimuli. Examples are: usability and web studies using eye tracking, and eye movement behavior in everyday activities such as driving and reading.

Eye Tracking Applications
    Eye tracking as a human-computer input method, either as a replacement to traditional input methods or as a complement. Examples are: assistive technologies, gaze enhanced interaction and interfaces, multimodal interaction, gaze in augmented and mixed reality systems, and gaze-contingent displays.

SUBMISSIONS
Authors are invited to submit original work in the formats of Full paper (8 pages) and Short paper (4 pages). The papers will undergo a rigorous review process assessing the originality and quality of the work as well as the relevance for eye tracking research and applications.  Papers presented at ETRA 2012 will be available in the ACM digital library.  Submission formats and instructions are available at the conference web site.
IMPORTANT DATES
  Oct. 07th, 2011 Full Paper abstracts submission due
  Oct. 14th, 2011 Full Papers submission due 
  Nov. 21st, 2011  Full Papers acceptance notification 
  Dec. 07th, 2011 Short Papers submission due
  Jan. 16th, 2012 Short Papers acceptance notification
  Jan. 23rd, 2012 Camera ready papers due
 
CONFERENCE VENUE
ETRA 2012 will be held at the gorgeous Doubletree Resort in Santa Barbara, California, a 24-acre, mission-style resort hotel facing the Pacific Ocean, located on one of Southern California’s most beautiful coastlines.

SPONSORSHIP
ETRA 2012 is co-sponsored by the ACM Special Interest Group in Computer-Human Interaction (SIGCHI), and the ACM Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH). 

CONFERENCE CHAIRS
 Carlos Hitoshi Morimoto - University of São Paulo, Brazil
 Howell Istance - De Montfort University, UK

PROGRAM CHAIRS
Jeffrey B. Mulligan - NASA, USA
Pernilla Qvarfordt - FX Palo Alto, USA

PROGRAM AREA CHAIRS
       Andrew Duchowski, Clemson University, USA
       Päivi Majaranta, University of Tampere, Finland
       Joe Goldberg, Oracle, USA
       Shu-Chieh Wu, NASA Ames Research Center, USA
       Qiang Ji, Rensselaer Polytechnic Institute, USA
       Jeff Pelz, Rochester Institute of Technology, USA
       Moshe Eizenman, University of Toronto, USA

Tuesday, August 9, 2011

Job opening at Duke University: Software developer with research focus


Under the direction of the Chair of the Department of Radiology at Duke University and an international research group, the Software Engineer is responsible for developing and maintaining a research platform used to study visual perception. This research project utilizes eye trackers to capture gaze paths as the Radiologists search through medical image data sets. Your role involves rapid software development; as such you should be comfortable with swiftly assembling software on short iterations without formal requirements.

Tasks and Activities:

  • Rapid software development to support research activities without formal requirements.
  • Analyze design and architectural issues, and adjust existing system design and procedures to solve problems in a dynamic environment.
  • Solving a wide range of problems ranging from user interface design to more complex architectural design without supervision.
  • Readily accept responsibility and demonstrate ability to work independently.
  • Responsible for designing, developing, implementing, testing and maintaining software.
  • Regularly communicate project progress, issues, and risks to project manager.
  • Organizing data collection with human subjects using eye tracking devices and the developed software platform.
Qualifications:

Required:
  • Strong technical knowledge and experience in the development, implementation and maintenance of an information system.
  • Aptitude to learn and understand change in software development process, procedures and methodologies.
  • Demonstrated experience with scientific methodology, academic writing and basic statistical analysis.
  • 5+ years of software development experience in object-oriented programming technologies.
  • 2+ years of experience with Microsoft .Net C# and Windows Presentation Foundation.
  • Detailed technical knowledge and experience in use of data structures and network programming using TCP/IP and UDP.
Desired:
  • Experience with eye track systems or other forms of video-based tracking systems.
  • Experience from software engineering in a research setting.
  • Experience with the DICOM Medical Image format and visualization.
  • Experience designing or implementing algorithms for data analysis and image processing.
Education
MS+ in Computer Science or equivalent

Please visit this page to apply for the opening.

Wednesday, July 13, 2011

LG introduces the world's first glasses-free 3D monitor with eye-tracking technology

Today LG announced a 20" LCD display with built-in "eye tracking" technology that enables glasses-free 3D imaging, which moves this technology closer to the consumer market. As far as I can tell, the image below does not reveal any infrared illuminators, a requirement for all known high-accuracy systems, so it is probably more of a rough estimation system than a full-blown remote eye tracker. The best known accuracy (in published research) under natural light is about 3-4 degrees of visual angle; with their financial resources they could potentially achieve better results.
Left: the "special" eye tracking camera sensor, which looks like a rather typical webcam CMOS sensor to me. Unless they are doing some magic it will not allow accurate gaze estimation. Regardless, it makes me wonder whether 3D displays are the path by which eye tracking goes mainstream. Is this related to the collaboration between Seeing Machines and SuperD announced earlier this year, or just a competing solution? Details are sparse; I'll keep you posted as more becomes available.


Official press release:


SEOUL, July 13, 2011 – LG Electronics (LG) today unveiled the world’s first glasses-free monitor utilizing eye-tracking technology to maintain an optimal 3D image from a range of viewing angles. The 20-inch D2000 (Korean model: DX2000) monitor was developed as a fully functional entertainment display capable of reproducing games, movies and images in all their realistic glory.

“With a full line-up of 3D TVs, laptops, projectors and smartphones, LG Electronics is by far and away the industry leader in all things 3D,” said Si-hwan Park, Vice President of the Monitor Division at LG’s Home Entertainment Company. “LG’s position has always been that 3D will and must eventually function without glasses. The D2000 is a look at what the future has in store.”

The D2000’s 3D effect comes courtesy of glasses-free parallax barrier 3D technology, and the application of the world’s first eye-tracking feature to the monitor. The combination of parallax barrier and eye-tracking in a single unit promises to open up new horizons for glasses-free 3D products.


Existing glasses-free 3D technologies generally require viewers to stay within a tightly restricted angle and distance to perceive the 3D images. However, the D2000 has done much to resolve this issue, allowing viewer much freer movement and more comfortable viewing. Eye tracking in the D2000 works via a special camera sensor attached to the monitor which detects changes in the user’s eye position in real-time. With this information, the monitor calculates the angle and position of the viewer and adjusts the displayed image for the optimal 3D effect.

In addition to playing back existing 3D content, the D2000 has a highly refined 2D to 3D conversion feature which adds a new dimension to existing movies and game playing.

The D2000, available in Korea this month, will be introduced in other markets around the world in the latter part of 2011.

Friday, July 8, 2011

This video demonstrates the use of the Ergoneers Dikablis Eye-Control Module to interact with a standard LCD TV. The project was carried out in collaboration with the Technical University of Munich and reminds me a lot of an ongoing research project at ITU Copenhagen.


Gliding and Saccadic Gaze Gesture Recognition in Real Time (Rozado, 2011)

David Rozado of the Department of Neural Computation at the Universidad Autonoma de Madrid has developed a neural network approach for detecting gaze gestures in real time. I met David at ITU Copenhagen last summer when he was visiting and discussed this research; I'm happy to see that it came out with such great results. The work was part of David's PhD thesis, which focused on Hierarchical Temporal Memory (HTM) neural networks, a bio-inspired pattern recognition algorithm. Using a low-cost webcam and the ITU Gaze Tracker he is able to recognize ten different gestures with 90% accuracy on raw data. When a fixation detection algorithm and dwell-time triggers are employed, it is possible to achieve 100% detection rates (at the expense of longer activation times).
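
For the curious, the fixation detection stage mentioned above is typically something like a dispersion-threshold (I-DT) algorithm: a fixation is a stretch of samples that stays within a small spatial window for a minimum duration. A compact Python sketch (my own illustration, not Rozado's HTM pipeline):

    def _dispersion(points):
        xs, ys = zip(*points)
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    def detect_fixations(samples, max_dispersion=30, min_duration=6):
        """I-DT style fixation detection.

        samples:        list of (x, y) gaze points at a fixed sample rate
        max_dispersion: max (x-range + y-range) in pixels within a fixation
        min_duration:   minimum number of samples a fixation must span
        Returns a list of (start_index, end_index, centroid) tuples.
        """
        fixations, i = [], 0
        while i + min_duration <= len(samples):
            if _dispersion(samples[i:i + min_duration]) <= max_dispersion:
                # Grow the window for as long as the dispersion stays small.
                j = i + min_duration
                while j < len(samples) and _dispersion(samples[i:j + 1]) <= max_dispersion:
                    j += 1
                xs, ys = zip(*samples[i:j])
                fixations.append((i, j - 1, (sum(xs) / len(xs), sum(ys) / len(ys))))
                i = j
            else:
                i += 1
        return fixations

Requiring a detected fixation (plus a dwell timeout) before accepting a gesture stroke is exactly the kind of gating that trades activation time for the reported 100% detection rate.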



Friday, July 1, 2011

Eyetrax webcam eye tracker from Carnegie Mellon

"Eyetrax is dynamic eye tracking software that uses a simple stationary web camera to detect eye movement. It can be used as a motionless computer interface and is especially useful when working with ALS patients. Additionally, the non-obtrusive nature of the program allows it to work perfectly to discretely generate hotspot maps for marketing purposes". The system is developed by Joseph Fernandez, Skylar Roebuck and Jonathon Smereka and was demonstrated at the Multimedia Computing Demos on May 3rd at Carnegie Mellon.

Utechzone demos

Recently Taiwanese Utechzone demonstrated a little game at Taipei Computex 2011.



Utechzone also demonstrated a driver fatigue detection system, which is housed in a smaller form factor. This system tracks the eyes (open/closed) but doesn't perform gaze estimation. The video also shows the underlying gaze tracking system used in their Spring system, which appears to have some issues with glasses.


Fast forward to 1 minute in

Wednesday, June 29, 2011

Mobile gaze-based screen interaction in 3D environments (D. Mardanbegi, 2011)

Diako Mardanbegi presents a method that enables the user to interact with any planar digital display in a 3D environment using a head-mounted eye tracker. An effective method for identifying the screens in the user's field of view is also presented, which can be applied in a general scenario in which multiple users interact with multiple screens. Diako's PhD project at ITU Copenhagen concerns mobile gaze-based interaction.





Related publication: 
  • Diako Mardanbegi and Dan Witzner Hansen. 2011. Mobile gaze-based screen interaction in 3D environments. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 2, 4 pages. PDF/Online

Monday, June 27, 2011

Setscan EyeLock - Law enforcement training system

Setscan, a Canadian supplier of training equipment for law enforcement and the military, has partnered with Arrington Research to develop a binocular head-mounted system with associated software called EyeLock. The system aims at evaluating and optimizing officers' allocation of visual attention. Looking at the right thing is obviously important, as milliseconds count when guns are drawn. The eye tracking system is the same as those used for natural-scene perception research, but the market adaptation and focus on the needs of a specific domain is interesting.

UCSF using eye tracking to detect early stages of neurodegeneration

The Sabes Lab at the University of California, San Francisco is using high-speed eye tracking systems to study eye movements as a tool for detecting neurodegenerative diseases. The data collected include response time, fixation accuracy and saccade velocity, important parameters that could identify approaching or existing neurodegenerative conditions such as Alzheimer's. This area holds great market potential and is feasible in the near future, as remote systems come closer to meeting the requirements on tracker speed and accuracy.
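
Saccade velocity, for instance, is usually reported as the peak angular speed between consecutive samples, which is where the tracker's speed requirement comes from. A minimal Python sketch, assuming gaze positions have already been converted to degrees of visual angle (my own illustration, not the lab's code):

    import math

    def peak_saccade_velocity(samples, sample_rate_hz):
        """Peak angular velocity in deg/s over a gaze trace.

        samples: list of (x_deg, y_deg) gaze angles at a fixed sample rate.
        """
        dt = 1.0 / sample_rate_hz
        peak = 0.0
        for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
            velocity = math.hypot(x1 - x0, y1 - y0) / dt
            peak = max(peak, velocity)
        return peak

At 60 Hz a 30 ms saccade spans barely two samples, so any peak velocity estimate is crude; this is why work of this kind relies on 500-1000 Hz systems.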

Tuesday, June 21, 2011

The EyeHarp: An Eye Tracking Based Musical Instrument



The main goal of Zacharias Vamvakousis' EyeHarp project is to allow people with paralysis resulting from Amyotrophic Lateral Sclerosis to play music using only their eyes. To build it, Zacharias was inspired by the EyeWriter open source initiative: "...a low-cost eye-tracking apparatus & custom software that allows graffiti writers and artists with paralysis resulting from Amyotrophic lateral sclerosis to draw using only their eyes". Zacharias spent only 50 euros to build his eye tracker, using a modified version of the Sony PS3 Eye camera. The application is implemented in openFrameworks v0.6.


Alternatively, the instrument can be controlled using the mouse pointer (the MouseHarp version). The free software Camera Mouse can then be used to control the instrument with head movements. Any technology that can take control of the mouse pointer can be used to control the instrument; that way the MouseHarp could be an appropriate instrument for many people with physical disabilities. The MouseHarp version is completely independent of the EyeWriter project. Combining the MouseHarp source with the source of the EyeWriter project, we get the EyeHarp: a low-cost gaze-controlled musical instrument! Both versions are free and open source.

The EyeHarp project is part of Zacharias' master's thesis in Sound and Music Computing at UPF, Barcelona. His supervisor is Rafael Ramirez.


A paper on the application has been published:

  • Vamvakousis, Z. and Ramirez, R. (2011) The EyeHarp: An Eye-Tracking-based Musical Instrument. SMC Conference 2011, Padova, Italy. (PDF)

Tuesday, June 7, 2011

Grinbath's EyeGuide

Texas-based Grinbath recently announced the EyeGuide head-mounted tracker. Its main competitive advantage is its low cost: $1,495, with academic discounts available ($1,179). The device captures eye images using a wireless camera running on three AAA batteries and streams them to a computer for processing. The package includes basic software for analysis and visualization. See the whitepaper for more information.

Monday, June 6, 2011

Proceedings from Novel Gaze-Controlled Applications 2011 online

The proceedings from the Novel Gaze-Controlled Applications 2011 conference are now available online. The conference, which took place at the Blekinge Institute of Technology in Sweden during May 26-27, presented 11 talks covering a wide range of topics, from gaming and gaze interaction to eye tracking solutions. Unfortunately I was unable to attend, but luckily I'll have a couple of days of interesting reading ahead. Kudos to Veronica Sundstedt and Charlotte Sennersten for organizing the event.
  • Gaze and voice controlled drawing
    Jan van der Kamp, Veronica Sundstedt
    Full text: PDF Online

  • Eye tracking within the packaging design workflow: interaction with physical and virtual shelves
    Chip Tonkin, Andrew D. Ouzts, Andrew T. Duchowski
    Full text: PDF Online

Monday, May 9, 2011

"Read my Eyes" - A presentation of the ITU Gaze Tracker

During the last month the guys at IT University of Copenhagen has been involved in the making of a video that's intended to introduce the ITU Gaze Tracker, an open source eye tracker, to a wider audience. The production has been carried out in collaboration with the Communication Department at the university and  features members of the group, students of the HCI class and Birger Bergmann Jeppesen who has had ALS since 1996. Many thanks to all involved, especially Birger & co for taking interest and participating in evaluation of the system.