tag:blogger.com,1999:blog-56511105114787058542024-02-01T21:09:20.473-08:00Martin Tall On Gaze Interactiona blog on research and developments in eye tracking and gaze interactionMartin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.comBlogger250125tag:blogger.com,1999:blog-5651110511478705854.post-75175767762886727972013-09-05T15:53:00.002-07:002023-01-08T16:28:47.449-08:00Introducing The Eye Tribe TrackerIt's with great pride that I introduce the Eye Tribe Tracker today. It's the world's smallest remote tracker, the first to use USB 3.0, and the only one priced below $100. It's not targeting the research community; instead it breaks new ground by aiming at developers of next-generation gaze interaction applications. I will let the academic crowd determine whether it meets their requirements; I'm too biased to claim that it's better than this or that. The only way to properly evaluate eye trackers is through standardized evaluation carried out by independent parties.<br />
<br />
<center>
<iframe width="560" height="315" src="https://www.youtube.com/embed/GNkuFwQqWMA" title="YouTube video player" frameborder="0" allowfullscreen></iframe>
</center>
<br />
On a personal level, today marks an important milestone. I built my first gaze interaction software back in 2008, titled <a href="http://gazeinteraction.blogspot.dk/2008/06/video-demonstration-online.html">Neovisus</a>, as the outcome of my MSc at Lund University. During this work I realized that gaze interaction could be a natural interaction element, not just for a specific user group but for everyone. At the time, eye trackers were unfortunately really hard to come by; the one I used cost $25,000 (and still does). Javier San Agustin and I attempted to fix this during our R&D of the ITU GazeTracker, an open-source eye tracker software. In many ways we succeeded, but it lacked critical features: you had to order components to assemble your own rig, it was difficult to set up, and tracking was far from robust compared to commercial alternatives.<br />
<br />
Overall, the ITU GazeTracker was a great learning experience; it evolved to become the most widely distributed open-source eye tracking software and gathered an active community. At the same time, we learned what it would take to build something great: it would require us to focus and make a full-time commitment.<br />
<br />
Here we are, two years later. With the launch of a truly affordable eye tracker we have taken a big step towards realizing the vision we are burning for. No longer is there a prohibitive barrier preventing developers from exploring the many benefits eye tracking can bring to their applications.<br />
<br />
Best of all, this is still the beginning. I can't wait to get this into the hands of all the developers who placed a $99 bet on the future.<br />
<br />
Tech specs (preliminary)<br />
<br />
<table style="background-color: #f1f1f1; border-collapse: collapse; border-spacing: 0px; box-sizing: border-box; color: #606060; font-family: 'Lucida Sans Unicode', 'Lucida Grande', sans-serif; font-size: 13px; line-height: 12px; margin: 0px auto 40px; width: 576px;"><tbody style="box-sizing: border-box;">
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Sampling rate</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">40Hz and 60Hz mode</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Accuracy</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">0.5° (average)</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Spatial Resolution</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">0.1° (RMS)</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Latency</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">&lt;20ms at 60Hz</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Calibration</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">5, 9, 12 points</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Operating range</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">45cm – 75cm</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Tracking area</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">40cm x 40cm at 65cm distance</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Screen sizes</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Up to 24”</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">API/SDK</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">C++, C# and Java included</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Data output</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Binocular gaze data</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Dimensions (W/H/D)</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">20 x 1.9 x 1.6 cm (7.9 x 0.75 x 0.66 inches)</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Weight</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">130g</span></td></tr>
<tr style="box-sizing: border-box;"><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">Connection</span></td><td style="border: 1px solid rgb(17, 17, 17); box-sizing: border-box; color: #333333; direction: ltr; line-height: 18px; margin: 0px; padding: 9px 10px; vertical-align: top;"><span style="font-size: x-small;">USB3.0 Superspeed</span></td></tr>
</tbody></table>
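Out of curiosity, here is a quick back-of-the-envelope conversion (my own arithmetic, not part of the spec sheet) of what the 0.5° average accuracy means on screen, assuming a 65 cm viewing distance and a 24" 1920x1080 display:

```python
import math

def angular_error_to_pixels(error_deg, distance_mm, screen_diag_in, res_w, res_h):
    """Convert an angular gaze error into millimetres and pixels on screen.

    Assumes the gaze ray is roughly perpendicular to the display, so the
    on-screen error is distance * tan(error angle), and a 16:9 panel.
    """
    error_mm = distance_mm * math.tan(math.radians(error_deg))
    # Physical width of a 16:9 panel, derived from its diagonal.
    diag_mm = screen_diag_in * 25.4
    width_mm = diag_mm * 16 / math.hypot(16, 9)
    mm_per_px = width_mm / res_w
    return error_mm, error_mm / mm_per_px

# 0.5 deg accuracy at 65 cm on a 24" 1920x1080 monitor.
err_mm, err_px = angular_error_to_pixels(0.5, 650, 24, 1920, 1080)
print(f"{err_mm:.1f} mm ~ {err_px:.0f} px")  # → 5.7 mm ~ 20 px
```

So at the far end of the supported screen sizes the average error is roughly 5.7 mm, or about 20 pixels: a practical lower bound on how small gaze-activated UI targets can be.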
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj6O1HDI0UTGadwzdzKQg02LZr9lsUu7ioGquLsKMGEc2N0lP8Ps2N2lvgn1yXMArc4K-Y7dymSYSiLtq9_MLCwFysyHk5qVzODleBDGpqdgtlsuiZ2mU5ceDrW7OrKg2-xj1VdYnbv7AM/s1600/tracker_tripod2_small.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="96" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj6O1HDI0UTGadwzdzKQg02LZr9lsUu7ioGquLsKMGEc2N0lP8Ps2N2lvgn1yXMArc4K-Y7dymSYSiLtq9_MLCwFysyHk5qVzODleBDGpqdgtlsuiZ2mU5ceDrW7OrKg2-xj1VdYnbv7AM/s400/tracker_tripod2_small.png" width="400" /></a></div>
<br />
<br />Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com6tag:blogger.com,1999:blog-5651110511478705854.post-69454843104917580602013-07-30T09:15:00.001-07:002018-05-09T15:28:34.534-07:00Duke University tracking peacocks<br />
<center>
<iframe allowfullscreen="" frameborder="0" height="480" src="//www.youtube.com/embed/rsJFzdimVfo" width="640"></iframe>
</center>
<br />
<ul>
<li><span style="background-color: white; color: #5b5b5b; font-family: "arial" , "helvetica" , "nimbus sans l" , sans-serif; font-size: 14px; line-height: 19.59375px;">"Through their eyes: selective attention in peahens during courtship," Jessica Yorzinski, Gail Patricelli, Jason Babcock, John Pearson, Michael Platt. Journal of Experimental Biology, July 24, 2013.</span></li>
</ul>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEix2Kk5_WFCg4mg6-yS9_Ms-zqBq-Ntz5UyzO3F8sAwWvIxW7guavZEN-GOLqjxwgErtXMlVIM2gxxzm6nTiZ2wUguQx6uTbGSsdGb_N8E6S7nKMP4cOsedTtXYkfHd4bI-_dNk3KaBAaE/s1600/Robo_Hen_72.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="286" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEix2Kk5_WFCg4mg6-yS9_Ms-zqBq-Ntz5UyzO3F8sAwWvIxW7guavZEN-GOLqjxwgErtXMlVIM2gxxzm6nTiZ2wUguQx6uTbGSsdGb_N8E6S7nKMP4cOsedTtXYkfHd4bI-_dNk3KaBAaE/s400/Robo_Hen_72.jpg" width="400" /></a></div>
Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-81245701280390875232013-03-18T16:00:00.001-07:002013-03-18T16:00:31.014-07:00ITU 2xPhD positions: Eye Tracking for mobile devices<br />
<div style="color: #555555; font-size: 13px; line-height: 19.1875px; margin-bottom: 1.5em;">
<span style="font-family: inherit;">From the IT University of Copenhagen (<a href="http://www.gazegroup.org/">GazeGroup</a> and <a href="http://www.itu.dk/research/eye/">EyeInfo</a>) comes an offer of two fully funded PhD positions on the topic of eye tracking for mobile devices. This is an excellent opportunity to work alongside domain experts on the development of the next generation of eye tracking systems.</span></div>
<div style="color: #555555; font-size: 13px; line-height: 19.1875px; margin-bottom: 1.5em;">
<span style="font-family: inherit;">The aims of the project are to develop eye tracking algorithms and hardware for mobile devices, to apply eye movement signals to games, toys, device interaction and augmented reality, and to combine these signals with existing sensors in mobile devices, such as GPS, gyroscopes, accelerometers, and/or brain activity measured by EEG (electroencephalography).</span></div>
<div style="color: #555555; font-size: 13px; line-height: 19.1875px; margin-bottom: 1.5em;">
<span style="font-family: inherit;">We are seeking <b>two</b> excellent PhD students to do research within one of two areas:</span></div>
<div style="color: #555555; font-size: 13px; line-height: 19.1875px; margin-bottom: 1.5em;">
<span style="font-family: inherit;">1) Development and test of new robust eye tracking and gaze estimation algorithms and optimization of eye tracking algorithms for low power consumption.</span></div>
<div style="color: #555555; font-size: 13px; line-height: 19.1875px; margin-bottom: 1.5em;">
<span style="font-family: inherit;">2) Exploration of novel ways of applying gaze interaction on smartphones, tablets and smartglasses.</span></div>
<div style="color: #555555; font-size: 13px; line-height: 19.1875px; margin-bottom: 1.5em;">
<span style="font-family: inherit;">The project requires a willingness to cooperate closely with the industrial partners involved in this project, i.e. The Eye Tribe, LEGO System A/S and Serious Games Interactive.</span></div>
<div style="color: #555555; font-size: 13px; line-height: 19.1875px; margin-bottom: 1.5em;">
<span style="font-family: inherit;">The ideal candidate will have a strong background in both computer science (especially computer vision) and interaction design (with an experimental approach), or excellent competence in at least one of them.</span></div>
<div style="color: #555555; font-size: 13px; line-height: 19.1875px; margin-bottom: 1.5em;">
<span style="font-family: inherit;">Contact person: <a href="mailto:paulin@itu.dk" shape="rect" style="color: #8d408e; word-wrap: break-word;">John Paulin Hansen</a></span></div>
<div style="color: #555555; font-size: 13px; line-height: 19.1875px; margin-bottom: 1.5em;">
<span style="font-family: inherit;">Research Group: <a href="http://pitlab.itu.dk/">PitLab</a></span></div>
Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-23236223644414198362013-01-24T09:08:00.000-08:002013-01-24T09:09:59.953-08:00Jason Becker: Not dead yet
<center>
<iframe src="http://player.vimeo.com/video/39405460?title=0&byline=0&portrait=0" width="640" height="380" frameborder="0" webkitAllowFullScreen mozallowfullscreen allowFullScreen></iframe>
</center>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-32358238876673078692013-01-15T22:31:00.001-08:002013-01-15T22:31:35.993-08:00Eyetribe @ CES2013<center>
<iframe width="560" height="315" src="http://www.youtube.com/embed/SyEqMCwJWkw" frameborder="0" allowfullscreen></iframe>
<br /><br />
<iframe width="560" height="315" src="http://www.youtube.com/embed/mDhrtHUt8uc" frameborder="0" allowfullscreen></iframe>
</center>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com3tag:blogger.com,1999:blog-5651110511478705854.post-54589603654820161702013-01-02T16:46:00.002-08:002013-01-02T16:47:10.714-08:00ALS patient uses his eyes to DJ New Year's Eve<object id="flashObj" width="580" height="370" classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" codebase="http://download.macromedia.com/pub/shockwave/cabs/flash/swflash.cab#version=9,0,47,0"><param name="movie" value="http://c.brightcove.com/services/viewer/federated_f9?isVid=1&isUI=1" /><param name="bgcolor" value="#FFFFFF" /><param name="flashVars" value="videoId=2067558807001&playerID=1684512068001&playerKey=AQ~~,AAAACC6OgzE~,L0bTvfk9n15FmF18purmAD2hu9UP9YRL&domain=embed&dynamicStreaming=true" /><param name="base" value="http://admin.brightcove.com" /><param name="seamlesstabbing" value="false" /><param name="allowFullScreen" value="true" /><param name="swLiveConnect" value="true" /><param name="allowScriptAccess" value="always" /><embed src="http://c.brightcove.com/services/viewer/federated_f9?isVid=1&isUI=1" bgcolor="#FFFFFF" flashVars="videoId=2067558807001&playerID=1684512068001&playerKey=AQ~~,AAAACC6OgzE~,L0bTvfk9n15FmF18purmAD2hu9UP9YRL&domain=embed&dynamicStreaming=true" base="http://admin.brightcove.com" name="flashObj" width="580" height="370" seamlesstabbing="false" type="application/x-shockwave-flash" allowFullScreen="true" allowScriptAccess="always" swLiveConnect="true" pluginspage="http://www.macromedia.com/shockwave/download/index.cgi?P1_Prod_Version=ShockwaveFlash"></embed></object>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-44800302449238159922012-11-05T13:42:00.000-08:002018-05-09T15:27:54.292-07:00Lund University HumLab eye tracking equipped classroom/lab.<a 
href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj71bjuY56HTgaikwK4tZzraD_xIy3hVSXYhNp2iWnarzCS_bXHnw7GNLKsLjwWZpiN2iJQ6YgGps66iSSFbReMvJ8AmadvozZrXAtLsAmkxHoCfSgntnOKixatCDLM_4HGC-roRqPhHXQ/s1600/SMI_RED_LUND.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj71bjuY56HTgaikwK4tZzraD_xIy3hVSXYhNp2iWnarzCS_bXHnw7GNLKsLjwWZpiN2iJQ6YgGps66iSSFbReMvJ8AmadvozZrXAtLsAmkxHoCfSgntnOKixatCDLM_4HGC-roRqPhHXQ/s320/SMI_RED_LUND.jpg" width="211" /></a>What's better than an eye tracker in a lab? A room full of them! On Thursday the new eye tracking lab at Lund University <a href="http://www.humlab.lu.se/">HumLab</a> was opened. It's housed in the basement of the Center of Language and Linguistics, close to the existing eye tracking lab where I did my Master's thesis on gaze interaction in 2008. The new lab is termed "the digital classroom" and features 25 computers equipped with eye trackers, for large studies on electronic media and education. Over the last ten years the HumLab group has pursued research on educational processes: how students read educational material and how their reading style evolves during university studies. The digital classroom contains eye trackers from German manufacturer <a href="http://www.smivision.com/">SMI</a> (<a href="http://www.smivision.com/en/gaze-and-eye-tracking-systems/products/redm.html">RED-M</a>) and is co-financed by the <a href="http://www.wallenberg.com/kaw/">Wallenberg foundation</a> and Lund University, for a total investment of 2.2 million SEK (US$328k). In January a new project starts that aims at improving learning in elementary and high school. Big congrats to <a href="http://www.humlab.lu.se/people/personnel/kennethholmqvist">Kenneth Holmqvist</a> and the team. Very exciting to see the output of this!<br />
<br />
<center>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9_gQS2jtwYS8G4GMxSMiPevU6tm1Kz5EtH1_Hfbed-HttCTNuI3I-F_Hefy6I942FmdQHswPepPKjFBEvojsjHDoeUR6jBYgQTyNVbDv2MG0TgPHy2qcWjfrKJs1TLWiYnkG02_ZRoM8/s1600/lund_25_tracker_classroom.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9_gQS2jtwYS8G4GMxSMiPevU6tm1Kz5EtH1_Hfbed-HttCTNuI3I-F_Hefy6I942FmdQHswPepPKjFBEvojsjHDoeUR6jBYgQTyNVbDv2MG0TgPHy2qcWjfrKJs1TLWiYnkG02_ZRoM8/s640/lund_25_tracker_classroom.jpg" width="640" /></a>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhHcDKUSV9aJAvPnX-veeCi_Kmj6FsJFSgzZsBF42CUg6GrGZ93QR4RD0_0TqJKE9AsUtRfrUterPQn3UIhjnvYgxYUPkmWiV1GyCurxdRS-ov-xgAxctIcqM2ejboDZ4lQ-e7zmwRDzaA/s1600/lund_new_lab.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhHcDKUSV9aJAvPnX-veeCi_Kmj6FsJFSgzZsBF42CUg6GrGZ93QR4RD0_0TqJKE9AsUtRfrUterPQn3UIhjnvYgxYUPkmWiV1GyCurxdRS-ov-xgAxctIcqM2ejboDZ4lQ-e7zmwRDzaA/s640/lund_new_lab.jpg" width="640" /></a></div>
<div style="text-align: center;">
<i>Kenneth Holmqvist and <a href="http://www.humlab.lu.se/people/personnel/janaholsanova">Jana Holsanova</a> (on the right)</i></div>
</center>
<br />
By the way, I'm in the process of reading a book by the same group titled "<a href="http://ukcatalogue.oup.com/product/9780199697083.do#.UJgv-cXA_HQ">Eye Tracking: A comprehensive guide to methods and measures</a>". <i>It is, by far, the most accurate, comprehensive and well-written publication on eye tracking and associated research to date.</i> A <u>must-read</u> for any serious researcher and/or developer.Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-88771522839397817772012-10-23T13:26:00.004-07:002012-10-23T13:27:57.057-07:00Gaze-controlled droneFrom <a href="http://alexandre.alapetite.fr/">Alexandre Alapetite</a> and <a href="http://www.itu.dk/research/inc/?page_id=3#john">John Paulin Hansen</a>, with whom I previously built an <a href="http://gazeinteraction.blogspot.dk/2009/05/gaze-controlled-driving.html">eye-controlled robot</a>, comes a demo that shows gaze control of a drone. The user's gaze is determined by an eye tracking apparatus (<a href="http://www.alea-technologies.de/pages/en/home.php">Alea technology</a>) situated below the display, and the drone flies in the direction the user is looking. Here the operator is located near the drone, but he could be situated anywhere, even hundreds of miles away. Gaze Controlled Flying was presented as an interactive demo at the <a href="http://www.nordichi2012.org/">NordiChi 2012</a> conference, October 16, at the <a href="http://www.itu.dk/">IT University of Copenhagen</a>, Denmark. Cool guys!<br />
<br />
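The post doesn't detail the control scheme, but the core idea, flying the drone in whatever direction you look, can be sketched as a dead-zone mapping from the on-screen gaze point to a velocity command. This is purely my own illustration; the function name, axes and threshold are hypothetical, not taken from the actual demo:

```python
def gaze_to_command(gx, gy, width, height, dead_zone=0.15):
    """Map a gaze point (pixels) to a normalized (yaw, climb) command.

    Coordinates are recentred so (0, 0) is the screen centre; offsets
    smaller than dead_zone (as a fraction of the half-extent) are
    ignored, so fixating the centre of the screen hovers the drone.
    """
    # Normalized offset from centre, in [-1, 1] on each axis.
    nx = (gx - width / 2) / (width / 2)
    ny = (height / 2 - gy) / (height / 2)  # screen y grows downward
    yaw = 0.0 if abs(nx) < dead_zone else nx
    climb = 0.0 if abs(ny) < dead_zone else ny
    return yaw, climb

# Looking at the right edge of a 1280x800 display: full yaw right, no climb.
print(gaze_to_command(1280, 400, 1280, 800))  # → (1.0, 0.0)
```

A dead zone of some kind is essential in any gaze-steering scheme: without it, the natural jitter of fixations would keep the vehicle constantly drifting.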
<iframe allowfullscreen="allowfullscreen" frameborder="0" height="360" src="http://www.youtube.com/embed/87IoWDCf9vk" width="640"></iframe>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-56340189920835950032012-10-23T13:04:00.003-07:002012-10-23T13:04:48.431-07:00European Conference on Eye Movements 2013 announced<br />
<div style="background-color: white; color: #444444; font-size: 15px; line-height: 18px;">
The European Conference on Eye Movements 2013 will be held in Lund, Sweden, from August 11th to 16th 2013. ECEM is the largest and oldest eye tracking conference in the world. The conference webpage is now public: <a href="http://ecem2013.eye-movements.org/" style="color: #cc0000; outline: none; text-decoration: none;">http://ecem2013.eye-movements.org/</a></div>
<div style="background-color: white; color: #444444; font-size: 15px; line-height: 18px;">
<img alt="" class="alignright size-medium wp-image-891" height="69" src="http://blog.humlab.lu.se/wp-content/uploads/2012/09/ecem_logo_blue_200px-text_1-300x69.png" style="border: 0px; float: right; margin: 5px 0px 5px 15px;" title="ecem_logo_blue_200px-text_1" width="300" /></div>
<div style="background-color: white; color: #444444; font-size: 15px; line-height: 18px;">
<br /></div>
<div style="background-color: white; color: #444444; font-size: 15px; line-height: 18px;">
<div>
Next year’s conference will include four panel discussions, nine keynote speakers and a large number of sessions of four to six talks. There will also be pre-conference methods workshops, taught by top experts in the field on diverse topics related to eye movements and eye tracking, open to all researchers at every level and to members of industry, running from August 7th to 10th. You can see a list of these topics and the teachers here: <a href="http://ecem2013.eye-movements.org/workshops" style="color: #cc0000; outline: none; text-decoration: none;">http://ecem2013.eye-movements.org/workshops</a>.</div>
<div>
<strong>Important dates</strong> include the following:</div>
<div>
<ul>
<li>Oct 15, 2012: Submission of proposals and abstracts will open.</li>
<li> Jan 15, 2013: Deadline for proposals for symposia.</li>
<li>Feb 25, 2013: Notification on acceptance for symposia.</li>
<li>March 1, 2013: Deadline for 2-page extended abstract for talks and 200 word abstracts for posters.</li>
<li>April 1, 2013: Registration opens.</li>
<li>April 15, 2013: Notification on acceptance for talks and posters.</li>
<li>May 1, 2013: Last day for reduced registration fee.</li>
</ul>
</div>
<div>
<div>
<strong>Organising committee</strong></div>
<div>
<div>
<ul>
<li>Conference Chairs: <strong>Kenneth Holmqvist</strong> and <strong>Arantxa Villanueva</strong></li>
<li>Conference Organiser: <strong>Fiona Mulvey</strong></li>
<li>Scientific Board: <strong>Halzska Jarodzka</strong>, <strong>Ignace Hooge</strong>, <strong>Rudolf Groner</strong> and <strong>Päivi Majaranta</strong></li>
<li>Exhibition Chairs: <strong>John Paulin Hansen</strong> and <strong>Richard Andersson</strong></li>
<li>Method Workshop Organisers: <strong>Marcus Nyström</strong> and <strong>Dan Witzner Hansen</strong></li>
<li>Web Masters: <strong>Nils Holmberg</strong> and <strong>Detlev Droege</strong></li>
<li>Proceedings Editors: <strong>Roger Johansson</strong> and <strong>Richard Dewhurst</strong></li>
<li>Registration Managers: <strong>Kerstin Gidlöf</strong> and <strong>Linnéa Larsson</strong></li>
<li>Student Volunteer Managers: <strong>Linnéa Larsson</strong>, <strong>Richard Dewhurst</strong> and <strong>Kerstin Gidlöf</strong></li>
<li>Social Program Organisers: <strong>Richard Andersson</strong>, <strong>Jana Holsanova</strong> and <strong>Kerstin Gidlöf</strong></li>
</ul>
</div>
</div>
<div>
<div>
<strong>Contact</strong></div>
<div>
<ul>
<li>Conference chairs and organiser: <strong>management</strong> at/på/an <strong>ecem2013.eye-movements.org</strong></li>
<li>Exhibition: <strong>exhibition</strong> at/på/an <strong>ecem2013.eye-movements.org</strong></li>
<li>Method workshops: <strong>workshops</strong> at/på/an <strong>ecem2013.eye-movements.org</strong></li>
<li>The web page: <strong>webmaster</strong> at/på/an <strong>ecem2013.eye-movements.org</strong></li>
</ul>
</div>
</div>
</div>
</div>
Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-20477362434017809842012-10-02T13:57:00.003-07:002012-10-02T14:03:00.416-07:00Fujitsu tablet and monitorToday the first public demonstrations of the Fujitsu/Docomo/Tobii tablet came online, all from the <a href="http://www.ceatec.com/2012/en/index.html">CEATEC</a> 2012 expo in Japan. The prototype tablet, called iBeam, is designed by Fujitsu for Docomo and contains an eye tracking module from Swedish Tobii, namely the <a href="http://www.tobii.com/en/eye-tracking-integration/global/products-services/hardware/tobii-is20-eye-tracker/">IS-20</a>, which was introduced <a href="http://gazeinteraction.blogspot.dk/2012/03/tobii-is2-appearing-at-cebit12.html">earlier this year</a>. The form factor appears a bit on the large side, with a bump towards the edge where the eye tracking module is placed; it sort of looks like a tablet inside another case. On the software side the tablet runs Android, with a gaze marker overlaid on the interface. Selection is performed using simple dwell activation, which is known for being both stressful and error-prone. The sample apps contain the usual suspects: panning of photos and maps, a scrolling browser and an image viewer. Pretty neat for a prototype.<br />
<br />
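For readers who haven't met dwell activation: a target is "clicked" once the gaze has rested on it for some threshold duration, which is exactly why it is stressful, as every sufficiently long glance risks becoming a click (the Midas touch problem). A minimal sketch of the mechanism, with names and the 500 ms threshold chosen by me for illustration, not taken from the iBeam software:

```python
class DwellSelector:
    """Fire a selection when the gaze rests on one target long enough."""

    def __init__(self, dwell_ms=500):
        self.dwell_ms = dwell_ms
        self.current = None   # target the gaze is currently resting on
        self.elapsed = 0.0    # time accumulated on that target, in ms

    def update(self, target, dt_ms):
        """Feed the target under the gaze each frame; returns the selected
        target when the dwell threshold is crossed, otherwise None."""
        if target != self.current:
            # Gaze moved to a new target (or off all targets): restart timing.
            self.current, self.elapsed = target, 0.0
            return None
        self.elapsed += dt_ms
        if self.elapsed >= self.dwell_ms:
            self.elapsed = 0.0  # re-arm so a later dwell can fire again
            return target
        return None

# At 50 Hz (20 ms/frame) a 500 ms dwell fires on the 26th frame.
sel = DwellSelector(dwell_ms=500)
fired = [sel.update("button", 20.0) for _ in range(26)]
print([f for f in fired if f])  # → ['button']
```

Real systems layer tricks on top of this, such as shrinking feedback animations and per-target cooldowns, precisely to soften the Midas touch problem the post alludes to.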
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5LTwELUZ7EkTr7LG_n_xljsbjqCYyVzfHTvYEaLvO2r9VRh200xIHoD23diVY3sVB07OH2Epo71kSz5nExzFVqc7eyX23jxqYh0xDwMqGmKVKLfmadgIQDbVTX0wAGRNkPMEyxXO6r8A/s1600/docomo_tablet_front.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5LTwELUZ7EkTr7LG_n_xljsbjqCYyVzfHTvYEaLvO2r9VRh200xIHoD23diVY3sVB07OH2Epo71kSz5nExzFVqc7eyX23jxqYh0xDwMqGmKVKLfmadgIQDbVTX0wAGRNkPMEyxXO6r8A/s200/docomo_tablet_front.png" width="180" /></a><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggooGPKUl5HFHSQg1Y1bgpB7VDVLB3Atot4FWItFXChm-aEHLIR39AdgQdIMwN9OotjH7xsNKi011QmMWNGyqQCrYIT7pfXuFW2fbg4xwz3yReugj1lLjLB_xnKv3Iiyb0bpHFoCWeIIk/s1600/docomo_tablet_sideview.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="186" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggooGPKUl5HFHSQg1Y1bgpB7VDVLB3Atot4FWItFXChm-aEHLIR39AdgQdIMwN9OotjH7xsNKi011QmMWNGyqQCrYIT7pfXuFW2fbg4xwz3yReugj1lLjLB_xnKv3Iiyb0bpHFoCWeIIk/s200/docomo_tablet_sideview.png" width="200" /></a></div>
<br />
<center>
<script src="http://player.ooyala.com/player.js?width=560&height=315&embedCode=A0cWsyNjpZfyH3ithbYYsDaSlJNUKl9i&deepLinkEmbedCode=A0cWsyNjpZfyH3ithbYYsDaSlJNUKl9i&video_pcode=ppbnY65tdYh_HxFfIkVstq2Iq_oQ"></script>
</center>
<br />
<br />
Fujitsu also demonstrated an LCD monitor with an embedded eye tracking camera system, while the actual gaze estimation algorithms run on an embedded Windows computer. This display does not use the Tobii IS-20 but a system developed by Fujitsu themselves, which is stated to be low-cost. The question is why they didn't use this for the tablet. From what I can tell it does not provide the same level of accuracy; it appears to be a rough up/down, left/right type of gaze estimation, which explains why the demo apps only handle panning of maps and images.<br />
<br />
<center>
<iframe frameborder="0" height="327" id="viddler-f702cdac" mozallowfullscreen="true" src="//www.viddler.com/embed/f702cdac/?f=1&autoplay=0&player=simple&secret=39578043&loop=false&nologo=false&hd=false" webkitallowfullscreen="true" width="545"></iframe>
</center>
Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-89284854696565234822012-10-02T12:57:00.001-07:002012-10-02T12:57:53.753-07:00Panasonic in-flight eye control demoFrom the <a href="http://apex.aero/">APEX 2012</a> expo, here's a video where <a href="http://www.linkedin.com/pub/steven-sizelove/18/bb8/8a3">Steve Sizelove</a> from Panasonic demonstrates their eye and gesture control systems for future in-flight entertainment. Even if this is a futuristic concept, it is clear that Panasonic is pushing the envelope on in-flight systems. Their <a href="http://panasonic.aero/Products/XSeries.aspx">X-series</a> system is state-of-the-art; just take a look at the upcoming <a href="http://www.thefutureofifec.com/">eX3</a>, a touch-enabled Android platform with an associated app store, support for the Unity 3D engine, fast internet, etc. Great stuff for those transatlantic flights that seem to take forever.<br />
<br />
<center>
<iframe allowfullscreen="allowfullscreen" frameborder="0" height="360" src="http://www.youtube.com/embed/o8TyFCoIUFY" width="480"></iframe>
</center>
Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-85538508886677623692012-08-16T14:36:00.003-07:002012-08-20T05:02:43.375-07:00Tough decisions, big plans and a bright future<div>
Browsed through my blog today. Realized I hadn't written much about what I've been up to. There's been a reason for that. One year ago I left my position at <a href="http://www.duke.edu/">Duke University</a>. It wasn't an easy decision. The Radiology eye tracking project I was involved with (and still am) was making good progress. I had been working long days since it started at <a href="http://www.stanford.edu/">Stanford</a> in 2009, and we were doing pretty neat stuff with volumetric medical image datasets. </div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhobs8LehKC3rIERcLYFBICH9lf2sBCOWlaJtliSIX78GgbPNydiOMhYh1zMJN6gbXD1QnYZ0POVmO8lNO0guQFA39Cb3d0osmxZ7GQ62ToZx9falePGpm-rAFmBFlw7awOoD7UzrwFAb8/s1600/3DGazePaths.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="180" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhobs8LehKC3rIERcLYFBICH9lf2sBCOWlaJtliSIX78GgbPNydiOMhYh1zMJN6gbXD1QnYZ0POVmO8lNO0guQFA39Cb3d0osmxZ7GQ62ToZx9falePGpm-rAFmBFlw7awOoD7UzrwFAb8/s400/3DGazePaths.png" width="400" /></a></div>
<div style="text-align: center;">
<i>The Stanford/Duke Radiology eye tracking project and our novel approach to volumetric gaze data.</i></div>
<div>
<br /></div>
<div>
At the same time I spent nights and weekends working on the open source <a href="http://sourceforge.net/projects/gazetrackinglib/">ITU Gaze Tracker</a> together with Javier San Agustin. Somehow I always had the feeling that we should get back together; great things just seemed to happen when we did. So after my grand tour of the US and countless Skype meetings over six months we had a plan. The four former PhD students from the <a href="http://www.gazegroup.org/">ITU Gazegroup</a> were to start an eye tracking company. At first we called it <i><a href="http://www.senseye.net/">Senseye</a></i> but later changed it to <i><a href="http://theeyetribe.com/">The Eye Tribe</a></i> due to trademark issues. </div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVfPP19FdZUGX3wWW5ydP6W37OH1LO5RKfLou0JADKecSD57qnJWu9BNLcr-TeBIiREmovN9VLNUn1VFNcW5dMp6LE-WfBFc9KIs_NzV1OsqnO1STc4phxG_xyFNCZbHY6H37YXdwxEQg/s1600/380335_290522524370942_246151968807998_636466_2119822408_n.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="325" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVfPP19FdZUGX3wWW5ydP6W37OH1LO5RKfLou0JADKecSD57qnJWu9BNLcr-TeBIiREmovN9VLNUn1VFNcW5dMp6LE-WfBFc9KIs_NzV1OsqnO1STc4phxG_xyFNCZbHY6H37YXdwxEQg/s400/380335_290522524370942_246151968807998_636466_2119822408_n.jpg" width="500" /></a></div>
<div style="text-align: center;">
<i>The Eye Tribe as of Spring 2012 at the US embassy reception. </i></div>
<div style="text-align: center;">
<i><br /></i></div>
<div>
We decided early not to go for the established market. It's a red ocean with a couple of fairly big players that have been working on their high-tech creations for years; it's a low-volume/high-margin game with intricate and expensive solutions, primarily for the research and assistive markets. </div>
<div>
<br /></div>
<div>
<a href="http://www.theeyetribe.com/">The Eye Tribe</a> intends to <i>innovate and disrupt </i>by bringing eye tracking to post-pc devices in the <i>consumer</i> market. It just doesn't happen with devices that cost several thousand dollars. </div>
<div>
<br /></div>
<div>
After twelve months of executing our plan we recently raised funds from a group of European investors to accelerate (as covered by <a href="http://thenextweb.com/eu/2012/08/16/the-eye-tribe-raises-800000-let-control-phone-eyes/">The Next Web</a>). The team has grown and we are looking to make additional hires in the near future. Perhaps <i><b>you</b></i> would like to <a href="http://theeyetribe.com/careers/">join the tribe</a> and be part of something great? There are some very interesting things happening in the near future; for the skilled it's always best to get in early.</div>
<div>
<br /></div>
<div>
One year ago I traded a warm North Carolina for a cold Copenhagen, a relationship for loneliness, a big house for a small apartment and a sports car for a bicycle. Time will tell if that was the right thing to do; with big plans, full commitment and funding in place, it is so far, so good.</div>
<div>
<br /></div>
Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com1tag:blogger.com,1999:blog-5651110511478705854.post-61696294448941786392012-07-30T09:22:00.000-07:002012-07-30T09:24:45.498-07:00What the mind can conceive, it can achieve.Today marks a historic day as the <a href="http://www.itu.dk/">ITU</a> <a href="http://gazegroup.org/">GazeGroup.org</a> open source <a href="http://www.gazegroup.org/downloads/23-gazetracker/">Gaze Tracker</a> has been downloaded over <a href="http://sourceforge.net/projects/gazetrackinglib/files/stats/map?dates=2009-03-19%20to%202012-07-30">30,000 times</a>. Although the current version was released in October 2010, we're still seeing approximately 1,000 downloads per month. We're really happy to see how widely distributed it has become, reaching all corners of the planet. When we released the first version, back in 2009, we had no idea it would reach distant places such as <span style="font-family: inherit;">Kyrgyzstan</span>, Suriname or Burkina Faso. The objective was, and still is, to "democratize and provide access to eye tracking technology regardless of means or nationality". This milestone is an achievement that I'd like to thank everyone involved for.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiHZZcqP1V1zVoWU3_K-vF11By55NIIV1UGmhOwXtLDa_tAs1YnKkJ7-qm7AIEnZBIrBEd5-cm3PVf5OoQoU2Es-luFkpK8spZS-ccNV88Hw6KMaAZip9ZSxCFdtDq9VNdDA86yhyi5hyo/s1600/30k_downloads.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiHZZcqP1V1zVoWU3_K-vF11By55NIIV1UGmhOwXtLDa_tAs1YnKkJ7-qm7AIEnZBIrBEd5-cm3PVf5OoQoU2Es-luFkpK8spZS-ccNV88Hw6KMaAZip9ZSxCFdtDq9VNdDA86yhyi5hyo/s1600/30k_downloads.png" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<span style="font-family: inherit;"><u>Top 10 Countries</u></span><br />
<span style="background-color: white;"><span style="font-family: inherit;">10. <span style="line-height: 18px;">Brazil </span><span style="line-height: 18px; text-align: right;">720</span></span></span><span style="background-color: white;"><span style="font-family: inherit;"> </span></span><br />
<span style="background-color: white;"><span style="font-family: inherit;">9. <span style="line-height: 18px;">Denmark </span><span style="line-height: 18px; text-align: right;">803</span></span></span><span style="background-color: white;"><span style="font-family: inherit;"> </span></span><br />
<span style="background-color: white;"><span style="font-family: inherit;">8. <span style="line-height: 18px;">France </span><span style="line-height: 18px; text-align: right;">865</span></span></span><span style="background-color: white;"><span style="font-family: inherit;"> </span></span><br />
<span style="background-color: white;"><span style="font-family: inherit;">7. <span style="line-height: 18px;">China </span><span style="line-height: 18px; text-align: right;">888</span></span></span><span style="background-color: white;"><span style="font-family: inherit;"> </span></span><br />
<span style="background-color: white;"><span style="font-family: inherit;">6. <span style="line-height: 18px;">India </span><span style="line-height: 18px; text-align: right;">1,063</span></span></span><span style="background-color: white;"><span style="font-family: inherit;"> </span></span><br />
<span style="background-color: white;"><span style="font-family: inherit;">5. <span style="line-height: 18px;">Italy </span><span style="line-height: 18px; text-align: right;">1,170</span></span></span><span style="background-color: white;"><span style="font-family: inherit;"> </span></span><br />
<span style="background-color: white;"><span style="font-family: inherit;">4. <span style="line-height: 18px;">Japan </span><span style="line-height: 18px; text-align: right;">1,226</span></span></span><span style="background-color: white;"><span style="font-family: inherit;"> </span></span><br />
<span style="background-color: white;"><span style="font-family: inherit;">3. <span style="line-height: 18px;">United Kingdom </span><span style="line-height: 18px; text-align: right;">1,359</span></span></span><span style="background-color: white;"><span style="font-family: inherit;"> </span></span><br />
<span style="background-color: white;"><span style="font-family: inherit;">2. <span style="line-height: 18px;">Germany </span><span style="line-height: 18px; text-align: right;">2,266</span></span></span><span style="background-color: white;"><span style="font-family: inherit;"> </span></span><br />
<span style="background-color: white;"><span style="font-family: inherit;">1. <span style="line-height: 18px;">United States </span><span style="line-height: 18px; text-align: right;">4,647</span></span></span><br />
<span style="background-color: white;"><span style="font-family: inherit;"><span style="line-height: 18px; text-align: right;"><br /></span></span></span><span style="background-color: white; font-size: 13px; line-height: 18px; text-align: right;"><span style="font-family: inherit;">Top 10 total: 15,007 (50%) <a href="http://sourceforge.net/projects/gazetrackinglib/files/stats/map?dates=2009-03-19%20to%202012-07-30" style="color: #555555;">Full stats</a><span style="color: #555555;">.</span></span></span>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com1tag:blogger.com,1999:blog-5651110511478705854.post-86904353927509900412012-06-28T06:14:00.002-07:002012-06-28T06:14:53.370-07:00Dual scene-camera head-mounted eye tracking rig from Lancaster Uni.From the <a href="http://pervasiveconference.org/2012/">Pervasive 2012</a> conference held last week in Newcastle comes a demo of a dual scene-camera head-mounted eye tracking rig that enables users to move objects between two displays using the gaze position. The larger display acts as the "public" display (<span style="background-color: white;">digital signage etc.) while the smaller represents the personal handheld tablet/smartphone. Nifty idea from </span><span style="background-color: white;"><a href="http://jaysonturner.com/">Jayson Turner</a>, </span><span style="background-color: white;"><a href="https://www.andreas-bulling.de/">Andreas Bulling</a> and </span><span style="background-color: white;"><a href="http://ubicomp.lancs.ac.uk/~hwg/">Hans Gellersen</a>, all from the <a href="http://eis.comp.lancs.ac.uk/">Embedded Interactive Systems</a> group at <a href="http://www.lancs.ac.uk/">Lancaster University</a>. </span><br />
<br />
<center>
<iframe allowfullscreen="" frameborder="0" height="360" src="http://www.youtube.com/embed/2APbNU8bHgE" width="640"></iframe>
</center>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-54427856036343816692012-06-19T02:02:00.000-07:002012-06-19T02:06:48.493-07:00The Eye Tribe presents world's first eye controlled Windows 8 tabletIt slices, it dices! <a href="http://theeyetribe.com/">The Eye Tribe</a> from Copenhagen introduces the world's first Windows 8 eye tracking tablet. The small, lightweight add-on connects via USB; no additional cables or batteries needed. For the time being the specs are 30Hz, accuracy of 0.5 degrees and an exceptionally large tracking range. More info to follow.<br />
<br />
<iframe allowfullscreen="" frameborder="0" height="360" src="http://www.youtube.com/embed/ef0qLb8-4k8" width="640"></iframe><br />
<br />
<br />
The Eye Tribe, formerly known as <a href="http://gazeinteraction.blogspot.dk/search?q=senseye">Senseye</a>, have made significant progress in recent months. In January they won the Danish <a href="http://www.venturecup.dk/events/senseye/">Venture Cup</a>. They then went on to participate in the <a href="http://rbpc.rice.edu/">Rice RBPC</a>, the world's premier business plan competition, made it to the semi-finals and were awarded "<a href="http://www.incontextsolutions.com/press-and-events/press-releases-and-events/bid/137524/InContext-Solutions-Awards-Senseye-the-Disruptive-Technology-Award-During-2012-Rice-Business-Plan-Competition">Most Disruptive Technology</a>" while being mentioned in <a href="http://management.fortune.cnn.com/2012/04/10/rice-university-business-plan-competition-2/">Fortune Magazine</a> and the <a href="http://www.chron.com/business/article/Business-teams-battle-in-Rice-contest-3481500.php">Houston Chronicle</a>. In May the team won the <a href="http://mobihealthnews.com/17388/eye-control-for-smartphones-tablets-wins-european-ehealth-challenge/">eHealth Innovation Contest</a>, followed by the audience award at the Danish <a href="http://symbion.dk/symbion-nyhedsvisning/article/accelerace-deltagere-skal-repraesentere-danmark-ved-tech-all-stars-event/">Accelerace</a>, whereby they were selected to participate in the <a href="http://techallstars.eu/">Tech All Stars</a> event, which gives the most promising European startups the opportunity to pitch at the <a href="http://www.leweb.co/">LeWeb</a> conference in London on June 20th.<br />
<br />Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-9715643818494646702012-06-08T02:32:00.001-07:002012-06-08T02:32:36.402-07:00Eyecatcher - A 3D prototype combining Eyetracking with a Gestural CameraEyecatcher is a prototype combining eyetracking with a gestural camera on a dual screen setup. Created for the oil rig process industry, this project was a collaborative exploration between <a href="http://www.abb.com/cawp/abbzh254/c88a9570464dc1fbc1256b570051a4bd.aspx">ABB Corporate Research</a> and <a href="http://www.tii.se/groups/umea">Interactive Institute Umeå</a> (<a href="http://interactiondesign.se/">blog</a>).<br />
<br />
<br />
<center>
<iframe allowfullscreen="" frameborder="0" height="337" mozallowfullscreen="" src="http://player.vimeo.com/video/40518613" webkitallowfullscreen="" width="600"></iframe> <a href="http://vimeo.com/40518613">
</a></center>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-86551231359781942312012-06-03T13:28:00.000-07:002012-06-03T13:29:37.424-07:00Copenhagen Business School: PhD position available<table border="0" cellpadding="0" cellspacing="0" class="DataTable" style="font-family: Arial, Verdana, Helvetica, Geneva, Lucida, sans-serif; padding: 3px 5px 3px 3px;"><tbody>
<tr><td style="color: #002946; font-size: 12px;" width="15%"><img height="1" src="https://ssl1.peoplexs.com/Peoplexs22/images/RC_clear.gif" width="1" /></td><td style="color: #002946; font-size: 12px;" width="35%"><img height="1" src="https://ssl1.peoplexs.com/Peoplexs22/images/RC_clear.gif" width="1" /></td><td style="color: #002946; font-size: 12px;" width="50%"><img height="1" src="https://ssl1.peoplexs.com/Peoplexs22/images/RC_clear.gif" width="1" /></td></tr>
<tr><td class="Datalabel" colspan="3" style="color: #002946; font-size: 12px; font-weight: bold; margin-right: 5px;"></td></tr>
<tr><td class="Datafield" colspan="3" style="color: #002946; font-size: 12px; margin-left: 5px;" valign="top"><a href="http://www.cbs.dk/">Copenhagen Business School</a> invites applications for a vacant PhD scholarship in empirical modeling of eye movements in reading, writing and translation. The PhD position is offered at the <a href="http://www.cbs.dk/Forskning/Institutter-centre/Institutter/IBC">Department of International Business Communication</a> at the Copenhagen Business School (CBS). The Department of International Business Communication is a new department at CBS whose fields of interest include the role of language(-s) in interlingual and intercultural communication, the role of language and culture competences in organizations, the role of language and culture in communication technology and social technologies, as well as the teaching of language skills. The Department is dedicated to interdisciplinary and problem-oriented research.<br />
<br />
Considerable progress has been made in eye-tracking technology over the past decade, making it possible to capture gaze behavior with free head movements. However, the imprecision of the measured signal makes it difficult to analyze eye-gaze movements in reading tasks, where a precise local resolution of the gaze samples is required to track the reader's gaze path over a text. The PhD position will investigate methods to cancel out the noise from the gaze signal. The PhD candidate will investigate, design and implement empirically-based models of eye-gaze movements in reading which take into account physical properties of the visual system in addition to background information, such as the purpose of the reading activity, the structure of the text, the quality of the gaze signal, etc. The PhD candidate should have:<br />
<ul>
<li>an interest in cognitive modeling of human reading, writing and translation processes</li>
<li>a basic understanding of browser and eye-tracking technology</li>
<li>knowledge of probability theory and statistical modeling</li>
<li>advanced programming skills</li>
</ul>
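Canceling noise from a gaze signal typically starts with simple temporal filtering. As a minimal sketch of the idea (not the project's actual method; the window size is an illustrative assumption), a sliding median filter over recent samples suppresses the outlier spikes that otherwise scatter fixations across text lines:

```python
from collections import deque
from statistics import median

class MedianGazeFilter:
    """Sliding-window median filter for (x, y) gaze samples.

    The window size is an illustrative assumption; a real reading
    study would tune it against saccade latency, since too large a
    window smears genuine eye movements.
    """
    def __init__(self, window=5):
        self.xs = deque(maxlen=window)
        self.ys = deque(maxlen=window)

    def update(self, x, y):
        """Add a raw sample and return the filtered (x, y) estimate."""
        self.xs.append(x)
        self.ys.append(y)
        return median(self.xs), median(self.ys)
```

A single spurious sample far from the fixation point leaves the median output unchanged, which is exactly the robustness that per-word gaze-to-text mapping needs.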
<div>
More information <a href="http://www.cbs.dk/Om-CBS-Campus/Jobs-paa-CBS/Ledige-stillinger/Menu/Ph.d.studerende">available here</a>.</div>
</td></tr>
</tbody></table>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com1tag:blogger.com,1999:blog-5651110511478705854.post-88716147064714142442012-06-01T07:34:00.001-07:002012-06-01T07:34:53.552-07:00Temporal Control In the EyeHarp Gaze-Controlled Musical Interface<br />
The <a href="http://theeyeharp.blogspot.dk/">EyeHarp</a> that I <a href="http://gazeinteraction.blogspot.dk/2011/06/eyeharp-eye-tracking-based-musical.html">wrote about</a> last summer is a gaze controlled musical instrument built by Zacharias Vamvakousis. In the video below he demonstrates how the interface is driven by the <a href="http://www.gazegroup.org/">ITU Gaze Tracker</a> and used to compose a loop which he then improvises upon. On the hardware side a modified PS3 camera is used in combination with two infrared light sources. This setup was presented at the New Interfaces for Musical Expression (NIME 2012) conference in Ann Arbor a week ago, and it will be exhibited at Sónar, Barcelona, on 14-16 June 2012. Great to see such an innovative interface being made open source and combined with the ITU tracker. <br />
<br />
<iframe width="560" height="315" src="http://www.youtube.com/embed/dBvWW-emzGM" frameborder="0" allowfullscreen></iframe>
<ul>
<li>Vamvakousis, Z. and Ramirez, R. (2012) Temporal Control In the EyeHarp Gaze-Controlled Musical Interface. In the proceedings on the 12th International Conference on New Interfaces for Musical Expression. 21-23 May 2012. Ann Arbor, Michigan, USA. (<a href="http://mtg.upf.edu/system/files/publications/Temporal%20Control%20In%20the%20EyeHarp%20Gaze-Controlled%20Musical%20Interface.pdf">PDF</a>)</li>
</ul>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-65001904423995455032012-04-23T00:41:00.002-07:002012-07-30T10:33:56.520-07:00Noise Challenges in Monomodal Gaze Interaction (Skovsgaard, 2011)<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRdGZVIn-FxMuiDWF5mdY5lcQ73gHw2jdgsUg84UTOqeBL85Oi56TJcr61QYhzUGphhxOXKN_Ojmo83Kl-KslptuzKEhIX_Wlv-DNuuxNrxU_82rNOhrSmcCbuTJx-uAIutI2qUzaspiM/s1600/henrik.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em; text-align: left;"><span style="font-family: inherit;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRdGZVIn-FxMuiDWF5mdY5lcQ73gHw2jdgsUg84UTOqeBL85Oi56TJcr61QYhzUGphhxOXKN_Ojmo83Kl-KslptuzKEhIX_Wlv-DNuuxNrxU_82rNOhrSmcCbuTJx-uAIutI2qUzaspiM/s200/henrik.png" width="138" /></span></a><span style="font-family: inherit;"></span><br />
<div>
<span style="font-family: inherit;"><span style="color: #333333; font-family: inherit; line-height: 11.25pt;">Henrik </span></span>Skovsgaard of the ITU <a href="http://www.gazegroup.org/">Gaze Group</a> successfully defended his PhD thesis “Noise Challenges in Monomodal Gaze Interaction” at the IT University of Copenhagen on the 13th December 2011. The PhD thesis can be downloaded <a href="http://gazegroup.org/documents/Henrik_Skovsgaard_PhD.pdf">here</a>.<span style="font-family: inherit;"><span style="color: #333333; font-family: inherit; line-height: 11.25pt;"> </span><span style="color: #333333; font-family: inherit; line-height: 11.25pt;"> </span></span></div>
<br />
<div style="line-height: 18px; margin-bottom: 1em; margin-top: 1em; padding-bottom: 0px; padding-left: 0px; padding-right: 0px; padding-top: 0px;">
<div style="text-align: left;">
<strong style="color: #333333;"><span style="color: #666666;"><span style="background-color: white; font-family: inherit;">ABSTRACT</span></span></strong></div>
</div>
Modern graphical user interfaces (GUIs) are designed with able-bodied users in mind. Operating these interfaces can be impossible for some users who are unable to control the conventional mouse and keyboard. An eye tracking system offers possibilities for independent use and improved quality of life via dedicated interface tools especially tailored to the users’ needs (e.g., interaction, communication, e-mailing, web browsing and entertainment). Much effort has been put towards robustness, accuracy and precision of modern eye-tracking systems and there are many available on the market. Even though gaze tracking technologies have undergone dramatic improvements over the past years, the systems are still very imprecise. This thesis deals with current challenges of monomodal gaze interaction and aims at improving access to technology and interface control for users who are limited to the eyes only. Low-cost equipment in eye tracking contributes toward improved affordability but potentially at the cost of introducing more noise in the system due to the lower quality of hardware. This implies that methods of dealing with noise and creative approaches towards getting the best out of the data stream are most wanted. The work in this thesis presents three contributions that may advance the use of low-cost monomodal gaze tracking and research in the field:<ul style="color: #333333; line-height: 18px; list-style: none; margin: 1em 0px; padding: 0px;">
<li class="MsoNormal" style="background-image: url(http://www.gazegroup.org/templates/ja_purity/images/bullet.gif); background-position: 18px 8px; background-repeat: no-repeat no-repeat; color: black; line-height: 21px; padding-left: 30px; text-align: left;"><span style="background-color: white; font-family: inherit;">An assessment of a low-cost open-source gaze tracker and two eye tracking systems through an accuracy and precision test and a performance evaluation. </span></li>
<li class="MsoNormal" style="background-image: url(http://www.gazegroup.org/templates/ja_purity/images/bullet.gif); background-position: 18px 8px; background-repeat: no-repeat no-repeat; color: black; line-height: 21px; padding-left: 30px; text-align: left;"><span style="background-color: white; font-family: inherit;">Development and evaluation of a novel innovative 3D typing system with high tolerance to noise that is based on continuous panning and zooming.</span></li>
<li class="MsoNormal" style="background-image: url(http://www.gazegroup.org/templates/ja_purity/images/bullet.gif); background-position: 18px 8px; background-repeat: no-repeat no-repeat; color: black; line-height: 21px; padding-left: 30px; text-align: left;"><span style="background-color: white; font-family: inherit;">Development and evaluation of novel selection tools that compensate for noisy input during small-target selections in modern GUIs. </span></li>
</ul>
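The third contribution, selection tools that compensate for noisy input, usually builds on dwell activation over a tolerance zone around the target. As a minimal sketch of that general idea (not the thesis' actual tools; the radius and dwell threshold below are assumed values for illustration):

```python
import math

class DwellSelector:
    """Trigger a selection when gaze stays within `radius` pixels of a
    target for `dwell_samples` consecutive samples.

    Radius and dwell length are illustrative assumptions; real tools
    trade them off against noise level and selection speed.
    """
    def __init__(self, target, radius=50.0, dwell_samples=30):
        self.target = target
        self.radius = radius
        self.dwell_samples = dwell_samples
        self.count = 0

    def update(self, x, y):
        """Feed one gaze sample; return True once the dwell completes."""
        if math.hypot(x - self.target[0], y - self.target[1]) <= self.radius:
            self.count += 1
        else:
            self.count = 0  # noise pushed gaze off target; restart dwell
        return self.count >= self.dwell_samples
```

Enlarging the radius makes the tool tolerant to imprecise trackers, at the cost of making small neighbouring targets ambiguous, which is precisely the small-target problem the thesis addresses.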
<div style="text-align: left;">
<span style="background-color: white; font-family: inherit;"><span style="line-height: 18px;">This thesis may be of particular interest for those working on the use of eye trackers for gaze interaction and how to deal with reduced data quality. </span><span style="line-height: 18px;">The work in this thesis is accompanied by several software applications developed for the research projects that can be freely downloaded from the eyeInteract appstore (</span><a href="http://www.eyeinteract.com/" style="color: #006699; line-height: 18px;">http://www.eyeinteract.com</a><span style="line-height: 18px;">).</span></span></div>
<div style="text-align: left;">
<span style="background-color: white; font-family: inherit;"><br /></span></div>
<div style="text-align: left;">
<strong style="color: #333333; line-height: 18px;"><span style="color: #666666;"><span style="background-color: white; font-family: inherit;">SUPERVISORS</span></span></strong></div>
<ul style="color: #333333; line-height: 18px; list-style: none; margin: 1em 0px; padding: 0px;">
<a href="http://gazegroup.org/documents/Henrik_Skovsgaard_PhD.pdf" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em; text-align: left;"><span style="font-family: inherit;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiDLxuk7btBAZBo4XDQSpUoJVIfIPWRgjo-jsY7fLGeFf09Cev67IS7ZpkYzBAGa9J5M-PIUWb8wsOLesfKhOduL71zc0kjeaS6iE1clgbcry9u4wUukZ7A3kgKVQ9jNeu0NeGbPPDjAbU/s200/skovsgaard.png" width="155" /></span></a>
<li class="MsoNormal" style="background-image: url(http://www.gazegroup.org/templates/ja_purity/images/bullet.gif); background-position: 18px 8px; background-repeat: no-repeat no-repeat; line-height: 21px; padding-left: 30px; text-align: left;"><span style="background-color: white; font-family: inherit;">Associate professor <a href="http://www.itu.dk/research/inc/?page_id=3#john" style="color: #006699;">John Paulin Hansen</a> - ITU Copenhagen (main supervisor)</span></li>
<li class="MsoNormal" style="background-image: url(http://www.gazegroup.org/templates/ja_purity/images/bullet.gif); background-position: 18px 8px; background-repeat: no-repeat no-repeat; line-height: 21px; padding-left: 30px; text-align: left;"><span style="background-color: white; font-family: inherit;">Associate professor <a href="http://www.itu.dk/research/inc/?page_id=3#dan" style="color: #006699;">Dan Witzner Hansen</a> - ITU Copenhagen (secondary supervisor)</span></li>
</ul>
<h1 style="color: #333333; line-height: 18px; margin: 1em 0px; padding: 0px; text-align: left;">
<span style="background-color: white; font-family: inherit; font-size: small;"><strong><span style="color: #666666;">ASSESSMENT COMMITTEE</span></strong><span style="line-height: 11.25pt;"> </span></span></h1>
<ul style="color: #333333; line-height: 18px; list-style: none; margin: 1em 0px; padding: 0px;">
<li style="background-position: 18px 8px; line-height: 21px; padding-left: 30px; text-align: left;"><span style="line-height: 11.25pt;"><span style="background-color: white; font-family: inherit;">Professor <a href="http://www.kasperhornbaek.dk/" style="color: #006699;">Kasper Hornbæk</a> - University of Copenhagen, Denmark</span></span></li>
<li style="background-position: 18px 8px; line-height: 21px; padding-left: 30px; text-align: left;"><span style="line-height: 11.25pt;"><span style="background-color: white; font-family: inherit;">Associate professor <a href="http://www.yorku.ca/mack/" style="color: #006699;">Scott MacKenzie</a> - York University, Canada</span></span></li>
<li style="background-position: 18px 8px; line-height: 21px; padding-left: 30px; text-align: left;"><span style="line-height: 11.25pt;"><span style="background-color: white; font-family: inherit;">Associate professor <a href="http://www.itu.dk/research/inc/?page_id=3#thomas" style="color: #006699;">Thomas Pederson</a> - IT University of Copenhagen, Denmark (chairman)</span></span></li>
</ul>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-82115108377617269352012-03-12T08:59:00.002-07:002012-03-12T09:04:55.954-07:00SMI RED-MWell, well, look here. A constellation of eye tracking manufacturers are joining in on the <i>affordable</i> market, perhaps defined some time ago by Mirametrix, who launched at $5k. Tobii has the PCEye, perfectly fine but at a cool $7k, and is showcasing the new IS2 chipset but apparently can't do CeBIT 2012 demos. The new player is <a href="http://www.smivision.com/" target="_blank">Sensomotoric Instruments</a>, known for their high-quality hardware and finely tuned algorithms. Their new contribution is the RED-M (M for mini?). Even if the price hasn't been announced, I would assume it's less than its high-speed FireWire sibling, perhaps similar to the PCEye pricing? <br />
<br />
The <a href="http://www.smivision.com/en/gaze-and-eye-tracking-systems/products/redm.html" target="_blank">M-version</a> is a small plastic device that connects via USB 2.0 (presumably two plugs, one for power). It measures 240x25x33mm and weighs only 130 grams - that's pretty small. This is a big difference from their prior models, which have been very solid and made of high-quality materials and professional components. The accuracy is specified at 0.5 degrees over a 50-75cm operating distance, with a head box of 320x210mm @ 60cm and a sample rate of 60/120Hz; in essence it's the low-end version of the RED series, whose top model is the super-fast RED500. Although it has yet to be demonstrated in an operational state, some material has appeared online. Below is the animated setup guide; you can find <a href="http://www.smivision.com/en/gaze-and-eye-tracking-systems/products/redm.html" target="_blank">more information</a> on their website. Looking good!<br />
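To put the 0.5-degree figure in perspective, angular accuracy converts to on-screen error via simple trigonometry: at the quoted 60cm viewing distance, tan(0.5°) × 600mm ≈ 5.2mm of error radius on the display. A quick check (the helper function name is mine, not SMI's):

```python
import math

def gaze_error_mm(accuracy_deg, distance_mm):
    """On-screen error radius implied by an angular accuracy at a given
    viewing distance: tan(accuracy) * distance."""
    return math.tan(math.radians(accuracy_deg)) * distance_mm

# At 0.5 deg and 600 mm this works out to roughly 5.2 mm,
# i.e. about the height of a line of text on a typical monitor.
error = gaze_error_mm(0.5, 600)
```

The same formula shows why accuracy matters more at the far end of the head box: at 75cm the same 0.5° already spans about 6.5mm.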
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="360" src="http://www.youtube.com/embed/fcZN1gG-Dqw" width="640"></iframe>
</div>
<ul>
<li><a href="http://www.smivision.com/fileadmin/user_upload/downloads/product_flyer/prod_smi_redm_techspecs.pdf" target="_blank">Technical specs</a> (pdf)</li>
<li><a href="http://www.smivision.com/fileadmin/user_upload/downloads/product_flyer/prod_smi__redm_eyetracker.pdf" target="_blank">Flyer </a>(pdf)</li>
</ul>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-38051583736594338442012-03-05T02:38:00.004-08:002012-03-05T05:53:22.725-08:00RealGaze GlassesJust came across the <a href="http://realgaze.com/home.html" target="_blank">RealGaze glasses</a>, which are being developed by Devon Greco et al. His father was diagnosed with ALS some years ago, and given that Devon has been tinkering with electronics since early on, he set out to build an eye tracker. For a prototype the result looks good; I guess the form factor feels familiar. There isn't too much meat available at the moment other than big ambitions to manufacture an affordable device. Most of us would love to see that happen!<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgxuTg_mMVOv7s27qRZ7rUOvhqHqmbQHJiXnOweKFeiFuamAj81W36bis6zG2yffrKo2ExLxIbIlRwOhFEVznj-ID-VXTdZdiiwRVgU7TsIl4xfEPS6k6V6-332PUZ2by556YhKrBqrwBU/s1600/realgaze.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="352" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgxuTg_mMVOv7s27qRZ7rUOvhqHqmbQHJiXnOweKFeiFuamAj81W36bis6zG2yffrKo2ExLxIbIlRwOhFEVznj-ID-VXTdZdiiwRVgU7TsIl4xfEPS6k6V6-332PUZ2by556YhKrBqrwBU/s640/realgaze.png" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<br />
<center><iframe allowfullscreen="" frameborder="0" height="380" src="http://www.youtube.com/embed/3oT09uLwRnM" width="540"></iframe></center>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com1tag:blogger.com,1999:blog-5651110511478705854.post-17491246152334955332012-02-16T10:30:00.001-08:002012-02-16T10:30:16.774-08:00Eyewriter & Not Impossible FoundationThe <a href="http://eyewriter.org/" target="_blank">Eyewriter</a> project, which helped Tony 'TemptOne' Quan to draw again, was originally documented by Mick <a href="http://theebelinggroup.com/" target="_blank">Ebeling</a>. This material has been incorporated into a documentary called "Getting Up", which recently won the audience award at <a href="http://www.slamdance.com/" target="_blank">Slamdance</a>. Movie buff Christopher Campbell wrote a <a href="http://blogs.indiewire.com/spout/getting-up-review" target="_blank">short review</a> on his blog. Great job on raising awareness; I hope you guys find funding to further develop the software.<br />
<br />
<br />
<br />
<center>
<iframe allowfullscreen="" frameborder="0" height="325" mozallowfullscreen="" src="http://player.vimeo.com/video/34535124" webkitallowfullscreen="" width="500"></iframe></center><center><a href="http://vimeo.com/34535124">Getting Up: The Tempt One Story Trailer</a><br />
</center>
<br />
<br />
<br />
<br />
<center>
<iframe allowfullscreen="" frameborder="0" height="325" mozallowfullscreen="" src="http://player.vimeo.com/video/34572885?title=0&byline=0&portrait=0" webkitallowfullscreen="" width="500"></iframe></center><center><a href="http://vimeo.com/34572885">How to build an EyeWriter</a><br />
</center>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-82905260876725403342012-02-15T10:26:00.001-08:002012-02-15T10:47:22.664-08:00Prelude for ETRA2012The <a href="http://www.etra2012.org/program.html" target="_blank">program</a> for the <a href="http://www.etra2012.org/" target="_blank">Eye Tracking Research & Applications</a> symposium (ETRA'12) is out and contains several really interesting papers this year.<br />
<br />
Two supplementary videos surfaced the other day; they come from the <a href="http://wwwisg.cs.uni-magdeburg.de/uise/news/en" target="_blank">User Interface & Software Engineering group</a> at the Otto-von-Guericke-Universität in Germany. In addition, the authors, <a href="http://wwwisg.cs.uni-magdeburg.de/isg/stellmach.html.en" target="_blank">Sophie Stellmach</a> and <a href="http://isgwww.cs.uni-magdeburg.de/~dachselt/" target="_blank">Raimund Dachselt</a>, have a paper submitted to the <a href="http://chi2012.acm.org/" target="_blank">ACM SIGCHI Conference on Human Factors in Computing Systems</a> (CHI'12). Abstracts and videos below.<br />
<br />
Abstract I (ETRA)<br />
Remote pan-and-zoom control for the exploration of large information spaces is of interest for various application areas, such as browsing through medical data in sterile environments or investigating geographic information systems on a distant display. In this context, considering a user's visual attention for pan-and-zoom operations could be of interest. In this paper, we investigate the potential of gaze-supported panning in combination with different zooming modalities: (1) a mouse scroll wheel, (2) tilting a handheld device, and (3) touch gestures on a smartphone. Thereby, it is possible to zoom in at a location a user currently looks at (i.e., gaze-directed pivot zoom). These techniques have been tested with Google Earth by ten participants in a user study. While participants were fastest with the already familiar mouse-only base condition, the user feedback indicates a particularly high potential of the gaze-supported pivot zooming in combination with a scroll wheel or touch gesture.
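The gaze-directed pivot zoom described above boils down to a familiar bit of viewport math: scale the scene while keeping the world point under the user's gaze fixed on screen. Here is a minimal sketch of that transform (my own illustration, not code from the paper; the function name and the flat viewport representation are assumptions):

```python
def pivot_zoom(view_x, view_y, scale, gaze_x, gaze_y, zoom_factor):
    """Zoom a 2D viewport about the gaze point, so the location the
    user is looking at stays put on screen while everything else
    scales around it.

    (view_x, view_y) is the screen offset of the world origin and
    `scale` the current zoom level; screen = view + world * scale.
    """
    new_scale = scale * zoom_factor
    # Recover the world point currently under the gaze...
    world_x = (gaze_x - view_x) / scale
    world_y = (gaze_y - view_y) / scale
    # ...and choose the new offset so it projects back to the same pixel.
    new_view_x = gaze_x - world_x * new_scale
    new_view_y = gaze_y - world_y * new_scale
    return new_view_x, new_view_y, new_scale
```

The zoom modality (scroll wheel, device tilt, or touch gesture) only decides `zoom_factor`; the gaze estimate supplies the pivot.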
<br />
<br />
<br />
<center>
<iframe allowfullscreen="" frameborder="0" height="380" src="http://www.youtube.com/embed/vamycFeMzmg" width="600"></iframe> </center><center><i>To be presented at ETRA'12.</i></center><br />
<div>
<br /></div>
Abstract II (ETRA)<br />
Since eye gaze may serve as an efficient and natural input for steering in virtual 3D scenes, we investigate the design of eye gaze steering user interfaces (UIs) in this paper. We discuss design considerations and propose design alternatives based on two selected steering approaches differing in input condition (discrete vs. continuous) and velocity selection (constant vs. gradient-based). The proposed UIs have been iteratively advanced based on two user studies with twelve participants each. In particular, the combination of continuous and gradient-based input shows a high potential, because it allows for gradually changing the moving speed and direction depending on a user's point-of-regard. This has the advantage of reducing overshooting problems and dwell-time activations. We also investigate discrete constant input for which virtual buttons are toggled using gaze dwelling. As an alternative, we propose the Sticky Gaze Pointer as a more flexible way of discrete input.
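Dwell-time activation, which the abstract mentions for toggling virtual buttons, is one of the oldest gaze-only selection schemes: an action fires once the gaze has rested on a target continuously for a set duration. A minimal sketch of that logic (my own illustration under assumed names, not the authors' implementation):

```python
class DwellButton:
    """Fire once when gaze has rested on this target continuously
    for `dwell_time` seconds; leaving the target resets the timer."""

    def __init__(self, dwell_time=0.5):
        self.dwell_time = dwell_time
        self.enter_time = None   # timestamp when gaze entered the target
        self.fired = False       # avoid re-firing within one dwell

    def update(self, gaze_on_target, now):
        """Feed one gaze sample; returns True only on the frame the
        dwell completes."""
        if not gaze_on_target:
            self.enter_time = None
            self.fired = False
            return False
        if self.enter_time is None:
            self.enter_time = now
        if not self.fired and now - self.enter_time >= self.dwell_time:
            self.fired = True
            return True
        return False
```

The overshooting and unintended-activation problems the paper attributes to this scheme follow directly from the structure: any fixation long enough to read a button also activates it, which is why the continuous, gradient-based alternative is attractive.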
<br />
<br />
<br />
<center>
<iframe allowfullscreen="" frameborder="0" height="380" src="http://www.youtube.com/embed/HcIWxXEY6Ec" width="600"></iframe></center><center><i>To be presented at ETRA'12.</i>
</center>
<br />
<br />
Abstract III (CHI)<br />
While eye tracking has a high potential for fast selection tasks, it is often regarded as error-prone and unnatural, especially for gaze-only interaction. To improve on that, we propose gaze-supported interaction as a more natural and effective way combining a user's gaze with touch input from a handheld device. In particular, we contribute a set of novel and practical gaze-supported selection techniques for distant displays. Designed according to the principle <i>gaze suggests, touch confirms</i>, they include an enhanced gaze-directed cursor, local zoom lenses and more elaborated techniques utilizing manual fine positioning of the cursor via touch. In a comprehensive user study with 24 participants, we investigated the potential of these techniques for different target sizes and distances. All novel techniques outperformed a simple gaze-directed cursor and showed individual advantages. In particular those techniques using touch for fine cursor adjustments (MAGIC touch) and for cycling through a list of possible close-to-gaze targets (MAGIC tab) demonstrated a high overall performance and usability.
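The "gaze suggests, touch confirms" split can be pictured as a small state machine: the cursor rides the coarse, jittery gaze estimate while the touch surface is untouched; at touch-down the gaze anchor is frozen, and drag motion applies small relative offsets for precise adjustment. A sketch in that spirit (my own illustration with hypothetical names and a made-up gain value, not the paper's MAGIC touch implementation):

```python
class GazeTouchCursor:
    """Cursor follows gaze until a finger goes down; then the gaze
    position is frozen and touch drags nudge the cursor with a small
    gain for fine positioning."""

    def __init__(self, gain=0.5):
        self.gain = gain          # scale touch motion down for precision
        self.anchor = None        # gaze position frozen at touch-down
        self.offset = (0.0, 0.0)  # accumulated touch adjustment

    def update(self, gaze, touch_delta=None):
        """Return the cursor position for this frame.
        `touch_delta` is (dx, dy) while a finger is down, else None."""
        if touch_delta is None:
            # No touch: cursor rides the gaze estimate.
            self.anchor = None
            self.offset = (0.0, 0.0)
            return gaze
        if self.anchor is None:
            self.anchor = gaze    # freeze the gaze suggestion at touch-down
        dx, dy = touch_delta
        self.offset = (self.offset[0] + dx * self.gain,
                       self.offset[1] + dy * self.gain)
        return (self.anchor[0] + self.offset[0],
                self.anchor[1] + self.offset[1])
```

Freezing the anchor while the finger is down is the key design choice: it keeps gaze jitter from fighting the user's deliberate fine adjustments.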
<br />
<br />
<br />
<center><iframe allowfullscreen="" frameborder="0" height="380" src="http://www.youtube.com/embed/cTM_AKk1Lrw" width="600"></iframe> </center><center><i>To be presented at CHI'12.
</i></center>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0tag:blogger.com,1999:blog-5651110511478705854.post-45806057657443277622012-01-10T17:35:00.000-08:002012-01-10T17:35:10.789-08:00EyeTech EyeOnA video from <a href="http://www.eyetechds.com/" target="_blank">EyeTech</a> featuring Michael, who suffers from <a href="http://www.ncbi.nlm.nih.gov/pubmedhealth/PMH0002406/" target="_blank">Thoracic Outlet Syndrome</a> (TOS). It's a great little clip that shows what computer control without gaze-adapted interfaces comes down to. Luckily Michael can use voice recognition software for typing, as text input using eye movements alone is a cumbersome process (<a href="http://www.eyetechds.com/eye-tracking-mouse-helps-business-professional-suffering-from-thoracic-outlet-syndrome" target="_blank">source</a>).
<br />
<br />
<br />
<center>
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/vV3atqhTfo8" width="560"></iframe></center>Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com1tag:blogger.com,1999:blog-5651110511478705854.post-3530080986115578682011-12-12T10:17:00.001-08:002012-10-12T09:30:17.405-07:00Senseye, eye control for mobile devices<br />
<center>
<iframe allowfullscreen="" frameborder="0" height="315" src="http://www.youtube.com/embed/lVCpUmLinXM" width="560"></iframe>
<br /></center>
<ul>
<li><span style="font-size: small;"><a href="http://www.senseye.net/">Senseye website</a></span></li>
<li><span style="font-size: small;">The Next Web: <a href="http://thenextweb.com/mobile/2011/12/02/senseye-will-let-you-control-your-mobile-phone-with-your-eyes/">Senseye will let you control your mobile phone with your eyes</a></span></li>
<li><span style="font-size: small;">The Blaze: <a href="http://www.theblaze.com/stories/new-meaning-for-hands-free-control-your-phone-with-a-just-a-shifty-eye/">New meaning for hands free control </a></span></li>
</ul>
Martin Tallhttp://www.blogger.com/profile/16023736349313969571noreply@blogger.com0