Showing posts with label conference. Show all posts

Monday, June 6, 2011

Proceedings from Novel Gaze-Controlled Applications 2011 online

The proceedings from the Novel Gaze-Controlled Applications 2011 conference are now available online. The conference, which took place at the Blekinge Institute of Technology in Sweden during May 26-27, presented 11 talks covering a wide range of topics from gaming and gaze interaction to eye tracking solutions. Unfortunately I was unable to attend, but luckily I'll have a couple of days of interesting reading ahead. Kudos to Veronica Sundstedt and Charlotte Sennersten for organizing the event.
  • Gaze and voice controlled drawing
    Jan van der Kamp, Veronica Sundstedt
    Full text: PDF Online

  • Eye tracking within the packaging design workflow: interaction with physical and virtual shelves
    Chip Tonkin, Andrew D. Ouzts, Andrew T. Duchowski
    Full text: PDF Online

Monday, May 2, 2011

1st International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction

During UbiComp 2011 in Beijing in September, the 1st International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI) will be held. The keynote speaker is Jeff B. Pelz, who has considerable experience with eye tracking during natural tasks. The call for papers is out; see details below.
"Recent developments in mobile eye tracking equipment and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that are pervasively usable in everyday life. We call this new paradigm pervasive eye tracking – continuous eye monitoring and analysis 24/7. The potential applications for the ability to track and analyze eye movements anywhere and anytime call for new research to further develop and understand visual behaviour and eye-based interaction in daily life settings. PETMEI 2011 will focus on pervasive eye tracking as a trailblazer for mobile eye-based interaction and eye-based context-awareness. We provide a forum for researchers from human-computer interaction, context-aware computing, and eye tracking to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking."
Important Dates
  • Paper Submission: May 30, 2011
  • Notification of Acceptance: June 27, 2011
  • Camera-ready due: July 11, 2011
  • Workshop: September 18, 2011


Topics
Topics of interest cover computational methods, new applications and use cases, as well as eye tracking technology for pervasive eye tracking and mobile eye-based interaction. Topics of interest include, but are not limited to:


Methods
  • Computer vision tools for face, eye detection and tracking
  • Pattern recognition/machine learning for gaze and eye movement analysis
  • Integration of pervasive eye tracking and context-aware computing
  • Real-time multi-modality sensor fusion
  • Techniques for eye tracking on portable devices
  • Methods for long-term gaze and eye movement monitoring and analysis
  • Gaze modeling for development of conversational agents
  • Evaluation of context-aware systems and interfaces
  • User studies on impact of and user experience with pervasive eye tracking
  • Visual and non-visual feedback for eye-based interfaces
  • Interaction techniques including multimodal approaches
  • Analysis and interpretation of attention in HCI
  • Dual and group eye tracking
Applications
  • Mobile eye-based interaction with public displays, tabletops, and smart environments
  • Eye-based activity and context recognition
  • Pervasive healthcare, e.g. mental health monitoring or rehabilitation
  • Autism research
  • Daily life usability studies and market research
  • Mobile attentive user interfaces
  • Security and privacy for pervasive eye tracking systems
  • Eye tracking in automotive research
  • Eye tracking in multimedia research
  • Assistive systems, e.g. mobile eye-based text entry
  • Mobile eye tracking and interaction for augmented and virtual reality
  • Eye-based human-robot and human-agent interaction
  • Cognition-aware systems and user interfaces
  • Human factors in mobile eye-based interaction
  • Eye movement measures in affective computing
Technologies
  • New devices for portable and wearable eye tracking
  • Extension of existing systems for mobile interaction
See the submission details for more information. 

Monday, April 18, 2011

AutomotiveUI'11 - 3rd International Conference On Automotive User Interfaces and Interactive Vehicular Applications

"In-car interactive technology is becoming ubiquitous and cars are increasingly connected to the outside world. Drivers and passengers use this technology because it provides valuable services. Some technology, such as collision warning systems, assists drivers in performing their primary in-vehicle task (driving). Other technology provides information on myriad subjects or offers entertainment to the driver and passengers.

The challenge that arises from the proliferation of in-car devices is that they may distract drivers from the primary task of driving, with possibly disastrous results. Thus, one of the major goals of this conference is to explore ways in which in-car user interfaces can be designed so as to lessen driver distraction while still enabling valuable services. This is challenging, especially given that the design of in-car devices, which was historically the responsibility of car manufacturers and their parts suppliers, is now a responsibility shared among a large and ever-changing group of parties. These parties include car OEMs, Tier 1 and Tier 2 suppliers of factory-installed electronics, as well as the manufacturers of hardware and software that is brought into the car, for example on personal navigation devices, smartphones, and tablets.

As we consider driving safety, our focus in designing in-car user interfaces should not be purely on eliminating distractions. In-car user interfaces also offer the opportunity to improve the driver's performance, for example by increasing her awareness of upcoming hazards. They can also enhance the experience of all kinds of passengers in the car. To this end, a further goal of AutomotiveUI 2011 is the exploration of in-car interfaces that address the varying needs of different types of users (including disabled drivers, elderly drivers or passengers, and the users of rear-seat entertainment systems). Overall our goal is to advance the state of the art in vehicular user experiences, in order to make cars both safer and more enjoyable places to spend time." http://www.auto-ui.org



Topics include, but are not limited to:
* new concepts for in-car user interfaces
* multimodal in-car user interfaces
* in-car speech and audio user interfaces
* text input and output while driving
* multimedia interfaces for in-car entertainment
* evaluation and benchmarking of in-car user interfaces
* assistive technology in the vehicular context
* methods and tools for automotive user interface research
* development methods and tools for automotive user interfaces
* automotive user interface frameworks and toolkits
* detecting and estimating user intentions
* detecting/measuring driver distraction and estimating cognitive load
* biometrics and physiological sensors as a user interface component
* sensors and context for interactive experiences in the car
* user interfaces for information access (search, browsing, etc.) while driving
* user interfaces for navigation or route guidance
* applications and user interfaces for inter-vehicle communication
* in-car gaming and entertainment
* different user groups and user group characteristics
* in-situ studies of automotive user interface approaches
* general automotive user experience research
* driving safety research using real vehicles and simulators
* subliminal techniques for workload reduction



SUBMISSIONS
AutomotiveUI 2011 invites submissions in the following categories:

* Papers (Submission Deadline: July 11th, 2011)
* Workshops (Submission Deadline: July 25th, 2011)
* Posters & Interactive Demos (Submission Deadline: Oct. 10th, 2011)
* Industrial Showcase (Submission Deadline:  Oct. 10th, 2011)

For more information on the submission categories please check http://www.auto-ui.org/11/submit.php

Thursday, January 13, 2011

Call for papers: UBICOMM 2011

"The goal of the International Conference on Mobile Ubiquitous Computing, Systems, Services and Technologies, UBICOMM 2011, is to bring together researchers from the academia and practitioners from the industry in order to address fundamentals of ubiquitous systems and the new applications related to them. The conference will provide a forum where researchers shall be able to present recent research results and new research problems and directions related to them. The conference seeks contributions presenting novel research in all aspects of ubiquitous techniques and technologies applied to advanced mobile applications."   All tracks/topics are open to both research and industry contributions. More info.
Tracks:
  • Fundamentals
  • Mobility
  • Information Ubiquity
  • Ubiquitous Multimedia Systems and Processing
  • Wireless Technologies
  • Web Services
  • Ubiquitous networks
  • Ubiquitous devices and operative systems
  • Ubiquitous mobile services and protocols
  • Ubiquitous software and security
  • Collaborative ubiquitous systems
  • User and applications
Deadlines:
  • Submission (full paper) June 20, 2011
  • Notification July 31, 2011
  • Registration August 15, 2011
  • Camera ready August 20, 2011

Monday, January 10, 2011

Call for papers: ACIVS 2011

Acivs 2011 is a conference focusing on techniques for building adaptive, intelligent, safe and secure imaging systems. Acivs 2011 consists of four days of lecture sessions, with both regular (25 min) and invited presentations, as well as poster sessions. The conference will take place at Het Pand, Ghent, Belgium, on Aug. 22-25, 2011.

Topics

  • Vision systems, including multi-camera systems
  • Image and Video Processing (linear/non-linear filtering and enhancement, restoration, segmentation, wavelets and multiresolution, Markovian techniques, color processing, modeling, analysis, interpolation and spatial transforms, motion, fractals and multifractals, structure from motion, information geometry)
  • Pattern Analysis (shape analysis, data and image fusion, pattern matching, neural nets, learning, grammatical techniques) and Content-Based Image Retrieval
  • Remote Sensing (techniques for filtering, enhancing, compressing, displaying and analyzing optical, infrared, radar, multi- and hyperspectral airborne and spaceborne images)
  • Still Image and Video Coding and Transmission (still image/video coding, model-based coding, synthetic/natural hybrid coding, quality metrics, image and video protection, image and video databases, image search and sorting, video indexing, multimedia applications)
  • System Architecture and Performance Evaluation (implementation of algorithms, GPU implementation, benchmarking, evaluation criteria, algorithmic evaluation)
Proceedings
The proceedings of Acivs 2011 will be published by Springer Verlag in the Lecture Notes in Computer Science series. LNCS is published, in parallel to the printed books, in full-text electronic form via Springer Verlag's internet platform.

Deadlines
  • February 11, 2011: Full paper submission
  • April 15, 2011: Notification of acceptance
  • May 15, 2011: Camera-ready papers due
  • May 15, 2011: Registration deadline for authors of accepted papers
  • June 30, 2011: Early registration deadline
  • Aug. 22-25, 2011: Acivs 2011

Tuesday, January 26, 2010

ETRA 2k10 Program Announced!

The awaited program for this year's Eye Tracking Research and Applications symposium, held in Austin, Texas, March 22nd-24th, has been announced. The biennial get-together for leading eye movement research, targeting computer scientists, engineers, and behavioral researchers, is organized in conjunction with the European Communication By Gaze Interaction (COGAIN) association, which brings a certain focus on gaze-based interaction for individuals with physical motor control disabilities. This year's keynote will be given by Scott MacKenzie, Associate Professor of Computer Science and Engineering at York University, Canada.

The long papers section contains 18 entries reflecting the various areas of eye gaze research, from eye tracking, data analysis, visualization, cognitive studies, and interaction & control. In addition, the long list of short papers and a full poster section will ensure a worthwhile event for anyone interested in eye movement related research.

Update: Official detailed program now available.
Update: The papers are now available online.

Looking forward to meeting you there!

Eye tracking & technical achievements

Full papers:
  • Homography Normalization for Robust Gaze Estimation in Uncalibrated Setups
    Dan Witzner Hansen, Javier San Agustin, and Arantxa Villanueva.
    Full paper.

  • Head-Mounted Eye-Tracking of Infants’ Natural Interactions: A New Method
    John Franchak, Kari Kretch, Kasey Soska, Jason Babcock, and Karen Adolph. Full paper.

  • User-Calibration-Free Remote Gaze Estimation System
    Dmitri Model and Moshe Eizenman. Full paper.
Short papers:
  • The Pupillometric Precision of a Remote Video Eye Tracker
    Jeff Klingner. Short paper.

  • Biometric Identification via an Oculomotor Plant Mathematical Model
    Oleg Komogortsev, Sampath Jayarathna, Cecilia Aragon, and Mechehoul Mahmoud. Short paper.

  • SemantiCode: Using Content Similarity and Database-driven Matching to Code Wearable Eyetracker Gaze Data
    Daniel Pontillo, Thomas Kinsman, and Jeff Pelz. Short paper.

  • Gaze Estimation Method based on an Aspherical Model of the Cornea: Surface of Revolution about the Optical Axis of the Eye
    Takashi Nagamatsu, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka, and Michiya Yamamoto. Short paper.

  • User-calibration-free Gaze Tracking with Estimation of the Horizontal Angles between the Visual and the Optical Axes of Both Eyes
    Takashi Nagamatsu, Ryuichi Sugano, Yukina Iwamoto, Junzo Kamahara, and Naoki Tanaka. Short paper.

Posters:

  • Evaluation of a Low-Cost Open-Source Gaze Tracker
    John Hansen, Dan Witzner Hansen, Emilie Møllenbach, Martin Tall, Javier San Agustin, Maria Barrett, and Henrik Skovsgaard. Poster.

  • Measuring Vergence Over Stereoscopic Video with a Remote Eye Tracker
    Brian Daugherty, Andrew Duchowski, Donald House, and Celambarasan Ramasamy. Poster.

  • Learning Relevant Eye Movement Feature Spaces Across Users
    Zakria Hussain, Kitsuchart Pasupa, and John Shawe-Taylor. Poster.

  • Interactive Interface for Remote Administration of Clinical Tests Based on Eye Tracking
    Alberto Faro, Daniela Giordano, Concetto Spampinato, Davide De Tommaso, and Simona Ullo. Poster.

  • Robust Optical Eye Detection During Head Movement
    Jeffrey Mulligan and Kevin Gabayan. Poster.

  • Estimating 3D Point-of-regard and Visualizing Gaze Trajectories under Natural Head Movements
    Kentaro Takemura, Yuji Kohashi, Tsuyoshi Suenaga, Jun Takamatsu, and Tsukasa Ogasawara. Poster.

  • Eye Tracking with the Adaptive Optics Scanning Laser Ophthalmoscope
    Scott Stevenson, Austin Roorda, and Girish Kumar. Poster.

  • A Depth Compensation Method for Cross-Ratio Based Eye Tracking
    Flavio L. Coutinho and Carlos H. Morimoto. Poster.

  • Pupil Center Detection in Low Resolution Images
    Detlev Droege and Dietrich Paulus. Poster.

  • Development of Eye-Tracking Pen Display Based on Stereo Bright Pupil Technique
    Michiya Yamamoto, Takashi Nagamatsu, and Tomio Watanabe. Poster.

  • The Use of Eye Tracking for PC Energy Management
    Vasily Moshnyaga

  • Listing's and Donders' Laws and the Estimation of the Point-of-Gaze
    Elias Guestrin and Moshe Eizenman

Data processing & eye movement detection

Full papers:
  • A Vector-Based, Multi-Dimensional Scanpath Similarity Measure
    Halszka Jarodzka, Kenneth Holmqvist, and Marcus Nyström.
    Full paper.

  • Match-Moving for Area-Based Analysis of Eye Movements in Natural Tasks
    Andrew Duchowski, Wayne Ryan, Ellen Vincent, and Dina Battisto. Full paper.

  • Fixation-Aligned Pupillary Response Averaging
    Jeff Klingner. Full paper.

Posters:

  • Qualitative and Quantitative Scoring and Evaluation of the Eye Movement Classification Algorithms
    Oleg Komogortsev, Sampath Jayarathna, Do Hyong Koh, and Sandeep Munikrishne Gowda. Poster.

  • Group-Wise Similarity and Classification of Aggregate Scanpaths
    Thomas Grindinger, Andrew Duchowski, and Michael Sawyer. Poster.

Visualization

Full papers:
  • Visual Scanpath Representation
    Joseph Goldberg and Jonathan Helfman. Full paper.

  • Scanpath Comparison Revisited
    Andrew Duchowski, Jason Driver, Sheriff Jolaoso, Beverly Ramey, Ami Robbins, and William Tan. Full paper.

  • Scanpath Clustering and Aggregation
    Joseph Goldberg and Jonathan Helfman. Full paper.

  • Space-Variant Spatio-Temporal Filtering of Video for Gaze Visualization and Perceptual Learning
    Michael Dorr, Halszka Jarodzka, and Erhardt Barth.
    Full paper.

Posters:

  • Adapted Gaze Visualizations for Three-dimensional Virtual Environments
    Sophie Stellmach, Lennart Nacke, and Raimund Dachselt. Poster.

  • Visual Span and Other Parameters for the Generation of Heatmaps
    Pieter Blignaut. Poster.

Cognitive studies & HCI

Full papers:
  • Interpretation of Geometric Shapes - An Eye Movement Study
    Miquel Prats, Iestyn Jowers, Nieves Pedreira, Steve Garner, and Alison McKay. Full paper.

  • Understanding the Benefits of Gaze Enhanced Visual Search
    Pernilla Qvarfordt, Jacob Biehl, Gene Golovchinksy, and Tony Dunnigan. Full paper.

  • Image Ranking with Implicit Feedback from Eye Movements
    David Hardoon and Kitsuchart Pasupa. Full paper.

  • How the Interface Design Influences Users’ Spontaneous Trustworthiness Evaluations of Web Search Results: Comparing a List and a Grid Interface
    Yvonne Kammerer and Peter Gerjets. Full paper.

Short papers:

  • Have You Seen Any of These Men? Looking at Whether Eyewitnesses Use Scanpaths to Recognize Suspects in Photo Lineups
    Sheree Josephson and Michael Holmes. Short paper.

  • Contingency Evaluation of Gaze-Contingent Displays for Real-Time Visual Field Simulations
    Margarita Vinnikov and Robert Allison. Short paper.

  • Estimation of Viewer's Response for Contextual Understanding of Tasks of Using Features of Eye-movements
    Minoru Nakayama and Yuko Hayashi. Short paper.
Posters:
  • Gaze-based Web Search: The Impact of Interface Design on Search Result Selection
    Yvonne Kammerer and Wolfgang Beinhauer. Poster

  • Visual Search in the (Un)Real World: How Head-Mounted Displays Affect Eye Movements, Head Movements and Target Detection
    Tobit Kollenberg, Alexander Neumann, Dorothe Schneider, Tessa-Karina Tews, Thomas Hermann, Helge Ritter, Angelika Dierker, and Hendrik Koesling. Poster.

  • Quantification of Aesthetic Viewing Using Eye-Tracking Technology: The Influence of Previous Training in Apparel Design
    Juyeon Park, Marilyn DeLong, and Emily Woods. Poster.

  • Visual Attention for Implicit Relevance Feedback in a Content Based Image Retrieval
    Concetto Spampinato, Alberto Faro, Daniela Giordano, and Carmelo Pino. Poster.

  • Eye and Pointer Coordination in Search and Selection Tasks
    Hans-Joachim Bieg, Lewis Chuang, Roland Fleming, Harald Reiterer, and Heinrich Bülthoff. Poster.

  • Natural Scene Statistics at Stereo Fixations
    Yang Liu, Lawrence Cormack, and Alan Bovik. Poster.

  • Measuring Situation Awareness of Surgeons in Laparoscopic Training
    Geoffrey Tien, Bin Zheng, Stella Atkins, and Colin Swindells

  • Saliency-Based Decision Support
    Roxanne Canosa. Poster.

  • Inferring Object Relevance from Gaze in Dynamic Scenes
    Melih Kandemir, Veli-Matti Saarinen, and Samuel Kaski. Poster.

  • Using Eye Tracking to Investigate Important Cues for Representative Creature Motion
    Meredith McLendon, Ann McNamara, Tim McLaughlin, and Ravindra Dwivedi

  • Estimating Cognitive Load Using Remote Eye Tracking in a Driving Simulator
    Oskar Palinko, Andrew Kun, Alexander Shyrokov, and Peter Heeman

Computer and machine control

Full papers:
  • Alternatives to Single Character Entry and Dwell Time Selection on Eye Typing
    Mario Urbina and Anke Huckauf.
    Full paper.

  • Designing Gaze Gestures for Gaming: an Investigation of Performance
    Howell Istance, Aulikki Hyrskykari, Lauri Immonen, Santtu Mansikkamaa, and Stephen Vickers. Full paper.

  • ceCursor, a Contextual Eye Cursor for General Pointing in Windows Environments
    Marco Porta, Alice Ravarelli, and Giovanni Spagnoli. Full paper.

  • BlinkWrite2: An Improved Text Entry Method Using Eye Blinks
    Behrooz Ashtiani and Scott MacKenzie. Full paper.

Short papers:

  • Eye Movement as an Interaction Mechanism for Relevance Feedback in a Content-Based Image Retrieval System
    Yun Zhang, Hong Fu, Zhen Liang, Zheru Chi, and Dagan Feng. Short paper.

  • Gaze Scribing in Physics Problem Solving
    David Rosengrant. Short paper.

  • Content-based Image Retrieval Using a Combination of Visual Features and Eye Tracking Data
    Zhen Liang, Hong Fu, Yun Zhang, Zheru Chi, and Dagan Feng. Short paper.

  • Context Switching for Fast Key Selection in Text Entry Applications
    Carlos H. Morimoto and Arnon Amir. Short paper.

Posters:

  • Small-Target Selection with Gaze Alone
    Henrik Skovsgaard, Julio Mateo, John Flach, and John Paulin Hansen. Poster
  • What You See is Where You Go: Testing a Gaze-Driven Power Wheelchair for Individuals with Severe Multiple Disabilities
    Erik Wästlund, Kay Sponseller, and Ola Pettersson. Poster.

  • Single Gaze Gestures
    Emilie Møllenbach, Alastair Gale, Martin Lillholm, and John Paulin Hansen. Poster.

  • Using Vision and Voice to Create a Multimodal Interface for Microsoft Word 2007
    Tanya Beelders and Pieter Blignaut. Poster.

  • Towards Task-Independent Person Authentication Using Eye Movement Signals
    Tomi Kinnunen, Filip Sedlak, and Roman Bednarik

  • An Open Source Eye-gaze Interface: Expanding the Adoption of Eye-Gaze in Everyday Applications
    Craig Hennessey and Andrew Duchowski. Poster.

  • Pies with EYEs: The Limits of Hierarchical Pie Menus in Gaze Control
    Mario Urbina, Maike Lorenz, and Anke Huckauf. Poster.
  • Low-Latency Combined Eye and Head Tracking System for Teleoperating a Robotic Head in Real-Time
    Stefan Kohlbecher, Klaus Bartl, Stanislavs Bardins, and Erich Schneider. Poster.

Tuesday, December 8, 2009

Scandinavian Workshop on Applied Eye-tracking (SWAET) 2010.

The first call for papers for the annual Scandinavian Workshop on Applied Eye-Tracking (SWAET), organized by Kenneth Holmqvist and the team at the Lund University Humanities Laboratory, was just announced. SWAET 2010 will be held in Lund, Sweden, May 5-7th. The invited speakers are Gerry Altmann (blog) from the Dept. of Psychology at the University of York, UK, and Ignace Hooge (s1, s2) from the Dept. of Psychology at Utrecht University, Holland.

Visit the SWAET website for more information.

Update: Download the abstracts (pdf, 1Mb)

Wednesday, October 21, 2009

Medical Image Perception Society 2009 - Day two

Session 6. Performance Measurement II. Chair: Matthew Freedman, MD, MBA
  • Coding of FDG Intensity as a 3-D Rendered Height Mapping to Improve Fusion Display of Co-Registered PET-CT Images. RM Shah, C Wood, YP Hu, & LS Zuckier
  • Estimation of AUC from Normally Distributed Rating Data with Known Variance Ratio. A Wunderlich & F Noo
  • Using the Mean-to-Variance Ratio as a Diagnostic for Unacceptably Improper Binormal ROC Curves. SL Hillis & KS Berbaum
Session 7. Performance Measurement II. Chair: Stephen Hillis, PhD
  • BI-RADS Data Should Not be Used to Estimate ROC Curves. Y Jiang & CE Metz

  • Estimating the utility of screening mammography in large clinical studies. CK Abbey, JM Boone, & MP Eckstein

  • Issues Related to the Definition of Image Contrast, DL Leong & PC Brennan
Session 8. Models of Perceptual processing. Chair: Yulei Jiang, PhD
  • Channelized Hotelling Observers for Detection Tasks in Multi-Slice Images. L Platiša, B Goossens, E Vansteenkiste, A Badano & W Philips

  • Channelized Hotelling observers adapted to irregular signals in breast tomosynthesis detection tasks. I Diaz, P Timberg, CK Abbey, MP Eckstein, FR Verdun, C Castella, FO Bochud

  • Detecting Compression Artifacts in Virtual Pathology Images Using a Visual Discrimination Model. J Johnson & EA Krupinski

  • Automatic MRI Acquisition Parameters Optimization Using HVS-Based Maps. J Jacobsen, P Irarrázabal, & C Tejos

  • Parametric Assessment of Lesion Detection Using a Pre-whitened Matched Filter on Projected Breast CT Images. N Packard, CK Abbey, & JM Boone

  • Model Observers for Complex Discrimination Tasks: Deployment Assessment of Multiple Coronary Stents. S Zhang, CK Abbey, X Da, JS Whiting, & MP Eckstein
Session 9. Special Invited Session on Neuroscience and Medical Image Perception. Chair: Miguel Eckstein, PhD
  • Decoding Information Processing When Attention Fails: An Electrophysiological Approach. B Giesbrecht
  • Some Neural Bases of Radiological Expertise. SA Engel

Monday, June 1, 2009

COGAIN 2009 Proceedings now online

There is little reason to doubt the vitality of the COGAIN network. This year's proceedings present an impressive 18 papers spread out over one hundred pages, covering a wide range of areas from low-cost eye tracking, text entry, gaze input for gaming, and multimodal interaction to environment control, clinical assessments, and case studies. Unfortunately I was unable to attend the event this year (recently relocated), but with the hefty proceedings available online there is plenty of material to read through (program and links to authors here). Thanks to Arantxa Villanueva, John Paulin Hansen, and Bjarne Kjaer Ersboll for the editorial effort.

Tuesday, May 12, 2009

BBC News: The future of gadget interaction

Dan Simmons at the BBC reports on future technologies from the Science Beyond Fiction 2009 conference in Prague. The news item includes a section on the GazeCom project, which won the 2nd prize for its exhibit "Gaze-contingent displays and interaction". Their website hosts additional demonstrations.

"Gaze tracking is well-established and has been used before now by online advertisers who use it to decide the best place to put an advert. A novel use of the system tracks someone's gaze and brings into focus the area of a video being watched by blurring their peripheral vision.In the future, the whole image could also be panned left or right as the gaze approaches the edge of the screen. Film producers are interested in using the system to direct viewers to particular parts within a movie. However, interacting with software through simply looking will require accurate but unobtrusive eye tracking systems that, so far, remain on the drawing board... The European Commission (EC) is planning to put more cash into such projects. In April it said it would increase its investment in this field from 100m to 170m euros (£89m-£152m) by 2013. " (BBC source ) More information about the EC CORDIS : ICT program.

External link: BBC reporter Dan Simmons tests a system designed to use a driver's peripheral vision to flag up potential dangers on the road. It was recorded at the Science Beyond Fiction conference in Prague.

The GazeCom project involves the following partners:

ETRA 2010 Call for papers

ETRA 2010 will be the sixth biennial symposium in a series that focuses on all aspects of eye movement research across a wide range of disciplines. The goal of ETRA is to bring together computer scientists, engineers and behavioral scientists in support of a common vision of enhancing eye tracking research and applications. ETRA 2010 is being organized in conjunction with the European Communication by Gaze Interaction (COGAIN) research network that specializes in gaze-based interaction for the benefit of people with physical disabilities.

Update: List of accepted and presented papers.

Symposium Themes
  • Advances in Eye Tracking Technology and Data Analysis
    Eye tracking systems, calibration algorithms, data analysis techniques, noise reduction, predictive models, 3D POR measurement, low cost and natural light systems.
  • Visual Attention and Eye Movement Control
    Studies of eye movements in response to natural stimuli, driving studies, web use and usability studies.
  • Eye Tracking Applications
    Gaze-contingent displays, attentive user interfaces, gaze-based interaction techniques, security systems, multimodal interfaces, augmented and mixed reality systems, ubiquitous computing.
  • Special Theme: Eye Tracking and Accessibility
    Eye tracking has proved to be an effective means of making computers more accessible when the use of keyboards and mice is hindered by the task itself (such as driving), or by physical disabilities. We invite submissions that explore new methodological strategies, applications, and results that use eye tracking in assistive technologies for access to desktop applications, for environment and mobility control, and for gaze control of games and entertainment.
Two categories of submissions are being sought – Full Papers and Short Papers.
Full papers must be submitted electronically through the ETRA 2010 website and conform to the ACM SIGGRAPH proceedings category 2 format. Full paper submissions can have a maximum length of eight pages. Full paper submissions should be made in double-blind format, hiding authors' names and affiliations and all references to the authors' previous work. Those wishing to submit a full paper must submit an abstract in advance to facilitate the reviewing process. Accepted papers will be published in the ETRA 2010 proceedings, and the authors will give a 20 minute oral presentation of the paper at the conference.

Short papers may present work that has smaller scope than a full paper or may present late breaking results. These must be submitted electronically through the ETRA 2010 submission website and conform to the ACM SIGGRAPH proceedings category 3 format. Short paper submissions have a maximum length of four pages (but can be as short as a one-page abstract). Given the time constraints of this type of paper, submissions must be made in camera-ready format including authors' names and affiliations. Accepted submissions will be published in the ETRA 2010 proceedings. Authors will present a poster at the conference, and authors of the most highly rated submissions will give a 10 minute presentation of the paper in a Short Papers session. All submissions will be peer-reviewed by members of an international review panel and members of the program committee. Best Paper Awards will be given to the most highly ranked Full Papers and Short Papers.

Full Papers Deadlines
  • Sep. 30th, 2009 Full Papers abstract submission deadline
  • Oct. 7th, 2009 Full Papers submission deadline
  • Nov. 13th, 2009 Acceptance notification
Short Papers Deadlines
  • Dec. 2nd, 2009 Short Papers submission deadline
  • Jan. 8th, 2010 Short Papers acceptance notification
  • Jan. 15th, 2010 All camera ready papers due
More information on the ETRA website.

Wednesday, May 6, 2009

COGAIN 2009 Program announced

This year's Communication by Gaze Interaction (COGAIN) conference will be held on May 26th in Lyngby, Denmark, in connection with VisionDay (a four-day event on computer vision). Registration should be completed on or before May 14th. Download the program as pdf.

Update: the proceedings can be downloaded as pdf.


The program for May 26th
  • 08.00 Registration, exhibition, demonstrations, coffee, and rolls
SESSION I
  • 09.00 Welcome and introduction (Lars Pallesen, Rector @ DTU)
  • 09.10 Eye guidance in natural behaviour (B. W. Tatler)
  • 09.50 Achievements and experiences in the course of COGAIN (K. Raiha)
  • 10.30 Coffee, exhibition, demonstrations
SESSION II
  • 11.00 Joys and sorrows in communicating with gaze (A. Lykke-Larsen)
  • 11.30 An introduction to the 17 papers presented in the afternoon
  • 12.00 Lunch, exhibition, demonstrations, posters
SESSION III Track 1
SESSION III Track 2
  • 14.50 Coffee, exhibition, demonstrations, posters

SESSION IV Track 1
  • 15.30 Gameplay experience in a gaze interaction game (L. Nacke, S. Stellmach, D. Sasse & C. A. Lindley)
  • 15.50 Select commands in 3D game environments by gaze gestures (S. Vickers, H. Istance & A. Hyrskykari)
  • 16.10 GazeTrain: A case study of an action oriented gaze-controlled game (L. F. Laursen & B. Ersbøll)
  • 16.30 Detecting Search and Rescue Targets in Moving Aerial Images using Eye-gaze (J. Mardell, M. Witkowski & R. Spence)
  • 16.50 Feasibility Study for the use of Eye-Movements in Estimation of Answer Correctness (M. Nakayama & Y. Hayashi)
SESSION IV Track 2
  • 15.30 Eye Tracker Connectivity (G. Daunys & V. Vysniauskas)
  • 15.50 SW tool supporting customization of eye tracking algorithms (P. Novák & O. Štepánková)
  • 16.10 Multimodal Gaze-Based Interaction (S. Trösterer & J. Dzaack)
  • 16.30 Gaze Visualization Trends and Techniques (S. Stellmach, L. Nacke, R. Dachselt & C. A. Lindley)
  • 19.00 COGAIN 2009 dinner at Brede Spisehus

Monday, September 15, 2008

COGAIN 2008 Proceedings now online




Contents

Overcoming Technical Challenges in Mobile and Other Systems
  • Off-the-Shelf Mobile Gaze Interaction
    J. San Agustin and J. P. Hansen, IT University of Copenhagen, Denmark
  • Fast and Easy Calibration for a Head-Mounted Eye Tracker
    C. Cudel, S. Bernet, and M. Basset, University of Haute Alsace, France
  • Magic Environment
    L. Figueiredo, T. Nunes, F. Caetano, and A. Gomes, ESTG/IPG, Portugal
  • AI Support for a Gaze-Controlled Wheelchair
    P. Novák, T. Krajník, L. Přeučil, M. Fejtová, and O. Štěpánková, Czech Technical University, Czech Republic
  • A Comparison of Pupil Centre Estimation Algorithms
    D. Droege, C. Schmidt, and D. Paulus, University of Koblenz-Landau, Germany

Broadening Gaze-Based Interaction Techniques
  • User Performance of Gaze-Based Interaction with On-line Virtual Communities
    H. Istance, De Montfort University, UK; A. Hyrskykari, University of Tampere, Finland; S. Vickers, De Montfort University, UK; and N. Ali, University of Tampere, Finland

  • Multimodal Gaze Interaction in 3D Virtual Environments
    E. Castellina and F. Corno, Politecnico di Torino, Italy
  • How Can Tiny Buttons Be Hit Using Gaze Only?
    H. Skovsgaard and J. P. Hansen, IT University of Copenhagen, Denmark; J. Mateo, Wright State University, Ohio, US
  • Gesturing with Gaze
    H. Heikkilä, University of Tampere, Finland
  • NeoVisus: Gaze Driven Interface Components
    M. Tall, Sweden

Focusing on the User: Evaluating Needs and Solutions
  • Evaluations of Interactive Guideboard with Gaze-Communicative Stuffed-Toy Robot
    T. Yonezawa, H. Yamazoe, A. Utsumi, and S. Abe, ATR Intelligent Robotics and Communications Laboratories, Japan
  • Gaze-Contingent Passwords at the ATM
    P. Dunphy, A. Fitch, and P. Oliver, Newcastle University, UK
  • Scrollable Keyboards for Eye Typing
    O. Špakov and P. Majaranta, University of Tampere, Finland
  • The Use of Eye-Gaze Data in the Evaluation of Assistive Technology Software for Older People.
    S. Judge, Barnsley District Hospital Foundation, UK, and S. Blackburn, Sheffield University, UK
  • A Case Study Describing Development of an Eye Gaze Setup for a Patient with 'Locked-in Syndrome' to Facilitate Communication, Environmental Control and Computer Access.
    Z. Robertson and M. Friday, Barnsley General Hospital, UK