Maryam Sadeghi, a Master's student at the Medical Image Analysis Lab at Simon Fraser University in Canada, presents an interesting paper on using eye tracking for gaze-driven image segmentation. The research has been performed in cooperation with Geoff Tien (Ph.D. student), Dr. Hamarneh and Stella Atkins (principal investigators). More information is to be published on this page. Geoff Tien completed his M.Sc. thesis on gaze interaction in March under the title "Building Interactive Eyegaze Menus for Surgery" (abstract); unfortunately, I have not been able to locate an electronic copy of that document.
Abstract
"This paper explores a novel approach to interactive user-guided image segmentation, using eyegaze information as an input. The method includes three steps: 1) eyegaze tracking for providing user input, such as setting object and background seed pixel selection; 2) an optimization method for image labeling that is constrained or affected by user input; and 3) linking the two previous steps via a graphical user interface for displaying the images and other controls to the user and for providing real-time visual feedback of eyegaze and seed locations, thus enabling the interactive segmentation procedure. We developed a new graphical user interface supported by an eyegaze tracking monitor to capture the user's eyegaze movement and fixations (as opposed to traditional mouse moving and clicking). The user simply looks at different parts of the screen to select which image to segment, to perform foreground and background seed placement and to set optional segmentation parameters. There is an eyegaze-controlled "zoom" feature for difficult images containing objects with narrow parts, holes or weak boundaries. The image is then segmented using the random walker image segmentation method. We performed a pilot study with 7 subjects who segmented synthetic, natural and real medical images. Our results show that getting used the new interface takes about only 5 minutes. Compared with traditional mouse-based control, the new eyegaze approach provided a 18.6% speed improvement for more than 90% of images with high object-background contrast. However, for low contrast and more difficult images it took longer to place seeds using the eyegaze-based "zoom" to relax the required eyegaze accuracy of seed placement."
Download paper as pdf. The custom interface is used to place background (red) and object (green) seeds, which are used in the segmentation process. The custom fixation detection algorithm triggers a mouse click at the gaze position if 20 of the previous 30 gaze samples lie within a 50-pixel radius.
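That 20-of-30-samples-within-50-pixels rule is simple enough to sketch. Below is a minimal, hypothetical version of such a dwell-based fixation detector; the function and parameter names are mine, and I have assumed the radius is measured from the newest sample, which the paper's description leaves unspecified:

```python
from collections import deque
import math
import random


def make_fixation_detector(window=30, required=20, radius=50.0):
    """Return a feed(x, y) function implementing the 20-of-30-within-50px rule."""
    samples = deque(maxlen=window)

    def feed(x, y):
        """Feed one gaze sample; return (x, y) if a fixation 'click' fires, else None."""
        samples.append((x, y))
        if len(samples) < window:
            return None
        # Count buffered samples within `radius` pixels of the newest sample
        # (assumed center of the fixation; the original center is unspecified).
        close = sum(1 for sx, sy in samples if math.hypot(sx - x, sy - y) <= radius)
        return (x, y) if close >= required else None

    return feed


if __name__ == "__main__":
    # Simulated gaze drifting around (400, 300); a real system would feed
    # samples from the eye tracker and then issue the synthetic mouse click.
    random.seed(0)
    detect = make_fixation_detector()
    for _ in range(40):
        hit = detect(400 + random.uniform(-10, 10), 300 + random.uniform(-10, 10))
        if hit is not None:
            print("fixation 'click' at", hit)
            break
```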
The results indicate a certain degree of feasibility for gaze-assisted segmentation; however, real-life situations often contain more complex images where object borders are less well defined. This is also reflected in the results, where the CT brain scan represents the difficult category. For an initial study the results are interesting, and it's likely that we'll see more gaze interaction within domain-specific applications in the near future.
- Maryam Sadeghi, Geoff Tien, Ghassan Hamarneh, and Stella Atkins. Hands-free Interactive Image Segmentation Using Eyegaze. In SPIE Medical Imaging 2009: Computer-Aided Diagnosis, Proceedings of the SPIE, Volume 7260. (pdf)