WUD 2010 D.Miniotas - Gaze-Based Interaction


<ul>
<li><p>Gaze-Based Interaction</p><p>Darius Miniotas</p><p>Dept. of Electronic Systems, VGTU</p><p>darius.miniotas@el.vgtu.lt</p></li>
<li><p>Gaze-Based Interaction: Why?*</p><p>The only option for disabled users</p><p>An option for others in hands-busy situations</p><p>Eye movements</p><p>- are extremely fast</p><p>- are natural</p><p>- require little conscious effort</p><p>* Some of the information in these slides is based on the materials presented in a tutorial at NordiCHI 2004 by Kari-Jouko Räihä, Aulikki Hyrskykari and Päivi Majaranta</p></li>
<li><p>Demo from the German Research Center for AI</p><p>Text 2.0 [2009]: Imagine there were input devices that could allow text to know if and how it is being read. How would this change the reading experience?</p><p>http://text20.net/node/4</p></li>
<li><p>Technological Challenges</p><p>Cost of equipment</p><p>- 2,000–25,000 EUR</p><p>- mass production could lower the cost by an order of magnitude</p><p>Usability of equipment</p><p>- remote trackers are convenient but allow only small head movements</p><p>- head-mounted trackers are more accurate but obtrusive</p><p>Need for calibration</p><p>- for every user at the beginning of a tracking session</p><p>- recalibration is often required during prolonged use</p></li>
<li><p>Types of Eye Tracking Applications</p><p>Off-line applications</p><p>- visualizing gaze data</p><p>- analyzing gaze behavior</p><p>- modifying images based on viewing data</p><p>On-line (interactive) applications</p><p>- command-based</p><p>- attentive</p></li>
<li><p>Command-Based Interaction: Challenges</p><p>Eyes are normally used for observation, not for control</p><p>- humans are not used to activating objects just by looking at them</p><p>Gaze behaves very differently from other modalities used for controlling computers (hands, voice)</p><p>- intentional control of the eyes is difficult and stressful</p><p>- the gaze is easily attracted by external events</p><p>- precise control of the eyes is difficult</p><p>- poorly implemented eye control can be extremely annoying</p></li>
<li><p>Midas Touch Problem</p><p>Most of the time the eyes are used for obtaining information with no intent to initiate commands</p><p>Users easily become afraid of looking at eye-active objects or areas of the window</p><p>Using the eyes for commands requires the development of new forms of interaction</p></li>
<li><p>Expanding Targets [CHI 2004]</p></li>
<li><p>Selecting Standard-Size Menu Items [ICMI 2005]</p></li>
<li><p>Gaze-Aware Applications</p><p>Command-and-Control applications</p><p>- typing (conventional)</p><p>- typing (using gaze gestures)</p><p>- drawing</p><p>- other</p><p>Multimodal Applications</p><p>Gaze-Contingent Displays</p><p>Attentive Interfaces</p></li>
<li><p>Typing by Gaze</p><p>A typical eye typing system has</p><p>- an on-screen keyboard</p><p>- an eye tracker to record eye movements</p><p>- a computer to analyze gaze behavior</p><p>To type by gaze, the user</p><p>- focuses on a letter</p><p>- gets feedback from the system</p><p>- selects the item in focus</p><p>EC Key, a typical keyboard</p></li>
<li><p>Compact Keyboard Layouts</p></li>
<li><p>Dasher</p><p>Demo: http://www.youtube.com/watch?v=0d6yIquOKQ0</p></li>
<li><p>Using Gaze Gestures for Typing</p></li>
<li><p>Drawing with the Eye [2003]</p></li>
<li><p>Other Eye-Controlled Applications</p><p>- e-mail</p><p>- Internet browsing</p><p>- accessing online libraries</p><p>- games</p><p>- interaction with online virtual communities</p></li>
<li><p>EyeScroll [2007]</p><p>Gaze-enhanced scrolling allows for automatic, adaptive scrolling of content being viewed on the screen</p><p>Supports multiple scrolling modes depending on the user's preference and reading style</p><p>Users can read the content as it scrolls smoothly, or have it scroll once they have reached the bottom of the screen</p></li>
<li><p>EyePassword, EyeSaver [2007]</p><p>Gaze-based password/PIN entry: prevents shoulder surfing and does not generate any keyboard or mouse events, making standard event loggers harder to use</p><p>Screen saver turns on when the user looks away from the screen, and off when the user looks back at the screen</p></li>
<li><p>EyePhone [2010]</p><p>Developed at Dartmouth College (USA)</p><p>Tracks a person's eye relative to a phone's screen</p><p>Users activate applications by blinking</p><p>Demo: http://www.technologyreview.com/computing/25369/</p></li>
<li><p>Interactive Applications</p><p>Command-based interaction</p><p>- typing (conventional)</p><p>- typing (using gaze gestures)</p><p>- drawing</p><p>Gaze-aware interfaces</p><p>- multimodal input</p><p>- gaze-contingent displays</p><p>- attentive interfaces</p></li>
<li><p>Gaze-Aware Applications</p><p>Command-and-Control applications</p><p>- typing (conventional)</p><p>- typing (using gaze gestures)</p><p>- drawing</p><p>- other</p><p>Multimodal Applications</p><p>Gaze-Contingent Displays</p><p>Attentive Interfaces</p></li>
<li><p>Gaze as Mouse Accelerator [1999]</p><p>MAGIC pointing</p><p>Two strategies for warping the cursor to the gaze point</p><p>- always when the point of gaze moves (liberal)</p><p>- only after moving the mouse a little (cautious)</p><p>Empirical results</p><p>- liked by the users</p><p>- interaction was slightly slowed down by the cautious strategy, but the liberal strategy was faster than using just the mouse</p></li>
<li><p>Gaze + Hotkeys [2007]</p><p>Performs basic mouse operations</p><p>Reduces or eliminates dependency on the mouse for most everyday tasks such as surfing the web</p><p>A look-press-look-release action allows for increasingly accurate selection</p><p>Demo: http://hci.stanford.edu/research/GUIDe/index.html</p></li>
<li><p>Gaze + Speech [2006]</p></li>
<li><p>Put-That-There [1982]</p><p>Multimodal input (speech, pointing gestures, gaze)</p><p>Eye gaze used for disambiguation (together with pointing)</p><p>Demo: http://www.poetv.com/video.php?vid=44316</p></li>
<li><p>Gaze-Aware Applications</p><p>Command-and-Control applications</p><p>- typing (conventional)</p><p>- typing (using gaze gestures)</p><p>- drawing</p><p>- other</p><p>Multimodal Applications</p><p>Gaze-Contingent Displays</p><p>Attentive Interfaces</p></li>
<li><p>iDict [2004]</p><p>Automatically detects irregularities in the reading process</p><p>Consults the embedded dictionaries and provides assistance</p></li>
<li><p>Attentive Videoconferencing [1999]</p><p>Multiparty teleconferencing and document sharing system</p><p>Images rotate to show gaze direction (who is talking to whom)</p><p>Document lightspot ("look at this reference")</p></li>
<li><p>PONG: The Attentive Robot [2001]</p><p>A robot that understands and reacts to human presence and visual communication messages</p><p>Detects when a human walks sufficiently close, then greets the person verbally and visually by displaying a smile</p><p>Tries to mimic the user's facial expressions</p></li>
<li><p>Attention Sensors: eyePLIANCES</p><p>Eye aRe glasses; eyeContact sensor; light fixture with eyeContact sensor</p></li>
<li><p>Time for a demo: EyeChess</p></li>
</ul>
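The selection loop described in the "Typing by Gaze" slide (focus on a letter, get feedback, select) is usually implemented with a dwell-time threshold, which is also the standard mitigation for the Midas Touch problem: a glance alone triggers nothing, only a sufficiently long fixation does. A minimal sketch, assuming a simplified stream of pre-resolved (timestamp, target) gaze samples; the class and threshold value are illustrative, not from the slides:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DwellSelector:
    """Selects a gaze target only after a continuous dwell (hypothetical sketch)."""
    threshold_ms: float                      # dwell time required to select
    _target: Optional[str] = None            # target currently under the gaze
    _dwell_start: Optional[float] = None     # when the gaze settled on it

    def update(self, target: Optional[str], timestamp_ms: float) -> Optional[str]:
        """Feed one gaze sample; return the target id once dwell completes."""
        if target != self._target:
            # Gaze moved to a different target (or off all targets): restart timer.
            self._target = target
            self._dwell_start = timestamp_ms if target is not None else None
            return None
        if target is not None and timestamp_ms - self._dwell_start >= self.threshold_ms:
            self._dwell_start = timestamp_ms  # reset so the key does not auto-repeat
            return target
        return None

selector = DwellSelector(threshold_ms=500)  # illustrative dwell threshold
typed = []
# Simulated samples: 200 ms on "A" (too short to select), then 600 ms on "B".
for t, key in [(0, "A"), (100, "A"), (200, "B"), (500, "B"), (800, "B")]:
    hit = selector.update(key, t)
    if hit:
        typed.append(hit)
print(typed)  # only the long fixation on "B" produces a keystroke
```

Resetting the timer after a selection, rather than keeping it running, is what prevents a held gaze from typing the same letter repeatedly.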
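The two MAGIC pointing strategies from the "Gaze as Mouse Accelerator" slide differ only in when the cursor is warped to the gaze point. A hedged sketch of that single decision; the function names and the activation threshold are assumptions for illustration, not the original implementation:

```python
Point = tuple[int, int]

def warp_liberal(cursor: Point, gaze: Point) -> Point:
    """Liberal strategy: warp the cursor whenever the point of gaze moves."""
    return gaze

def warp_cautious(cursor: Point, gaze: Point,
                  mouse_moved_px: float, activation_px: float = 2.0) -> Point:
    """Cautious strategy: warp only after a small mouse movement signals
    an intent to point; until then the cursor stays where it is."""
    if mouse_moved_px >= activation_px:
        return gaze
    return cursor

cursor, gaze = (100, 100), (640, 350)
print(warp_liberal(cursor, gaze))                     # jumps straight to the gaze point
print(warp_cautious(cursor, gaze, mouse_moved_px=0))  # no mouse motion: stays put
print(warp_cautious(cursor, gaze, mouse_moved_px=5))  # small nudge: warps to gaze
```

The slide's empirical result maps onto this directly: the cautious strategy's extra mouse nudge adds a small delay, while the liberal strategy's immediate warp saved time over the plain mouse.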