
Ann. N.Y. Acad. Sci. 1039: 575–579 (2005). © 2005 New York Academy of Sciences. doi: 10.1196/annals.1325.071

Video-Based Eye Tracking

Our Experience with Advanced Stimuli Design for Eye Tracking Software

A. RUFA,a G.L. MARIOTTINI,b D. PRATTICHIZZO,b D. ALESSANDRINI,b A. VICINO,b AND A. FEDERICOa

aDepartment of Neurological and Behavioral Sciences, Medical School, University of Siena, 53100 Siena, Italy

bDepartment of Information Engineering, Robotics and Systems Lab, University of Siena, 53100 Siena, Italy

ABSTRACT: We present an independent, flexible, and easily programmable software program for generating a wide set of visual stimulus paradigms in eye-movement studies. The software, called ASTIDET (Advanced Stimuli Design for Eye Tracking), has been interfaced in real time with a high-speed video-based eye-tracking system in order to obtain reliable measurement of saccades. Two saccadic paradigms (gap and memory-guided tasks) were tested in 10 normal subjects. The preliminary results confirm that ASTIDET is user-friendly software and can be interfaced with a video-based eye-tracking device to obtain reliable measurement of saccades.

KEYWORDS: video-based eye tracking; saccades; analysis; software

PURPOSE

The aim of this study was to develop an independent, flexible, and easily programmable software program for generating visual stimuli for eye-movement studies.1 The result, ASTIDET (Advanced Stimuli Design for Eye Tracking), is an easy-to-use program for stimulus generation, real-time data acquisition, and analysis in video-based eye-tracking applications.

PROGRAM DESCRIPTION

ASTIDET allows quick design of two-dimensional video sequences to be used as stimuli for a wide spectrum of eye-tracking paradigms. ASTIDET was developed in Visual C++ in order to benefit from its flexibility and also to allow possible future integration with advanced graphical libraries (such as VTK or OpenGL) and virtual reality applications.

Address for correspondence: Antonio Federico, M.D., Dept. of Neurological and Behavioral Sciences, Medical School, University of Siena, Viale Bracci 2, 53100 Siena, Italy. Voice: +39-0577-585763; fax: +39-0577-40327.

[email protected]

ASTIDET works on a PC monitor (the “editor monitor”), allowing interactive design of a visual stimulus, which is then presented on a second monitor (the “scene monitor”) in front of which the subject is seated. The editor interface allows design of a static version of the dynamic stimulus and specification of the motion (along straight lines) and timing parameters of the visual targets (colored dots) that make up the visual stimulus. Additionally, the program allows integration of multimedia stimuli (e.g., MPEG, MPG, and AVI movies) that can be loaded onto the scene monitor for specific purposes. As shown in the functional scheme (FIG. 1), ASTIDET works together with a video-based eye-tracking system in which the remote infrared pan–tilt camera (ASL model 504 multispeed), running at up to 240 Hz, tracks the eye. The infrared camera works by capturing video images of the pupil and corneal reflection of the subject’s eye. These images (video frames) are processed in real time by the controller module that is provided with the eye-tracking system. The controller defines the line of gaze by extrapolating the x and y coordinates relative to the screen being viewed. ASTIDET can read eye data in real time via a standard RS232 connection with the controller; thus, it is able to process online data of the subject’s gaze and pupil size. Acquired data are prefiltered for blinks by the ASTIDET program, and then filtered again (Gaussian filtering) for blinks and other noise components of the signal using ILAB for Matlab,2 which currently includes functions for detection and analysis of saccades.
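As an illustration of the blink-prefiltering step, the following sketch marks samples with a lost pupil signal as blinks and discards a short window around each loss. The sample layout, the pupil threshold, and the padding window are assumptions made for this example; they are not the actual ASTIDET implementation or the ASL data format. Gaussian smoothing and saccade analysis are then left to ILAB, as described above.

```cpp
#include <cstddef>
#include <vector>

// One gaze sample as it might arrive from the eye-tracker controller.
// The field layout is an assumption for this sketch, not the ASL data format.
struct GazeSample {
    double t;            // time (s)
    double x, y;         // gaze coordinates on the scene monitor (pixels)
    double pupil;        // pupil diameter; drops toward 0 during a blink
    bool   valid = true; // cleared for samples classified as blink
};

// Mark samples with a lost pupil signal as blinks and also invalidate a short
// window around each loss to remove eyelid-closing and -opening transients.
void prefilterBlinks(std::vector<GazeSample>& samples,
                     double pupilThreshold = 0.1,  // assumed units/threshold
                     std::size_t padSamples = 12)  // ~50 ms at 240 Hz
{
    const std::size_t n = samples.size();
    std::vector<bool> blink(n, false);
    for (std::size_t i = 0; i < n; ++i)
        blink[i] = samples[i].pupil < pupilThreshold;

    for (std::size_t i = 0; i < n; ++i) {
        if (!blink[i]) continue;
        const std::size_t lo = (i > padSamples) ? i - padSamples : 0;
        const std::size_t hi = (i + padSamples < n) ? i + padSamples : n - 1;
        for (std::size_t j = lo; j <= hi; ++j)
            samples[j].valid = false;
    }
}
```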

FIGURE 1. Eye-tracking functional scheme. ASTIDET acts as a stimulus generator for evoking eye movements and performs real-time acquisition of eye movements and blink removal.

The editing interface (FIG. 2) consists of a grid reference with a simple and user-friendly graphic interface: by clicking the left mouse button, the operator can set the points through which the animated sequences will move. This task can be performed easily using the grid reference system, allowing the supervisor to design the static sequence precisely. The spacing between grid lines can be selected by the operator, depending on the subject-to-monitor distance and the cm/pixel ratio for the screen. The value in degrees corresponding to the position of the mouse can be visualized on the grid interface. With the right mouse button the operator defines the parameters describing the motion of the target, that is, the velocity, color, and size (in pixels) of the moving dot. It is also possible to make the target disappear for a predefined time interval and then make it reappear in another part of the monitor. These properties allow the system to generate a wide spectrum of eye-tracking paradigms.3 In addition, the previously created static scene can be saved for later use with ASTIDET.
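The target properties set with the mouse map naturally onto a small data structure. The sketch below shows one possible representation of an animated sequence as an ordered list of straight-line segments; the field names, the RGB color encoding, and the degree-based coordinate convention are illustrative assumptions, not ASTIDET's internal format.

```cpp
#include <vector>

// A 2-D point on the scene monitor, expressed in degrees of visual angle
// relative to the screen centre (an assumed convention for this sketch).
struct PointDeg { double x, y; };

// One leg of an animated sequence: the target moves in a straight line from
// `from` to `to` at a constant velocity, with a given colour and size, and
// may be blanked for a predefined interval before reappearing.
struct TargetSegment {
    PointDeg from;
    PointDeg to;
    double   velocityDegPerSec;  // speed along the straight line
    unsigned colorRGB;           // e.g. 0xFF0000 for a red dot
    int      sizePixels;         // dot diameter on the scene monitor
    double   blankSeconds;       // 0 = target stays visible
};

// A complete stimulus is simply the ordered list of segments plus the grid
// spacing the operator chose in the editor.
struct StimulusSequence {
    double gridSpacingDeg;
    std::vector<TargetSegment> segments;
};
```

A gap trial, for example, would reduce to two segments: a fixation dot held at the centre and blanked for the gap interval, followed by a peripheral dot at the chosen eccentricity.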

METHODS

In order to evaluate the performance of ASTIDET, we generated specific saccadic paradigms and tested them on 10 subjects (all of whom gave informed consent) at the University of Siena. In the experimental setup, the subject was placed in a dark room with the eye at 72 cm from the scene monitor. The visual angle was 25°. To minimize head movement, the head of the subject was constrained using a chinrest. The spatial resolution of the eye-tracker (0.1 deg), its sampling rate (240 Hz), and the linear range (±30° horizontally, ±20° vertically) were sufficient for reliable measurement of saccades. Calibration of the tracker device was performed before each trial using the ASTIDET software by having the subject look at a sequence of nine targets on the scene monitor. Offsets and infrared camera parameters were adjusted iteratively by the experimenter until the line of gaze coincided with each of the nine points.
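The degree values shown on the editor grid, and the placement of calibration and stimulus targets, follow from simple viewing geometry. The sketch below converts between pixels and degrees of visual angle given the subject-to-monitor distance and the cm/pixel ratio; the function names are hypothetical and the formula is the standard visual-angle relation, not code documented for ASTIDET.

```cpp
#include <cmath>

const double kPi = 3.14159265358979323846;

// Horizontal offset from the screen centre, pixels -> degrees of visual
// angle, given the physical size of one pixel and the viewing distance.
double pixelsToDegrees(double offsetPixels, double cmPerPixel,
                       double viewingDistanceCm)
{
    const double offsetCm = offsetPixels * cmPerPixel;
    return std::atan(offsetCm / viewingDistanceCm) * 180.0 / kPi;
}

// Inverse mapping: place a calibration or stimulus target at a requested
// eccentricity (degrees) and obtain its pixel offset from the screen centre.
double degreesToPixels(double eccentricityDeg, double cmPerPixel,
                       double viewingDistanceCm)
{
    const double offsetCm =
        viewingDistanceCm * std::tan(eccentricityDeg * kPi / 180.0);
    return offsetCm / cmPerPixel;
}
```

With the 72 cm viewing distance used here, for instance, a target 12° from the centre sits about 72 × tan(12°) ≈ 15.3 cm from the centre of the scene monitor.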

FIGURE 2. Static sequences for a wide spectrum of eye-tracking paradigms can be generated in the editor interface by ASTIDET. By clicking the left mouse button, the operator can set the points from which the animated sequences will move.

Each subject was tested using two frequently used saccadic paradigms (gap task and memory-guided saccadic task) during two separate experimental sessions.4 The gap stimulus may elicit both regular saccades and express saccades (short-latency saccades, latency <100 ms), which occur when a novel stimulus is presented after the fixation point is turned off (gap stimulus). In this paradigm the peripheral target was presented at +4°, −6°, −8°, −10°, and −12°. Memory-guided saccades move the eyes toward the location at which a peripheral cue was previously presented for a few seconds. In this paradigm we used a central target as the fixation point, and the lateral cue was briefly presented, at random, at the same eccentric target positions used in the gap paradigm. After a delay, during which central fixation is maintained, the central fixation target is switched off and the subject moves the eyes to where the cue had been presented. The gap and memory-guided paradigms were easily generated using the ASTIDET software by following the experimental parameters reported by Pierrot-Deseilligny.5,6 In the gap paradigm we evaluated the peak velocity, duration, latency, and gain of each saccade, whereas in the memory-guided paradigm, we considered the amplitude, latency, error, and gain of each saccadic movement.
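The two paradigms differ only in the timing of fixation offset, cue presentation, and go signal, so both can be expressed as short event timelines. The sketch below builds such timelines; all durations other than the 50 ms cue flash are illustrative placeholders, the event names are not taken from ASTIDET, and each phase is simplified to a single task-relevant dot.

```cpp
#include <string>
#include <vector>

// One timed phase on the scene monitor: show or hide a dot at a given
// eccentricity (degrees; 0 = central fixation) for a given duration.
struct StimulusEvent {
    std::string name;
    double eccentricityDeg;
    double durationSec;
    bool   visible;
};

// Gap task: fixation is extinguished, the screen stays blank for the gap
// interval, then the peripheral target appears.
std::vector<StimulusEvent> makeGapTrial(double targetDeg,
                                        double fixationSec = 1.5,  // assumed
                                        double gapSec      = 0.2,  // assumed
                                        double targetSec   = 1.0)  // assumed
{
    return {
        {"fixation", 0.0,       fixationSec, true },
        {"gap",      0.0,       gapSec,      false},
        {"target",   targetDeg, targetSec,   true }
    };
}

// Memory-guided task: while fixation is held, the lateral cue flashes briefly
// (50 ms); after a delay the fixation dot is switched off (go signal) and the
// subject makes a saccade to the remembered cue location.
std::vector<StimulusEvent> makeMemoryGuidedTrial(double cueDeg,
                                                 double fixationSec = 1.5,  // assumed
                                                 double cueSec      = 0.05, // 50 ms flash
                                                 double delaySec    = 2.0,  // assumed
                                                 double responseSec = 1.5)  // assumed
{
    return {
        {"fixation",   0.0,    fixationSec, true },
        {"cue",        cueDeg, cueSec,      true },
        {"delay",      0.0,    delaySec,    true },
        {"go/respond", 0.0,    responseSec, false}
    };
}
```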

RESULTS

Results are reported in TABLE 1 for the gap-paradigm parameters. Saccades were detected using a velocity threshold criterion of 30 deg/s. The mean values of duration, peak velocity, latency, and gain were comparable with those reported in the literature.7 In this experiment we did not consider express saccades. FIGURE 3 shows the response to the memory-guided task for one trial in one subject. In this task we considered saccadic error, latency, and gain of the saccadic eye movement. The results in the bottom right corner of FIGURE 3 compare well with those reported in the literature for normal subjects.

FIGURE 3. Memory-guided task and results. While the subject looks at a central red dot, a second visual target is briefly (50 ms) presented laterally. After a period during which fixation is maintained, the central dot is turned off (go signal), and the subject’s eyes move to where the cue had been presented.
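The published analysis was performed with ILAB; purely as an illustration, the sketch below shows how a 30 deg/s velocity-threshold detector and the derived latency and gain measures might be implemented for the horizontal component. The data structures and helper names are assumptions for this example.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

struct GazePoint { double t; double xDeg; };  // horizontal gaze, degrees

struct Saccade {
    double onsetSec = 0.0, offsetSec = 0.0;
    double peakVelocityDegPerSec = 0.0;
    double amplitudeDeg = 0.0;
};

// Detect horizontal saccades as runs of consecutive samples whose absolute
// point-to-point velocity exceeds the threshold (30 deg/s in this study).
std::vector<Saccade> detectSaccades(const std::vector<GazePoint>& g,
                                    double thresholdDegPerSec = 30.0)
{
    std::vector<Saccade> out;
    std::size_t i = 1;
    while (i < g.size()) {
        double v = (g[i].xDeg - g[i - 1].xDeg) / (g[i].t - g[i - 1].t);
        if (std::fabs(v) <= thresholdDegPerSec) { ++i; continue; }

        Saccade s;
        const std::size_t start = i - 1;  // last sample before the crossing
        s.onsetSec = g[start].t;
        while (i < g.size()) {
            v = (g[i].xDeg - g[i - 1].xDeg) / (g[i].t - g[i - 1].t);
            if (std::fabs(v) <= thresholdDegPerSec) break;
            s.peakVelocityDegPerSec =
                std::max(s.peakVelocityDegPerSec, std::fabs(v));
            s.offsetSec = g[i].t;
            ++i;
        }
        s.amplitudeDeg = std::fabs(g[i - 1].xDeg - g[start].xDeg);
        out.push_back(s);
    }
    return out;
}

// Derived measures used in the paper: latency is saccade onset relative to
// target onset; gain is saccade amplitude divided by target eccentricity.
double latencyMs(const Saccade& s, double targetOnsetSec)
{ return (s.onsetSec - targetOnsetSec) * 1000.0; }

double gain(const Saccade& s, double targetEccentricityDeg)
{ return s.amplitudeDeg / std::fabs(targetEccentricityDeg); }
```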


CONCLUSION

The preliminary results presented here confirm that ASTIDET can be interfaced with a video-based eye-tracking device in order to obtain reliable measurement of saccades. The system is very easy to use and calibration is fairly accurate. We are currently working on implementing real-time digital filtering of data in the ASTIDET environment. We are also extending the software to interface it with a transcranial magnetic stimulator (TMS) and functional MRI.

REFERENCES

1. DUCHOWSKI, A.T. 2002. A breadth-first survey of eye-tracking applications. Behav. Res. Methods Instrum. Comput. 34: 455–470.

2. GITELMAN, D.R. 2002. ILAB: a program for postexperimental eye movement analysis. Behav. Res. Methods Instrum. Comput. 34: 605–612.

3. LEIGH, R.J. & D.S. ZEE. 1999. The Neurology of Eye Movements, 3rd ed. Oxford University Press. New York.

4. LEIGH, R.J. & C. KENNARD. 2004. Using saccades as a research tool in the clinical neurosciences. Brain 127: 460–477.

5. PIERROT-DESEILLIGNY, C.H., C.J. PLONER, R.M. MURI, et al. 2002. Effects of cortical lesions on saccadic eye movements in humans. Ann. N.Y. Acad. Sci. 956: 216–229.

6. PIERROT-DESEILLIGNY, C.H., R.M. MURI, C.J. PLONER, et al. 2003. Decisional role of the dorsolateral prefrontal cortex in ocular motor behaviour. Brain 126: 1460–1473.

7. BECKER, W. 1989. Metrics. In The Neurobiology of Saccadic Eye Movements. R.H. Wurtz & M.E. Goldberg, Eds.: 13–67. Elsevier. Amsterdam.

TABLE 1. Saccade analysis (mean values ± SD) for different amplitudes in terms of duration, latency, peak velocity, and gain

Amplitude (deg)    n     Duration (ms)     Peak velocity (deg/s)   Latency (ms)     Gain
4                  9     21.405 ± 3.611    84.22 ± 11.818          232.6 ± 1.505    1.107 ± 0.002
6                  9     22.043 ± 3.354    104.37 ± 17.668         199.5 ± 1.980    1.207 ± 0.002
8                  10    30.872 ± 4.352    127.22 ± 18.604         186.0 ± 2.107    1.207 ± 0.002
10                 10    44.474 ± 2.861    195.92 ± 11.645         188.5 ± 0.011    1.200 ± 0.002
12                 15    54.225 ± 6.853    270.92 ± 19.898         187.0 ± 3.360    1.107 ± 0.002