
Behavior Research Methods, Instruments, & Computers
2004, 36 (4), 757-770

Tracking the line of primary gaze in a walking simulator: Modeling and calibration

JAMES BARABAS, ROBERT B. GOLDSTEIN, HENRY APFELBAUM, RUSSELL L. WOODS, ROBERT G. GIORGI, and ELI PELI
Schepens Eye Research Institute, Harvard Medical School, Boston, Massachusetts

This article describes a system for tracking the line of primary gaze (LoPG) of participants as they view a large projection screen. Using a magnetic head tracker and a tracking algorithm, we find the on-screen location at which a participant is pointing a head-mounted crosshair. The algorithm presented for tracking the LoPG uses a polynomial function to correct for distortion in magnetic tracker readings, a geometric model for computing LoPG from corrected tracker measurements, and a method for finding the intersection of the LoPG with the screen. Calibration techniques for the above methods are presented. The results of two experiments validating the algorithm and calibration methods are also reported. Experiments showed an improvement in accuracy of LoPG tracking provided by each of the two presented calibration steps, yielding errors in LoPG measurements of less than 2º over a wide range of head positions. Source code for the described algorithms can be downloaded from the Psychonomic Society Web archive, http://www.psychonomic.org/archive/.

This work was supported in part by National Institutes of Health Grant EY12890. Correspondence concerning this article should be addressed to E. Peli, Schepens Eye Research Institute, 20 Staniford St., Boston, MA 02114 (e-mail: eli@vision.eri.harvard.edu). Copyright 2004 Psychonomic Society, Inc.

Commercial eye tracker manufacturers, such as SR Research (Osgoode, ON, Canada), ISCAN (Burlington, MA), and Applied Science Laboratories (ASL; Bedford, MA), market systems for monitoring point of regard on a display surface (combining head and eye tracking), but the manufacturers of these systems provide the devices as black-box tools. This makes it difficult for researchers to modify these trackers for special constraints, such as the large display and wide range of head movements needed for our walking simulator (Figure 1). Our simulator takes the form of a projected virtual environment (Southard, 1995) in which a participant views computer-generated images displayed on a large screen. Here, we describe a technique for addressing an important subproblem of gaze tracking: tracking the line of primary gaze (LoPG, as will be defined below). This method allows participants to use head movement to "point" at locations on our simulator screen. The techniques presented here could be extended with the addition of a head-mounted eye tracker to allow for head-movement-compensated gaze tracking (Barabas et al., 2003).

There exist a number of techniques for tracking eye movements (Young & Sheena, 1975), but relatively little has been written on calibration methods for these systems. Allison, Eizenman, and Cheung (1996) developed a combined head and eye-tracking apparatus for testing of the vestibular system, but they attributed errors to a lack of calibration. Another combined head and eye-tracking system was built for use in a projected virtual environment (Asai, Osawa, Takahashi, & Sugimoto, 2000), but few details of calibration methodology were provided. Gaze-tracking systems that combine head and eye movements have also been developed for head-mounted displays. Duchowski et al. (2002) described a virtual reality system based on a head-mounted display that incorporates a gaze-tracking system. Detailed descriptions of geometry and calibration techniques were provided, but the authors addressed a sufficiently different problem that their methods cannot be directly applied to projected virtual environments.

Toward developing a gaze-tracking system for our walking simulator, we separated the task into two parts: tracking the movement of the participant's head and tracking the rotation of the eye in its socket. In this article, we will explore the first of these two parts.

Techniques for tracking the head are useful for tasks in which a participant "points" at objects by moving his or her head. Such systems are used in tactical aircraft: With the aid of a crosshair projected on a helmet-mounted display, pilots can aim weaponry by turning their heads, aligning the crosshair with a target (King, 1995). We developed a similar system to allow participants in our walking simulator to "point" at objects on a screen. By equipping our participants with a head-mounted sight (see Figure 2) and using a magnetic head tracker and a tracking algorithm (described in the Method section), we are able to find the point on the simulator's screen that is aligned with a mark at the center of the sight. For experiments, we align this sighting mark to appear "straight ahead" of the participant's left eye. This is the primary position of gaze (Leigh & Zee, 1983).


When the participant foveates a target on the projection screen through the sighting mark, this locates the line of sight. The center of rotation of the eye lies approximately along the line of sight, so that the fixation axis (the line between the center of rotation and the fixated target) is a close approximation of the line of sight (Atchison & Smith, 2000). When the sighting mark is placed so that the participant's eye is in the primary position of gaze while aligning the sight and the target, this is the LoPG. Thus, in our system, the LoPG connects the center of rotation of the participant's left eye and a point of regard.

Several calibration challenges must be overcome for accurate tracking of the LoPG. Our walking simulator uses a magnetic tracker (Blood, 1990) to measure the location and orientation of the participant's head. This type of device, also used in biomechanics (Day, Dumas, & Murdoch, 1998), virtual reality (Livingston & State, 1997), and eye-tracking research (Caldwell, Berbaum, & Borah, 2000), is subject to environmental magnetic distortions that worsen with the distance between the magnetic transmitter and the magnetic sensor. Compensation for this distortion is required if measurements are to be taken near the limits of the tracker's range or if the tracker is to be used in close proximity to large metallic objects (Nixon, McCallum, Fright, & Price, 1998).

Several techniques for compensating for this kind of distortion have been proposed, and a summary of these methods has been compiled by Kindratenko (2000). Most of the techniques reviewed by Kindratenko were developed for virtual reality (allowing for presentation of motion parallax effects as viewers move their heads) but are applicable to tracking the LoPG. These distortion compensation techniques generally require custom equipment and are not included in popular commercial tracking systems. For example, in a study in which the eye movements of radiologists were investigated, Caldwell et al. (2000) used an ASL eye-tracking system, but they employed a modified version of the ASL EYEPOS software and a custom calibration fixture to correct for magnetic distortion. This method allowed for more accurate gaze tracking near a metallic device, but few details of the software modifications were published.

In addition to compensating for magnetic distortion, one must also find the alignment between the physical components of the tracking system. The distances between these components and their relative orientations become parameters in the LoPG-tracking model. Although direct measurement of these parameters is possible, small errors in these measurements can lead to large errors in computed LoPG. Developers of augmented reality systems (where visual information is overlaid on an observer's view using a see-through head-mounted display) have explored geometric models and calibration techniques like those used in eye tracking. Calibration is required for aligning overlaid images with real-world objects. Janin, Mizell, and Caudell (1993) proposed calibrating the location of information presented in a head-mounted display by aligning targets projected in the display with markers on a real-world workbench in a series of intentional gazes. We use a similar approach for LoPG calibration.

Our system allows for accurate (mean error of less than 2º) tracking of LoPG over a wide range of head locations and orientations. We compute the LoPG using a model of the geometry of gazes with a few simplifying assumptions. Doing so allows model parameters in the LoPG-tracking algorithm to correspond to actual distances and angles between parts of the tracking system, making direct verification possible. The general approach of this method can also be extended to incorporate additional or alternate tracking systems.

Figure 1. Experimental configuration of the walking simulator, showing the locations of the walking platform, the wide projection screen, and the magnetic tracker transmitter.

Figure 2. Head-mounted sight. The participant aligned objects of interest with a sighting mark. When viewing objects through the mark, the left eye was held in the primary position of gaze.


In this article, we first will describe our algorithms for compensating for magnetic distortion and for finding the LoPG from compensated tracker measurements. Then we will describe our calibration methodology for finding parameters for the above procedures. Finally, we will describe and present the results of two experiments used to validate our method.

METHOD

Apparatus
The LoPG tracking system we used to validate our tracking algorithm and calibration techniques was built as a part of our walking simulator. It consisted of the following.

Projection screen. A 67 × 50 in. rear projection screen was used for presentation of calibration targets. The bottom of the screen was elevated 30 in. above our walking platform (see Figure 1) to place its center near eye level for walking participants.

Magnetic tracker. An Ascension Flock of Birds magnetic tracker with an extended range transmitter (Ascension Technology Corporation, Burlington, VT) was used to monitor head position. This system reports the location and orientation of a small magnetic sensor in relation to a transmitter. Each measurement consists of six variables: three representing location in three dimensions, and three Euler angles representing orientation. The tracker transmitter was mounted on a 48-in.-tall wood stand and was placed 52 in. to the right of and 36 in. in front of the projection screen center. The tracker sensor was mounted on adjustable headgear worn by the participant (Figure 2). The headgear also carried the head-mounted sight.

Head-mounted sight. A head-mounted sight was used to allow the participants to consistently direct their LoPG at projected screen targets. The sight was a 1/8-in. hollow square mark on a clear plastic panel, which was in turn mounted on a 10-in. lightweight brass boom extending from the headgear.

Computer. A Pentium computer running Microsoft Windows 2000, Matlab 6.5 (The MathWorks, Inc., Natick, MA), and custom software was used to generate screen images of calibration targets, to control and record data from the magnetic tracker, and to perform calibration and gaze computations. Calibration software was written using Matlab and is available for download from the Psychonomic Society Web archive.

LoPG Tracking Algorithm
We developed an algorithm to transform raw magnetic tracker measurements into experimental variables of interest for behavior research. The algorithm consists of three steps: distortion compensation, geometric transformation, and computation of the point of gaze. Distortion compensation consists of correcting raw magnetic tracker readings to compensate for the spatial distortions that are characteristic of these devices. In this step, we attempt to compute the true location and orientation of the magnetic sensor from its reported location and orientation. In the second step (geometric transformation), distortion-compensated measurements are used to find the LoPG relative to the projection screen. Finally, the LoPG is used to compute the point on the screen aligned with the sighting mark. We call this on-screen location the point of gaze. The first and second steps of this algorithm rely on parameters that are obtained through a pair of calibration processes described below in the Calibration section.

Distortion compensation. Examination of readings from our magnetic tracker confirmed other research (Bryson, 1992) showing that distortion in both the location and the orientation components of measurements varied dramatically depending on the location of the magnetic sensor. To account for these distortions, we use a compensation function that adds a correction factor to measurements. This correction depends on the location portion of magnetic sensor readings. Specifically, to compensate for distortion, the translational components of each raw measurement from the magnetic tracker (x_o, y_o, and z_o, representing the location in three dimensions of the magnetic sensor relative to the transmitter) are transformed with a polynomial distortion correction function (Kindratenko, 1999). The correction consists of three degree-n polynomials in three variables. Each of these three polynomials contains terms consisting of the product of x_o, y_o, and z_o raised to powers in all combinations where the sum of the exponents is n or fewer:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} x_o \\ y_o \\ z_o \end{bmatrix} + \sum_{i=0}^{n} \sum_{j=0}^{i} \sum_{k=0}^{i-j} \begin{bmatrix} a_{ijk} \\ b_{ijk} \\ c_{ijk} \end{bmatrix} x_o^{\,j}\, y_o^{\,k}\, z_o^{\,i-j-k}, \qquad (1)$$

where a_ijk, b_ijk, and c_ijk are t = (n+1)(n+2)(n+3)/6 polynomial coefficients (obtained in the calibration process), and x_c, y_c, and z_c are the coordinates of the corrected measurement. Orientation of the sensor, reported by the magnetic tracker as Euler angles θ_o, φ_o, and ψ_o, is corrected using a similar set of polynomials:

$$\begin{bmatrix} \theta_c \\ \phi_c \\ \psi_c \end{bmatrix} = \begin{bmatrix} \theta_o \\ \phi_o \\ \psi_o \end{bmatrix} + \sum_{i=0}^{n} \sum_{j=0}^{i} \sum_{k=0}^{i-j} \begin{bmatrix} d_{ijk} \\ e_{ijk} \\ f_{ijk} \end{bmatrix} x_o^{\,j}\, y_o^{\,k}\, z_o^{\,i-j-k}, \qquad (2)$$

where d_ijk, e_ijk, and f_ijk are t polynomial coefficients, and θ_c, φ_c, and ψ_c are the corrected orientation angles. Correcting for distortion in this way assumes that the distortion in orientation measurements depends only on the location of the magnetic sensor and not at all on its orientation. Although distortion in orientation measurements does indeed vary with sensor orientation (Livingston & State, 1997), this form of distortion compensation is still effective if the sensor remains in the same relative orientation in both calibration and experimental use.


Polynomial coefficients in Equations 1 and 2 correspond to the shape of the specific distortion in a given experimental configuration and can be computed using the calibration procedure described below. In agreement with the findings of Kindratenko (2000), we found that a polynomial of degree n = 4 modeled the shape of the distortion well. A fourth-degree polynomial required t = 35 coefficients for each tracker variable (210 in total).
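To make the correction concrete, a minimal Matlab sketch of Equation 1 is given below. It is not the authors' released code; the coefficient matrix A and the term ordering are our assumptions and simply mirror the triple sum above. Applying the same function to the raw Euler angles, with the d, e, f coefficients in place of A, gives Equation 2.

```matlab
% Sketch (not the authors' code) of Equation 1: apply the degree-n location
% correction to one raw reading. po is the raw location [x_o y_o z_o]; A is
% the t-by-3 coefficient matrix [a_ijk b_ijk c_ijk] found in calibration,
% with rows ordered as the loops below generate the terms.
function pc = correct_location(po, A, n)
    terms = zeros(1, size(A, 1));
    c = 0;
    for i = 0:n
        for j = 0:i
            for k = 0:(i - j)
                c = c + 1;
                terms(c) = po(1)^j * po(2)^k * po(3)^(i-j-k);
            end
        end
    end
    pc = po + terms * A;    % corrected location [x_c y_c z_c] (Equation 1)
end
```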

Geometric transformation. The second computational step of the LoPG tracking algorithm transforms the distortion-compensated magnetic tracker readings to locate the LoPG in relation to the screen. This transformation employs a geometric model of the spatial relationship between the eye and the magnetic sensor and of the relationship between the screen and the magnetic transmitter (Figure 3). The set of 12 model parameters computed in the geometric calibration procedure (described below) characterizes these two spatial relationships.

The model takes each of the following locations within the tracking environment to be the origin of a three-dimensional coordinate frame: the upper left corner of the screen (coordinate frame O), the measurement origin of the magnetic transmitter (B), the measurement center of the magnetic sensor mounted on the head (S), and the center of rotation of the eye (E). The x, y, and z axes of each coordinate frame are oriented as shown in Figure 3.

We represent the spatial relationships between pairs of these coordinate frames as 4 × 4 homogeneous transformation matrices. Each such matrix can describe the relative locations and orientations of two coordinate frames and allows measurements in one frame to be transformed to the other by matrix multiplication. For each such matrix,

$$t_{b \leftarrow a} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & x_{ba} \\ r_{21} & r_{22} & r_{23} & y_{ba} \\ r_{31} & r_{32} & r_{33} & z_{ba} \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad (3)$$

representing the transformation from some frame a to some frame b, the values x_ba, y_ba, and z_ba describe the three-dimensional location of the origin of a in the coordinate system of b. The r variables are the lengths of the projections of unit vectors along each of the three coordinate axes of frame a onto the axes of frame b. These projections form the rotational component of the transformation.

Figure 3. Geometric model for finding the line of primary gaze by combining (via matrix multiplication) the spatial relationship measured by the magnetic tracker (t_{B←S}, corrected for magnetic distortion) with those derived through geometric calibration (t_{O←B} and t_{S←E}). B is the tracker transmitter, S is the tracker sensor, E is the eye, and O is the projection screen.


In addition to the matrix representation of the relationship between two coordinate frames, it is often convenient to use a translation and Euler angle representation. In this way, a transformation can be described compactly with 6 variables instead of the 12 used in the matrix form. The three coordinates of translation and the three Euler angles that specify a transformation M_{b←a} = (x_ba, y_ba, z_ba, θ_ba, φ_ba, ψ_ba) can be converted to matrix form by a series of three rotations and a displacement t_{b←a}, as shown in Equation 4 (Kuipers, 1999):

$$t_{b \leftarrow a} =
\begin{bmatrix} 1 & 0 & 0 & x_{ba} \\ 0 & 1 & 0 & y_{ba} \\ 0 & 0 & 1 & z_{ba} \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\theta_{ba} & -\sin\theta_{ba} & 0 & 0 \\ \sin\theta_{ba} & \cos\theta_{ba} & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\phi_{ba} & 0 & \sin\phi_{ba} & 0 \\ 0 & 1 & 0 & 0 \\ -\sin\phi_{ba} & 0 & \cos\phi_{ba} & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & \cos\psi_{ba} & -\sin\psi_{ba} & 0 \\ 0 & \sin\psi_{ba} & \cos\psi_{ba} & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}. \qquad (4)$$
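A short Matlab sketch of Equation 4 follows. The rotation order (azimuth about z, elevation about y, then roll about x) is our assumption based on the Kuipers (1999) convention cited above, and the function name is ours, not the authors'.

```matlab
% Sketch of Equation 4: build the homogeneous matrix t_{b<-a} from a
% translation and three Euler angles (assumed z-y-x rotation order).
function T = transform_from_pose(x, y, z, theta, phi, psi)
    Rz = [cos(theta) -sin(theta) 0; sin(theta) cos(theta) 0; 0 0 1];  % azimuth
    Ry = [cos(phi) 0 sin(phi); 0 1 0; -sin(phi) 0 cos(phi)];          % elevation
    Rx = [1 0 0; 0 cos(psi) -sin(psi); 0 sin(psi) cos(psi)];          % roll
    T  = [Rz * Ry * Rx, [x; y; z]; 0 0 0 1];   % rotation block plus displacement
end
```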

Using these conversions, the (corrected) measurements provided by the magnetic tracker can be used to derive the other spatial relationships described in Figure 3. The computations of the various transformations are described next, followed by a description of the calibration process.

Since the LoPG lies along the x-axis of the eye frame E, to find the LoPG relative to the screen we find the location and orientation of this frame with respect to the screen frame O. To find this relationship, we compose the matrix representations of the spatial relationships measured by the magnetic tracker with matrices computed through calibration. The position of the magnetic transmitter with respect to the screen, t_{O←B} (the transformation from transmitter frame to screen frame), and that of the eye with respect to the sensor, t_{S←E}, remain fixed for the duration of an experiment. The transformation t_{B←S} is constructed, using Equation 4, from the six variables M_{B←S} = (x_c, y_c, z_c, θ_c, φ_c, ψ_c) resulting from the distortion compensation step (Equations 1 and 2). The coordinates x_c, y_c, and z_c represent the corrected three-dimensional displacement between the origins of B and S, while θ_c, φ_c, and ψ_c are the corrected Euler angles describing the orientation of S in the coordinate system of B.

Since composite transformations can be represented as the product of transformation matrices, we can find the location and orientation of the eye relative to the screen, t_{O←E}, as follows:

$$t_{O \leftarrow E} = t_{O \leftarrow B}\; t_{B \leftarrow S}\; t_{S \leftarrow E}. \qquad (5)$$

Once t_{O←E} is computed, we can find both the eye location with respect to the screen¹ (simply the last column of t_{O←E}) and the point of intersection between the LoPG and the screen.

Point of gaze computation. The point on the screen that intersects the participant's LoPG can be found from t_{O←E}, the result of the geometric transformation portion of the gaze-tracking algorithm. In our model, the LoPG falls along the x-axis of the eye coordinate frame E. We solve for a point that is on the x-axis of the eye frame and also falls in the x-y plane of the screen coordinate frame O, the surface of the screen. This intersection can be formulated as a linear system consisting of a point on the x-axis of frame E, (x_o, 0, 0), transformed by t_{O←E} and set equal to a point in the x-y plane of frame O, (Int_x, Int_y, 0):

$$t_{O \leftarrow E} \begin{bmatrix} x_o \\ 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} \mathit{Int}_x \\ \mathit{Int}_y \\ 0 \\ 1 \end{bmatrix}. \qquad (6)$$

Solving this system for Int_x and Int_y gives us the intersection between the LoPG and the plane of the screen, expressed in the screen coordinate frame.
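A compact Matlab sketch of this intersection is shown below; the function and variable names are ours, and t_{O←E} is assumed to have been composed as in Equation 5, for example point_of_gaze(tOB * tBS * tSE), with the individual matrices built by the transform_from_pose sketch above.

```matlab
% Sketch (not the authors' code) of Equations 5 and 6: given t_{O<-E},
% solve for the on-screen point of gaze [Int_x; Int_y].
function Int = point_of_gaze(tOE)
    e0 = tOE * [0; 0; 0; 1];          % eye location expressed in the screen frame O
    ex = tOE * [1; 0; 0; 0];          % direction of the LoPG (x-axis of eye frame E)
    xo = -e0(3) / ex(3);              % distance along the LoPG to the screen plane z = 0
    Int = e0(1:2) + xo * ex(1:2);     % intersection in screen coordinates (Equation 6)
end
```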

Calibration
Parameters used in the LoPG tracking algorithm are computed via two calibration steps. First, the distortion in magnetic tracker readings is measured, and coefficients for the compensation polynomials (Equations 1 and 2) are computed. This distortion compensation calibration process was carried out when our tracking system was installed. Computed coefficients can be reused, assuming that there have been no changes to the relative placement of the tracker transmitter and the distortion-causing (large metal) objects. The second, geometric calibration step computes the spatial relationship between the magnetic tracker sensor and the eye, along with the relationship between the tracker transmitter and the screen image. This second step is performed once for each participant before an experimental run, since an adjustment of the headgear or of the image projected on the screen alters the spatial relationships measured in this calibration.

Distortion compensation calibration. To compute coefficients a_ijk, b_ijk, c_ijk, d_ijk, e_ijk, and f_ijk of the distortion compensation polynomials (Equations 1 and 2), we compare a set of tracker readings taken over a grid of known points within the tracking volume to the hand-measured locations of those points. The differences between the measured and the reported locations form a set of vectors mapping points in reported space to actual physical locations (these difference vectors were small near the transmitter and became large, see Figure 5, near the outer third of the nominal volume of our tracker's range). We then solve for the set of polynomial coefficients that, when used in Equation 1, minimizes the sum of the squared lengths of these error vectors. An identical procedure is performed on orientation data, finding coefficients that minimize the sum of the squared errors in Euler angles reported by the tracker.



In our walking simulator environment, we constructed a system of pegboards and stands to provide a set of known locations for placement of the magnetic sensor during this procedure. The pegboard system, illustrated in Figure 4, was constructed from 24 × 48 in. sheets of predrilled 0.25-in.-thick pegboard mounted on a wooden frame. Screws used in the construction of the frame were replaced with glue, since we found them to be causing small local distortions in magnetic tracker readings. The sensor stands were constructed from sections of 4-in.-diameter PVC pipe with flat pegboard caps on each end. Each stand held the magnetic tracker sensor rigidly on one end cap and had pegs for attachment to the pegboard on the other. The sensor was positioned so that when the stands were engaged with the pegboard, the sensor's x-axis (see Figure 4) pointed toward the screen and its z-axis pointed toward the floor. In this orientation, the magnetic tracker reported azimuth, elevation, and roll of the sensor (θ_o, φ_o, and ψ_o) to all be near zero (these angles deviated from zero by a few degrees due to distortion). Several pipe lengths were used for measurements at varying heights (0, 6, 12, and 18 in.) above the pegboard plane. Other devices for placing a sensor at known locations have also been developed, including a fixture made entirely of Plexiglas (Caldwell et al., 2000) and an optical tracking system (Ikits, Brederson, Hansen, & Hollerbach, 2001).

The pegboard frame was placed in our tracking environment and was mechanically squared with the case of the magnetic transmitter. The location of the pegboard grid with respect to the magnetic transmitter was then measured.² Readings from the magnetic tracker were taken over both a sparse three-dimensional grid of 64 locations spanning 45 × 52 × 18 in. (see Figure 5) and a denser, smaller grid of 128 locations covering the central 32 × 40 × 18 in. of the same space. The magnetic tracker readings from both grids were combined to form the data set of reported locations and orientations. Using Matlab, we performed linear least-squares fits on the differences between the reported and pegboard-measured data sets to find the 210 coefficients for the distortion correction polynomials.³ An illustration of the magnitude of location distortion for our tracker is provided in Figure 5.
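The location fit can be sketched in Matlab as below. This is an illustration under our assumed term ordering (matching the correct_location sketch given earlier), not the authors' calibration code; the variables reported and actual stand for the two data sets described above. The same design matrix, fit against the differences in Euler angles, yields the orientation coefficients of Equation 2.

```matlab
% Sketch: solve for the Equation 1 coefficients by linear least squares.
% reported and actual are m-by-3 matrices of tracker-reported and
% pegboard-measured sensor locations; n is the polynomial degree (4 here).
function A = fit_location_correction(reported, actual, n)
    m = size(reported, 1);
    t = (n + 1) * (n + 2) * (n + 3) / 6;      % 35 terms when n = 4
    X = zeros(m, t);                          % design matrix of polynomial terms
    for r = 1:m
        c = 0;
        for i = 0:n
            for j = 0:i
                for k = 0:(i - j)
                    c = c + 1;
                    X(r, c) = reported(r,1)^j * reported(r,2)^k * reported(r,3)^(i-j-k);
                end
            end
        end
    end
    A = X \ (actual - reported);              % t-by-3 coefficients [a_ijk b_ijk c_ijk]
end
```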

Geometric calibration. The second calibration step computes the geometric model parameters describing the spatial relationships t_{O←B} and t_{S←E}. These parameters, corresponding to the alignment of the screen with the tracker transmitter (t_{O←B}) and the position of the magnetic tracker sensor relative to the eye (t_{S←E}), are difficult to measure directly, and t_{S←E} changes from participant to participant or whenever the headgear is adjusted. For these reasons, we developed a calibration procedure to quickly ascertain these model parameters.

Geometric calibration of the LoPG-tracking system is performed by taking magnetic tracker recordings while participants perform a series of constrained gazes. For each gaze, the participant, using his or her left eye, aligns the head-mounted sight with a target dot projected on the screen. The participant repeats this procedure from a series of standing locations within the desired usage space.

Figure 4. Pegboard apparatus for measuring distortion in magnetic tracker measurements. Two such pegboard panels were mounted on a wooden frame to calibrate a volume measuring 45 × 52 × 18 in. Stands of various heights were placed at different locations across the base pegboard to provide wide coverage of the volume.


After a number of such gazes, a pair of t_{O←B} and t_{S←E} parameter sets is computed by the Gauss-Newton nonlinear least-squares fitting method, as will be described in the next section.

Geometric calibration algorithm. The fitting procedure seeks the set of model parameters that, when combined with the geometric model in our LoPG-tracking algorithm, best predict on-screen points of gaze from distortion-compensated magnetic tracker readings taken over a set of directed gazes. Our model computes Int, the two-dimensional on-screen point of gaze, expressed as

$$\mathit{Int} = F(P,\, M_{B \leftarrow S}), \qquad (7)$$

where F is the function that computes point of gaze from a given (corrected) magnetic tracker reading and a set of model parameters (Equations 5 and 6), P is the set of 12 model parameters characterizing, via Equation 4, the spatial relationships t_{O←B} and t_{S←E} (the 12 parameters describe, for each of the two transformations, a three-dimensional displacement between coordinate frames and three Euler angles describing the relative orientations of the coordinate frames), and M_{B←S} is the magnetic tracker measurement (three distances and three angles) corrected for distortion by Equations 1 and 2. We assess the quality of the fit found by the geometric calibration procedure by the size of the average error across all gazes in a calibration. This mean gaze point error is expressed as the sum, across all calibration gazes, of the distances along the screen between the actual calibration targets and the model-predicted points of gaze. The fitting procedure seeks model parameters P that minimize this sum, given a set of magnetic tracker measurements M_{B←S}, a set of corresponding screen intersections Int, and an initial "guess" for P. Starting with the given guess for P, the fitting procedure iteratively adjusts these parameters until a local minimum in mean gaze point error is found.⁴ Minimization is performed in Matlab using the nlinfit function (from the Statistics Toolbox, an implementation of the Gauss-Newton fitting method). In our implementation, this technique finds a P corresponding to a given set of 20 measurements of Int and M_{B←S} in a few seconds on a 700-MHz PC.
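A Matlab sketch of this fit is given below. It reuses the transform_from_pose and point_of_gaze sketches from earlier; the function names and the packing of the 12 parameters into P are our assumptions, not the authors' interface, and predict_gaze would be saved as its own function file so that it can also be reused to evaluate a fit.

```matlab
% Sketch of the geometric calibration fit. M is an m-by-6 matrix of
% distortion-corrected tracker readings, targets an m-by-2 matrix of the
% matching on-screen target locations, and P0 the hand-measured initial guess.
function P = fit_geometry(M, targets, P0)
    P = nlinfit(M, targets(:), @predict_gaze, P0);     % Gauss-Newton least squares
end

% Model function: predicted points of gaze for all readings, stacked to
% match targets(:). P(1:6) and P(7:12) parameterize t_{O<-B} and t_{S<-E}.
function v = predict_gaze(P, M)
    tOB = transform_from_pose(P(1), P(2), P(3), P(4), P(5), P(6));
    tSE = transform_from_pose(P(7), P(8), P(9), P(10), P(11), P(12));
    Int = zeros(size(M, 1), 2);
    for r = 1:size(M, 1)
        tBS = transform_from_pose(M(r,1), M(r,2), M(r,3), M(r,4), M(r,5), M(r,6));
        Int(r, :) = point_of_gaze(tOB * tBS * tSE)';   % Equations 5 and 6
    end
    v = Int(:);
end
```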

Experimental Validation

Figure 5. Distortion measured in performing distortion compensation calibration of the magnetic tracker. Axis units are inches in the coordinate system of the magnetic tracker transmitter. Dark bars on axes indicate the ranges of physical locations sampled in the calibration procedure. Shade of surface indicates magnitude of distortion in location measurements from the magnetic tracker. The darkest areas indicate distortion in location measurement of over 13 in. In-figure annotations mark the actual shape of the measured space, the reported (distorted) shape of that space, the projection screen, and the magnetic tracker transmitter.


To test the calibration and LoPG-tracking techniques presented in this article, we first calibrated the distortion compensation system as described above and then performed two target-sighting experiments. In the first experiment, we performed the geometric calibration procedure using the beam of a laser pointer to represent the line of gaze of the human participant. Doing so allowed the simulated LoPG to be precisely aligned with calibration targets projected on the screen and allowed testing without the variability of human participants' ability to maintain head position. To fix the relative placement of the laser and sensor, both were attached to a small wooden block. The sensor was placed on the block with its x-axis approximately parallel to, and 2 in. above, the laser beam. This alignment enabled us to easily measure the actual spatial relationship between the sensor and the laser for validation of the parameters found by the geometric calibration procedure. (We found it much more difficult to measure the equivalent relationship, t_{S←E}, on a human participant.) The wood block holding the sensor and laser was fastened to an aluminum and plastic tripod raised to the approximate height of a human participant's head. To gather data for testing the geometric calibration procedure, the tripod was placed at 10 different locations. These locations were chosen within the range of possible standing positions of our study participants (an area about 36 × 36 in.). For each of the 10 tripod locations, readings from the magnetic tracker were taken, panning and tilting the tripod head so that the laser beam fell on each calibration target. Four of the targets were just inside the corners of the screen, one was placed in the center, and five other locations were arbitrarily chosen. The targets remained in the same locations throughout each experiment. Due to the distance between the sensor and the pivot point of the tripod head, for a given tripod location the actual magnetic sensor locations varied by up to 8 in. as the tripod head was tilted at different angles to align the laser.

In a second experiment, a similar set of 100 directed gazes was performed by a human participant. The participant used the head-mounted sight (Figure 2) to align his LoPG with each of the 10 targets from each of 10 standing locations. The 5 calibration targets that were arbitrarily placed in the first experiment were in different locations for calibration of the human participant.

Data from both experiments were independently used to calibrate the LoPG-tracking system, and the benefit of the distortion compensation and geometric calibration procedures was examined. For the distortion compensation step, we used polynomial coefficients computed from the 192 pegboard locations, as described above. In all cases, we used as an initial guess for the geometric calibration procedure a set of hand-measured parameters that approximated the actual geometry of the system.

RESULTS

For each of the two experiments, the collected data consisted of a set of 100 six-variable tracker readings (each reading taken when the laser/LoPG was aligned to fall at the center of an on-screen calibration target), as well as the 100 two-dimensional screen locations of the corresponding calibration targets (10 repetitions of 10 targets). Location and orientation data from the magnetic tracker were passed through the distortion compensation functions (Equations 1 and 2), yielding a second set of 100 corrected tracker readings for each experiment. We used subsets of this body of distortion-corrected data to derive the transformation matrices t_{O←B} and t_{S←E} using the geometric calibration procedure described above, and we tested these fits against the remaining data. We compared known screen target locations to point-of-gaze computations made with and without distortion compensation or geometric calibration. We also investigated the behavior of the error as a function of the number of included calibration points by varying the size of the calibration data sets used to compute t_{O←B} and t_{S←E}.

Accuracy of Predicted Point of Gaze
Using the distortion-corrected location and orientation measurements from the entire set of 100 laser-target alignments from the first experiment and the geometric calibration procedure, the t_{O←B} and t_{S←E} matrices were computed. Equations 5 and 6 were then used to predict the point of gaze for each of the 100 laser-target alignments, and mean gaze point error (the mean distance along the screen between the calibration target and the point of gaze found with the LoPG-tracking algorithm) was computed. For the data set from the first experiment, mean error was 1.50 in. (Figure 6D). Error can also be described as the angle between the screen target and the computed point of gaze when viewed from the location of the magnetic sensor. Measured in this way, mean angular error was 1.55º. For reasons described in the Discussion section, we will report the remainder of our mean error results in inches. At our screen-eye distances, 1º corresponds approximately to 1 in. of error.
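For context, this back-of-the-envelope check is ours, not the authors': the stated 1º-per-inch equivalence follows from the small-angle relation and implies an effective sensor-to-screen distance of roughly 57 in., a value inferred here rather than reported in the text.

$$s = d \tan\theta \approx d\,\theta_{\mathrm{rad}}, \qquad \theta = 1^{\circ} \approx 0.0175\ \mathrm{rad} \;\Rightarrow\; s \approx 1\ \mathrm{in.}\ \ \mathrm{when}\ \ d \approx 57\ \mathrm{in.}$$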

To examine the effect of distortion compensation on mean gaze point error, we also performed the geometric calibration procedure using raw magnetic tracker measurements not processed by the distortion correction polynomials. As seen in Figure 6B, without distortion compensation, mean gaze point error was significantly larger than with compensation (Wilcoxon signed ranks test, Z = 3.66, p < .001), at 1.93 in. when calibrated using data from the same experiment (an increase of 29%). Without distortion compensation, clear outliers from the clusters of points of gaze can be seen for most of the screen targets in Figure 6B. These outlying points of gaze were all from the tripod location most distant from the magnetic transmitter, where distortion was largest. Distortion compensation provided the most benefit for this tripod location.

To examine the effect of the geometric calibration procedure on the accuracy of predicted LoPG, we also computed mean gaze point error across the collected data set using only a hand-measured estimate of t_{O←B} and t_{S←E}. Distortion correction was included, but no geometric calibration was performed.


As one might expect, without calibration, errors were larger, and this difference was significant (Wilcoxon signed ranks test, Z = 8.09, p < .001). Mean gaze point error using an uncalibrated model was 3.96 in., as can be seen in Figure 6C.

We also examined the ability of the geometric model to compute points of gaze without both distortion compensation and geometric calibration. The hand-measured t_{O←B} and t_{S←E} and raw magnetic tracker readings were used in Equations 5 and 6. Mean gaze point error was 5.66 in. (Figure 6A). The addition of either geometric calibration or distortion compensation yielded a significant reduction in error (Wilcoxon signed ranks test, Z = 6.1, p < .001).

Conducting the same analysis for the data from the second experiment, we found mean gaze point error to be lower than that in the first experiment when geometric calibration was performed. When both distortion compensation and geometric calibration were used, mean error was 0.72 in. When geometric calibration was performed on data not corrected for distortion, mean error was 1.06 in. (Wilcoxon signed rank test, Z = 5.20, p < .001). Reasons for the reduced errors in the second experiment are proposed in the Discussion section. Without geometric calibration, error was much larger (Wilcoxon signed rank test, Z = 8.1, p < .001), since we were unable to measure t_{S←E} with the same accuracy for the second experiment. With and without distortion correction, mean errors were 3.98 and 4.26 in., respectively.

Effect of sensor location on error. Using the raw data (not compensated for distortion) from the first experiment and computing point of gaze without performing geometric calibration, the farther the sensor was from the transmitter, the worse the error (Spearman correlation, r = .46, p < .001), and that error was manifest as an increasing error in the y-dimension (Spearman correlation, r = .50, p < .001; Figure 7A). When both the distortion correction and the geometric calibration were implemented, the errors were more evenly distributed (Figure 7B), resulting in no significant correlation with distance from the transmitter (Spearman correlation, r = .04, p = .71). There was, however, a significant tendency for the errors to increase toward the screen (i.e., in the x-dimension; Spearman correlation, r = .22, p = .003).

Figure 6. Effects of distortion compensation (panels C and D) and geometric calibration (panels B and D) on computation of points of gaze. Data were collected with a tripod-mounted laser pointer used to simulate the line of primary gaze (LoPG). Circular marks indicate screen locations of calibration targets. Crosses are laser screen intersections (points of gaze) computed by the LoPG-tracking algorithm. Within each cluster, each cross represents a measurement from a different tripod location. Rectangular outline shows the location of the projection screen. Screen outline and targets appear at the hand-measured location (panels A and C) and at the location found by geometric calibration (panels B and D).


This may be because, for tripod locations closer to the screen, an angular error (due to distortion, for example) of the same size results in a larger error distance along the screen.

Short calibration. In order to use our LoPG-tracking system as a tool for gathering data from untrained study participants, it is advantageous to calibrate the system using as few data points as necessary (e.g., to reduce fatigue). To find the number of calibration data points needed to accurately predict point of gaze, we created calibration sets from subsets of the 100 measurements collected in each experiment. We used these small calibration sets to compute t_{O←B} and t_{S←E} and then used those computed parameters to calculate screen intersections across the full set of 100 measurements. We then compared these computed intersections with the actual locations of the on-screen targets. We performed this technique for calibration sets of varying sizes, across 100 random orderings of the 100 data points collected in each experiment. For the first experiment, we found that after about 20 points, the geometric calibration system converges to a point of gaze with a mean gaze point error of less than 2 in., with only small improvements attained by including additional points in the calibration set (Figure 8).

For the second experiment, overall gaze point error was lower when calibration was performed with the entire data set. For this data set, 2 in. of error could be reached with about 12 data points (Figure 9). Calibrations performed with more than 15 data points provided only a small additional benefit.
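The resampling analysis behind Figures 8 and 9 could be sketched as follows. This reuses the fit_geometry and predict_gaze sketches above, and the variables M, targets, and P0 are assumed to hold the 100 corrected readings, the target locations, and the hand-measured initial guess.

```matlab
% Sketch: mean gaze point error as a function of calibration set size,
% averaged over 100 random orderings of the 100 directed gazes.
sizes = 2:50; nOrders = 100; nGazes = size(M, 1);
meanErr = zeros(nOrders, numel(sizes));
for o = 1:nOrders
    order = randperm(nGazes);                          % one random ordering of the gazes
    for s = 1:numel(sizes)
        idx = order(1:sizes(s));                       % first sizes(s) gazes form the calibration set
        P = fit_geometry(M(idx, :), targets(idx, :), P0);
        pred = reshape(predict_gaze(P, M), nGazes, 2); % points of gaze for all 100 gazes
        meanErr(o, s) = mean(sqrt(sum((pred - targets).^2, 2)));
    end
end
curve = mean(meanErr, 1);                              % mean error vs. set size (cf. Figures 8 and 9)
```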

In the processing of calibration data from both experiments, geometric calibration occasionally failed completely, producing larger mean gaze point error than if geometric calibration had not been performed at all. These failures occurred with very small calibration sets, when distortion-related errors provided inconsistent constraints for the model. Failures also occurred when sets included targets from only one part of the screen, insufficiently constraining the model. Fitting failures never occurred with sets of 10 points or more and never occurred when distortion compensation was performed.

Figure 7. The mean gaze point error at each of the 10 tripod locations: on the left, using the raw data from the magnetic tracker collected in the first experiment (see Figure 6A); on the right, using the same data corrected with both the distortion compensation and the geometric calibration (Figure 6D). The tripod and laser pointer were directed toward the projection screen, which is shown as the tall white rectangle on the right of each panel. The tracker transmitter is shown as the black square. Each circle represents one of the tripod locations, with radius equal to the mean gaze point error from that location. Mean gaze point error for each location is also shown next to the corresponding circle.


In all the cases in which calibration failed, the addition of one or two calibration points to the set was enough to reverse the failure. In general, better fits were obtained from calibration sets with greater variety in sampled tracker orientations and screen locations.

DISCUSSION

The LoPG-tracking model and calibration techniques presented here succeed at accurately predicting the on-screen point of gaze while participants look through a head-mounted sight. The validation experiments show that, in agreement with Caldwell et al. (2000), the use of polynomial distortion compensation can improve the accuracy of gaze point calculations (Figures 6C and 6D). In addition, benefit was also seen from geometric calibration. The techniques presented for tracking LoPG provide a foundation for future gaze-tracking systems.

An interesting result of performing calibration both with and without distortion compensation was that, although point of gaze was predicted more accurately with distortion compensation, the tracking system still performed relatively well in the presence of uncompensated distortion, provided that geometric calibration was performed. In the presence of measurement errors caused by distortion, the minimization procedure gives model parameters that may not correspond well to the actual spatial relationships being modeled but that interact with the measurement distortion to produce reasonably accurate screen intersection predictions. If somewhat larger LoPG-tracking errors can be tolerated, one might be able to achieve reasonable results by omitting distortion compensation altogether. Omission of distortion compensation would likely result in much greater error, however, for head positions or orientations outside the range of calibration.

Possible sources of the gaze point errors remaining when both geometric calibration and distortion compensation were performed include measurement noise, inaccuracies in magnetic tracker measurements not corrected by the distortion compensation system, inaccuracy of the model parameters found by the geometric calibration system, or some combination of these. Tests of stability in the tracker measurements showed that readings from the tracker taken milliseconds or hours apart agreed; both differed from reference measurements by only a few hundredths of an inch and by a few tenths of a degree. This small measurement noise is clearly not enough to explain the gaze point errors, which are orders of magnitude greater.

Our method makes the simplifying assumption that the distortions in magnetic tracker orientation readings depend on the location of the magnetic sensor only, and not on the interaction between sensor location and orientation. Therefore, it is likely that our method did not completely correct for the magnetic distortion.

Figure 8. Error in computed point of gaze derived when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the first experiment (using a laser pointer). Mean gaze point error was reduced to within 2 in. (about 2º) after about 23 directed gazes. Plotted are the mean across all orderings and its 95% confidence interval; the x-axis is the number of data points in the calibration set, and the y-axis is the mean gaze point error across all 100 data points (inches).


When the magnetic sensor is kept in a similar orientation in both distortion compensation calibration and actual experimental use, this approach is more likely to produce good results. In the first experiment, our method provided an improvement over omitting orientation distortion compensation entirely, since sensor alignment was similar in calibration and experiment. It is likely, however, that distortion still plays a role in gaze point errors. In the second experiment, we found even lower (than that reported in the Results section) mean gaze point error when magnetic tracker measurements were corrected for location distortion but not for orientation distortion. As the magnetic tracker sensor was rotated on its side when attached to the headgear of the head-mounted sight (Figure 2), it was no longer in the relative orientation needed to see a benefit from our compensation for orientation distortion. Expanding the distortion compensation calibration procedure to measure the sensor at multiple orientations for each grid location sampled could further reduce error due to distortion. This change would increase both the number of data points required and the time needed to perform the initial data collection procedure, but it would not impact per-participant calibration demands.

The geometric calibration algorithm seeks a t_{O←B} and t_{S←E} yielding a local minimum in gaze point error, starting from the initial guess derived from hand measurements of these parameters. Control experiments were performed to verify that no other, potentially lower, minima could be found near the minima arrived at by calibration. Although the minimization procedure failed to converge for intentionally inconsistent initial guesses, tests conducted using parameters displaced from hand-measured values by several inches in all directions still all converged on the same LoPG. Thus, uncorrected distortion in magnetic tracker measurements was most likely the leading cause of gaze point error.

Comparison between points of gaze computed with (Figure 6D) and without (Figure 6C) the geometric calibration step shows that attempts to hand measure the spatial relationships t_{O←B} and t_{S←E} (i.e., without geometric calibration) lead to relatively large gaze point errors. Application of the geometric fitting procedure substantially improves the ability of the model to compute LoPG. Although it is not surprising that better results can be obtained when calibration is performed, it is interesting to note that the parameters resulting from calibration deviated considerably from hand-measured parameters. In fact, the optimized parameters comprising t_{S←E} indicated spatial relationships that were clearly inconsistent with the actual configuration of the laser and magnetic sensor. For example, parameters producing the least mean gaze point error contained values for the distance between the magnetic sensor and the LoPG that were several inches greater than physical measurements indicated.

Figure 9. Error in computed point of gaze derived when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the second experiment (with a human participant). Mean gaze point error was reduced to less than 2 in. (about 2º) in about 12 directed gazes. Plotted are the mean across all orderings and its 95% confidence interval; the x-axis is the number of data points in the calibration set, and the y-axis is the mean gaze point error across all 100 data points (inches).


Although there is some imprecision (we estimate about a quarter of an inch) in measuring the straight-line distance between the eye and the magnetic sensor, this imprecision alone cannot explain the difference. Since calibration performed with data from the human participant seemed less prone to this inconsistency, we suspect that the constraints imposed on the movement of the laser and sensor by the tripod head may be responsible. In positioning the laser to align it with calibration targets, the sensor remained within 47º of level, and "roll" of the sensor could not be performed. These constraints led to less variation in the calibration data and, in turn, may have underconstrained the fitting procedure used in geometric calibration.

A limitation of the geometric calibration procedure described here is its inability to locate the eye along the LoPG. Although this limitation does not impact the ability of the system to predict points of gaze on the screen, it does impact the veracity of the model parameters. Although the geometric model used specifies the eye as a coordinate frame with an origin located some distance from the magnetic tracker sensor, the geometric calibration procedure constrains only the x-axis of this coordinate frame. This results in an infinite number of possible sets of model parameters that still predict the same point of gaze on the screen (i.e., the screen intersection remains the same even if the eye were to slide forward or back along the LoPG, or to rotate about its x-axis). This means that the parameters making up t_{S←E} found by the fitting procedure correspond to one possible location of the eye along the LoPG. Control experiments showed that the fitted location of the eye along the LoPG depended on the initial guess used for the fitting procedure, even though these alternate initial guesses still yielded the same points of gaze.⁵ If the intermediate results of the model are to be used, for example, for computing the location of the eye relative to the screen for placement of a camera in a virtual environment, the inability of this system to precisely locate the eye might be overcome by performing an additional calibration. For example, a calibration similar to the one used in this article, but requiring the participant to place the eye in other, nonprimary positions of gaze, could be used to find the actual eye location. This technique is currently being investigated.

REFERENCES

Allison, R. S., Eizenman, M., & Cheung, B. S. K. (1996). Combined head and eye tracking system for dynamic testing of the vestibular system. IEEE Transactions on Biomedical Engineering, 43, 1073-1082.

Asai, K., Osawa, N., Takahashi, H., & Sugimoto, Y. Y. (2000, March). Eye mark pointer in immersive projection display. Paper presented at the IEEE Virtual Reality 2000 Conference, New Brunswick, NJ.

Atchison, D. A., & Smith, G. (2000). Optics of the human eye. Oxford: Butterworth-Heinemann.

Barabas, J., Giorgi, R. G., Goldstein, R. B., Apfelbaum, H., Woods, R. L., & Peli, E. (2003, May). Wide field 3D gaze tracking system. Paper presented at the Association for Research in Vision and Ophthalmology Meeting, Fort Lauderdale, FL.

Blood, E. (1990). U.S. Patent No. 4,945,305. Washington, DC: U.S. Patent & Trademark Office.

Bryson, S. (1992, February). Measurement and calibration of static distortion of position data from 3D trackers. Paper presented at SPIE Stereoscopic Displays & Applications III, San Jose.

Caldwell, R. T., Berbaum, K. S., & Borah, J. (2000). Correcting errors in eye-position data arising from the distortion of magnetic fields by display devices. Behavior Research Methods, Instruments, & Computers, 32, 572-578.

Day, J. S., Dumas, G. A., & Murdoch, D. J. (1998). Evaluation of a long-range transmitter for use with a magnetic tracking device in motion analysis. Journal of Biomechanics, 31, 957-961.

Duchowski, A., Medlin, E., Cournia, N., Murphy, H., Gramopadhye, A., Nair, S., Vorah, J., & Melloy, B. (2002). 3-D eye movement analysis. Behavior Research Methods, Instruments, & Computers, 34, 573-591.

Ikits, M., Brederson, J. D., Hansen, C. D., & Hollerbach, J. M. (2001, March). An improved calibration framework for electromagnetic tracking devices. Paper presented at Virtual Reality 2001, IEEE, Yokohama.

Janin, A., Mizell, D., & Caudell, T. (1993, September). Calibration of head-mounted displays for augmented reality applications. Paper presented at the Virtual Reality Annual International Symposium, Seattle.

Kindratenko, V. (1999). Calibration of electromagnetic tracking devices. Virtual Reality: Research, Development, & Applications, 4, 139-150.

Kindratenko, V. (2000). A survey of electromagnetic position tracker calibration techniques. Virtual Reality: Research, Development, & Applications, 5, 169-182.

King, P. (1995). Integration of helmet-mounted displays into tactical aircraft. Proceedings of the Society for Information Display, 26, 663-668.

Kuipers, J. B. (1999). Quaternions and rotation sequences: A primer with applications to orbits, aerospace, and virtual reality. Princeton, NJ: Princeton University Press.

Leigh, R. J., & Zee, D. S. (1983). The neurology of eye movements. Philadelphia: Davis.

Livingston, M. A., & State, A. (1997). Magnetic tracker calibration for improved augmented reality registration. Presence: Teleoperators & Virtual Environments, 6, 532-546.

Nixon, M. A., McCallum, B. C., Fright, W. R., & Price, N. B. (1998). The effects of metals and interfering fields on electromagnetic trackers. Presence, 7, 204-218.

Southard, D. A. (1995). Viewing model for virtual environment displays. Journal of Electronic Imaging, 4, 413-420.

Young, L. R., & Sheena, D. (1975). Survey of eye movement recording methods. Behavior Research Methods & Instrumentation, 7, 397-429.

NOTES

1 This position is used in projected virtual reality displays to deter-mine how to position the virtual camera when generating an image of ascene

2 Even if the tracker is not aligned exactly with the case of the trans-mitter the correction polynomials (Equations 1 and 2) we used includeterms to account for such an offset These offsets are contained in theconstant (i 0) and linear (i 1) terms of polynomials

3 Two linear systems were constructed from 192 six-variable mea-surements Each of the two systems contained 192 3 576 knownsand 105 unknowns

4 It should be noted that the calibration procedure described heredoes not completely constrain the model parameters used Since no mea-surement of orientation on the screen of the point of gaze or distancefrom eye to screen during a sighting is made neither of these can be pre-dicted by the model As a result the best-fitting model parameters de-scribing tSumlE may converge on any one of a line of solutions corre-sponding to possible locations of the eye along the LoPG Each of theseeye locations predicts the same LoPG location on the screen It shouldbe possible to compute correct eye position with a calibration proce-dure that includes eye-in-head rotation

5 For this reason we have chosen to report errors in inches along thescreen surface since angular measures of error are more sensitive to lo-cation of the eye along the LoPG Where angular error is reported weestimated the eyendashscreen distance to be the distance between the mag-netic sensor and the screen

770 BARABAS ET AL

ARCHIVED MATERIALS

The following materials associated with this article are retrievablefrom the Psychonomic Societyrsquos Norms Stimuli and Data archivehttpwwwpsychonomicorgarchive

To access the above files or links search the archive for this articleusing the journal (Behavior Research Methods Instruments amp Com-puters) the first authorrsquos name (Barabas) and the publication year(2004)

FILE Barabas-BRMIC-2004zipDESCRIPTION The compressed archive file contains seven files The

files areREADMEtxt containing a description of scripts and data included

in the archive (a 3K text file) This file can renamed to readmem andexecuted as a Matlab script to reproduce some of the results reported inthis manuscript

barabas2004LoPGmat a Matlab workspace file containing mag-netic tracker data collected in the verification experiments described inthis manuscript (a 32K binary file)

findcorrectioncoeffsm a Matlab function for finding the distortioncompensation coefficients for Equations 1 and 2 (a 5K Matlab M-file)

correctdistortionm a Matlab function for performing distortioncompensation on magnetic tracker data using Equations 1 and 2 (a 4KMatlab M-file)

findmodelparamsm a Matlab function for finding the parameters ofthe LoPG model from data collected during a series of directed gazes (a3K Matlab M-file)

intersecterrorm a Matlab function for finding gaze point error for aset of directed gazes (a 5K Matlab M-file)

trackereulerstomtx4m a Matlab function that computes a transfor-mation matrix representation of a relationship between coordinateframes via Equation 4 (a 3K Matlab M-file)

AUTHORrsquoS WEB SITE httpwwweriharvardedufacultypeliindexhtml

AUTHORrsquoS E-MAIL ADDRESS barabasmitedu

(Manuscript received July 18 2003revision accepted for publication July 23 2004)

758 BARABAS ET AL

the sighting mark, this locates the line of sight. The center of rotation of the eye lies approximately along the line of sight, so that the fixation axis (the line between the center of rotation and the fixated target) is a close approximation of the line of sight (Atchison & Smith, 2000). When the sighting mark is placed so that the participant's eye is in the primary position of gaze while aligning the sight and the target, this is the LoPG. Thus, in our system, the LoPG connects the center of rotation of the participant's left eye and a point of regard.

Several calibration challenges must be overcome for accurate tracking of the LoPG. Our walking simulator uses a magnetic tracker (Blood, 1990) to measure location and orientation of the participant's head. This type of device, also used in biomechanics (Day, Dumas, & Murdoch, 1998), virtual reality (Livingston & State, 1997), and eye-tracking research (Caldwell, Berbaum, & Borah, 2000), is subject to environmental magnetic distortions that worsen with the distance between the magnetic transmitter and the magnetic sensor. Compensation for this distortion is required if measurements are to be taken near the limits of the tracker's range or if the tracker is to be used in close proximity to large metallic objects (Nixon, McCallum, Fright, & Price, 1998).

Several techniques for compensating for this kind of distortion have been proposed, and a summary of these methods has been compiled by Kindratenko (2000). Most of the techniques reviewed by Kindratenko were developed for virtual reality (allowing for presentation of motion parallax effects as viewers move their heads) but are applicable to tracking the LoPG. These distortion compensation techniques generally require custom equipment and are not included in popular commercial tracking systems. For example, in a study in which the eye movements of radiologists were investigated, Caldwell et al. (2000) used an ASL eye-tracking system, but they employed a modified version of the ASL EYEPOS software and a custom calibration fixture to correct for magnetic distortion. This method allowed for more accurate gaze tracking near a metallic device, but few details of the software modifications were published.

In addition to compensating for magnetic distortion, one must also find the alignment between the physical components of the tracking system. The distances between these components and their relative orientations become parameters in the LoPG-tracking model. Although direct measurement of these parameters is possible, small errors in these measurements can lead to large errors in computed LoPG. Developers of augmented reality systems (where visual information is overlaid on an observer's view using a see-through head-mounted display) have explored geometric models and calibration techniques like those used in eye tracking; calibration is required for aligning overlaid images with real-world objects. Janin, Mizell, and Caudell (1993) proposed calibrating the location of information presented in a head-mounted display by aligning targets projected in the display with markers on a real-world workbench in a series of intentional gazes. We use a similar approach for LoPG calibration.

Our system allows for accurate (mean error of less than 2°) tracking of LoPG over a wide range of head locations and orientations. We compute the LoPG using a model of the geometry of gazes with a few simplifying assumptions. Doing so allows model parameters in the LoPG-tracking algorithm to correspond to actual distances and angles between parts of the tracking system, making direct verification possible. The general approach of this method can also be extended to incorporate additional or alternate tracking systems.

Figure 1. Experimental configuration of the walking simulator, showing the locations of the walking platform, the wide projection screen, and the magnetic tracker transmitter.

Figure 2. Head-mounted sight. The participant aligned objects of interest with a sighting mark. When viewing objects through the mark, the left eye was held in the primary position of gaze.


In this article, we first describe our algorithms for compensating for magnetic distortion and for finding the LoPG from compensated tracker measurements. Then we describe our calibration methodology for finding the parameters for the above procedures. Finally, we describe and present the results of two experiments used to validate our method.

METHOD

Apparatus
The LoPG tracking system we used to validate our tracking algorithm and calibration techniques was built as a part of our walking simulator. It consisted of the following.

Projection screen. A 67 × 50 in rear projection screen was used for presentation of calibration targets. The bottom of the screen was elevated 30 in above our walking platform (see Figure 1) to place its center near eye level for walking participants.

Magnetic tracker. An Ascension Flock-of-Birds magnetic tracker with an extended range transmitter (Ascension Technology Corporation, Burlington, VT) was used to monitor head position. This system reports location and orientation of a small magnetic sensor in relation to a transmitter. Each measurement consists of six variables: three representing location in three dimensions and three Euler angles representing orientation. The tracker transmitter was mounted on a 48-in-tall wood stand and was placed 52 in to the right of and 36 in in front of the projection screen center. The tracker sensor was mounted on adjustable headgear worn by the participant (Figure 2). The headgear also carried the head-mounted sight.

Head-mounted sight. A head-mounted sight was used to allow the participants to consistently direct their LoPG at projected screen targets. The sight was a 1/8-in hollow square mark on a clear plastic panel, which was in turn mounted on a 10-in lightweight brass boom extending from the headgear.

Computer. A Pentium computer running Microsoft Windows 2000, Matlab 6.5 (The MathWorks, Inc., Natick, MA), and custom software was used to generate screen images of calibration targets, to control and record data from the magnetic tracker, and to perform calibration and gaze computations. Calibration software was written using Matlab and is available for download from the Psychonomic Society Web archive.

LoPG Tracking Algorithm
We developed an algorithm to transform raw magnetic tracker measurements into experimental variables of interest for behavior research. The algorithm consists of three steps: distortion compensation, geometric transformation, and computation of the point of gaze. Distortion compensation consists of correcting raw magnetic tracker readings to compensate for the spatial distortions that are characteristic of these devices. In this step, we attempt to compute the true location and orientation of the magnetic sensor from its reported location and orientation. In the second step (geometric transformation), distortion-compensated measurements are used to find the LoPG relative to the projection screen. Finally, the LoPG is used to compute the point on the screen aligned with the sighting mark. We call this on-screen location the point of gaze. The first and second steps of this algorithm rely on parameters that are obtained through a pair of calibration processes, described below in the Calibration section.

Distortion compensation. Examination of readings from our magnetic tracker confirmed other research (Bryson, 1992) showing that distortion in both the location and the orientation components of measurements varies dramatically depending on the location of the magnetic sensor. To account for these distortions, we use a compensation function that adds a correction factor to measurements. This correction depends on the location portion of magnetic sensor readings. Specifically, to compensate for distortion, the translational components of each raw measurement from the magnetic tracker (x_o, y_o, and z_o, representing the location in three dimensions of the magnetic sensor relative to the transmitter) are transformed with a polynomial distortion correction function (Kindratenko, 1999). The correction consists of three degree-n polynomials in three variables. Each of these three polynomials contains terms consisting of the product of x_o, y_o, and z_o raised to powers in all combinations where the sum of the exponents is n or fewer:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} x_o \\ y_o \\ z_o \end{bmatrix} + \sum_{i=0}^{n}\sum_{j=0}^{n-i}\sum_{k=0}^{n-i-j} \begin{bmatrix} a_{ijk} \\ b_{ijk} \\ c_{ijk} \end{bmatrix} x_o^{\,i}\, y_o^{\,j}\, z_o^{\,k} \tag{1}$$

where a_ijk, b_ijk, and c_ijk are the t = (n+1)(n+2)(n+3)/6 polynomial coefficients (obtained in the calibration process), and x_c, y_c, and z_c are the coordinates of the corrected measurement. The orientation of the sensor, reported by the magnetic tracker as Euler angles θ_o, φ_o, and ψ_o, is corrected using a similar set of polynomials:

$$\begin{bmatrix} \theta_c \\ \phi_c \\ \psi_c \end{bmatrix} = \begin{bmatrix} \theta_o \\ \phi_o \\ \psi_o \end{bmatrix} + \sum_{i=0}^{n}\sum_{j=0}^{n-i}\sum_{k=0}^{n-i-j} \begin{bmatrix} d_{ijk} \\ e_{ijk} \\ f_{ijk} \end{bmatrix} x_o^{\,i}\, y_o^{\,j}\, z_o^{\,k} \tag{2}$$

where d_ijk, e_ijk, and f_ijk are t polynomial coefficients, and θ_c, φ_c, and ψ_c are the corrected orientation angles. Correcting for distortion in this way assumes that the distortion in orientation measurements depends only on the location of the magnetic sensor and not at all on its orientation. Although distortion in orientation measurements does indeed vary with sensor orientation (Livingston & State, 1997), this form of distortion compensation is still effective if the sensor remains in the same relative orientation in both calibration and experimental use. The polynomial coefficients in Equations 1 and 2 correspond to the shape of the specific distortion in a given experimental configuration and can be computed using the calibration procedure described below. In agreement with the findings of Kindratenko (2000), we found that a polynomial of degree n = 4 modeled the shape of the distortion well. A fourth-degree polynomial required t = 35 coefficients for each tracker variable (210 in total).
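As a sketch of how Equations 1 and 2 are applied to a single reading, the following Matlab function evaluates the 35 location monomials and adds the polynomial correction. The coefficient layout is our own choice for illustration; the archive's correctdistortion.m performs the same computation but may organize its inputs differently.

    % Apply the Equation 1/2 correction to one raw tracker reading.
    % raw: 1x6 row vector [xo yo zo theta_o phi_o psi_o]
    % A:   t-by-6 coefficient matrix, one column per corrected variable
    %      (columns 1-3 = a,b,c of Eq. 1; columns 4-6 = d,e,f of Eq. 2)
    % n:   polynomial degree (4 in the article, giving t = 35 rows)
    function corrected = apply_correction(raw, A, n)
        xo = raw(1); yo = raw(2); zo = raw(3);
        m = [];                              % monomials xo^i*yo^j*zo^k with i+j+k <= n
        for i = 0:n
            for j = 0:(n - i)
                for k = 0:(n - i - j)
                    m(end + 1) = xo^i * yo^j * zo^k; %#ok<AGROW>
                end
            end
        end
        corrected = raw + m * A;             % add the location-dependent correction
    end

Note that the same location monomials drive the corrections to all six variables, reflecting the assumption that orientation distortion depends only on sensor location.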

Geometric transformation. The second computational step of the LoPG tracking algorithm transforms the distortion-compensated magnetic tracker readings to locate the LoPG in relation to the screen. This transformation employs a geometric model of the spatial relationship between the eye and the magnetic sensor and of the relationship between the screen and the magnetic transmitter (Figure 3). The set of 12 model parameters computed in the geometric calibration procedure (described below) characterizes these two spatial relationships.

The model takes each of the following locations within the tracking environment to be the origin of a three-dimensional coordinate frame: the upper left corner of the screen (coordinate frame O), the measurement origin of the magnetic transmitter (B), the measurement center of the magnetic sensor mounted on the head (S), and the center of rotation of the eye (E). The x, y, and z axes of each coordinate frame are oriented as shown in Figure 3.

We represent the spatial relationships between pairs of these coordinate frames as 4 × 4 homogeneous transformation matrices. Each such matrix can describe the relative locations and orientations of two coordinate frames and allows measurements in one frame to be transformed to the other by matrix multiplication. For each such matrix

$$T_{b \leftarrow a} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & x_{ba} \\ r_{21} & r_{22} & r_{23} & y_{ba} \\ r_{31} & r_{32} & r_{33} & z_{ba} \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{3}$$

representing the transformation from some frame a to some frame b, x_ba, y_ba, and z_ba describe the three-dimensional location of the origin of a in the coordinate system of b. The r variables are the lengths of the projections of unit vectors along each of the three coordinate axes of frame a onto the axes of frame b. These projections form the rotational component of the transformation.


Figure 3. Geometric model for finding the line of primary gaze by combining (via matrix multiplication) the spatial relationship measured by the magnetic tracker (T_{B←S}, corrected for magnetic distortion) with those derived through geometric calibration (T_{O←B} and T_{S←E}). B is the tracker transmitter, S is the tracker sensor, E is the eye, and O is the projection screen.


In addition to the matrix representation of the relationship between two coordinate frames, it is often convenient to use a translation and Euler angle representation. In this way, a transformation can be described compactly with 6 variables instead of the 12 used in the matrix form. The three coordinates of translation and the three Euler angles that specify a transformation M_{b←a} = (x_ba, y_ba, z_ba, θ_ba, φ_ba, ψ_ba) can be converted to matrix form by a series of three rotations and a displacement, T_{b←a}, as shown in Equation 4 (Kuipers, 1999):

$$T_{b \leftarrow a} =
\begin{bmatrix} 1 & 0 & 0 & x_{ba} \\ 0 & 1 & 0 & y_{ba} \\ 0 & 0 & 1 & z_{ba} \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\theta_{ba} & -\sin\theta_{ba} & 0 & 0 \\ \sin\theta_{ba} & \cos\theta_{ba} & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\phi_{ba} & 0 & \sin\phi_{ba} & 0 \\ 0 & 1 & 0 & 0 \\ -\sin\phi_{ba} & 0 & \cos\phi_{ba} & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & \cos\psi_{ba} & -\sin\psi_{ba} & 0 \\ 0 & \sin\psi_{ba} & \cos\psi_{ba} & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{4}$$
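A minimal Matlab sketch of Equation 4 follows. It assumes the standard Z-Y-X (azimuth, elevation, roll) rotation sequence; the archive's trackereulerstomtx4.m serves the same purpose but may differ in convention details.

    % Build a 4x4 homogeneous transform from a translation and Z-Y-X Euler angles.
    function T = euler_to_mtx4(x, y, z, theta, phi, psi)
        Rz = [cos(theta) -sin(theta) 0; sin(theta) cos(theta) 0; 0 0 1];   % azimuth
        Ry = [cos(phi) 0 sin(phi); 0 1 0; -sin(phi) 0 cos(phi)];           % elevation
        Rx = [1 0 0; 0 cos(psi) -sin(psi); 0 sin(psi) cos(psi)];           % roll
        T = [Rz * Ry * Rx, [x; y; z]; 0 0 0 1];                            % Equation 4
    end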

Using these conversions, the (corrected) measurements provided by the magnetic tracker can be used to derive the other spatial relationships described in Figure 3. The computations of the various transformations are described next, followed by a description of the calibration process.

Since the LoPG lies along the x-axis of the eye frame E, to find the LoPG relative to the screen we find the location and orientation of this frame with respect to the screen frame O. To find this relationship, we compose the matrix representations of the spatial relationships measured by the magnetic tracker with matrices computed through calibration. The position of the magnetic transmitter with respect to the screen, T_{O←B} (the transformation from transmitter frame to screen frame), and that of the eye with respect to the sensor, T_{S←E}, remain fixed for the duration of an experiment. The transformation T_{B←S} is constructed using Equation 4 from the six variables M_{B←S} = (x_c, y_c, z_c, θ_c, φ_c, ψ_c) resulting from the distortion compensation step (Equations 1 and 2). The coordinates x_c, y_c, and z_c represent the corrected three-dimensional displacement between the origins of B and S, while θ_c, φ_c, and ψ_c are the corrected Euler angles describing the orientation of S in the coordinate system of B.

Since composite transformations can be represented as the product of transformation matrices, we can find the location and orientation of the eye relative to the screen, T_{O←E}, as follows:

$$T_{O \leftarrow E} = T_{O \leftarrow B}\, T_{B \leftarrow S}\, T_{S \leftarrow E} \tag{5}$$

Once T_{O←E} is computed, we can find both the eye location with respect to the screen (Note 1), which is simply the last column of T_{O←E}, and the point of intersection between the LoPG and the screen.

Point of gaze computation. The point on the screen that intersects the participant's LoPG can be found from T_{O←E}, the result of the geometric transformation portion of the gaze-tracking algorithm. In our model, the LoPG falls along the x-axis of the eye coordinate frame E. We solve for a point that is on the x-axis of the eye frame and also falls in the x–y plane of the screen coordinate frame O (the surface of the screen). This intersection can be formulated as a linear system consisting of a point on the x-axis of frame E, (x_E, 0, 0), transformed by T_{O←E} and set equal to a point in the x–y plane of frame O, (Int_x, Int_y, 0):

$$T_{O \leftarrow E} \begin{bmatrix} x_E \\ 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} Int_x \\ Int_y \\ 0 \\ 1 \end{bmatrix} \tag{6}$$

Solving this system for Int_x and Int_y gives us the intersection between the LoPG and the plane of the screen, expressed in the screen coordinate frame.
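A minimal sketch of Equations 5 and 6, assuming T_OB and T_SE have already been obtained from geometric calibration and T_BS from the corrected tracker reading via Equation 4. The intersection is solved parametrically here rather than as the stated linear system; the resulting point is the same.

    T_OE   = T_OB * T_BS * T_SE;                 % Equation 5
    origin = T_OE * [0; 0; 0; 1];                % eye location in frame O
    dirX   = T_OE * [1; 0; 0; 0];                % LoPG direction (eye x-axis) in frame O
    lambda = -origin(3) / dirX(3);               % distance along LoPG where z = 0
    gazePt = origin + lambda * dirX;             % point of gaze on the screen plane
    Int    = gazePt(1:2);                        % [Int_x; Int_y] of Equation 6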

Calibration
Parameters used in the LoPG tracking algorithm are computed via two calibration steps. First, the distortion in magnetic tracker readings is measured, and coefficients for the compensation polynomials (Equations 1 and 2) are computed. This distortion compensation calibration process was carried out when our tracking system was installed. Computed coefficients can be reused, assuming that there have been no changes to the relative placement of the tracker transmitter and the distortion-causing (large metal) objects. The second, geometric, calibration step computes the spatial relationship between the magnetic tracker sensor and the eye, along with the relationship between the tracker transmitter and the screen image. This second step is performed once for each participant before an experimental run, since an adjustment of the headgear or of the image projected on the screen alters the spatial relationships measured in this calibration.

Distortion compensation calibration. To compute coefficients a_ijk, b_ijk, c_ijk, d_ijk, e_ijk, and f_ijk of the distortion compensation polynomials (Equations 1 and 2), we compare a set of tracker readings taken over a grid of known points within the tracking volume to the hand-measured locations of those points. The differences between the measured and the reported locations form a set of vectors mapping points in reported space to actual physical locations (these difference vectors were small near the transmitter and became large near the outer third of the nominal volume of our tracker's range; see Figure 5). We then solve for the set of polynomial coefficients that, when used in Equation 1, minimizes the sum of the squared lengths of these error vectors. An identical procedure is performed on orientation data, finding coefficients that minimize the sum of the squared errors in Euler angles reported by the tracker.
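A sketch of this fit for the location polynomials, assuming the reported and hand-measured grid locations are available as N-by-3 arrays named reported and actual (our names; the archive's findcorrectioncoeffs.m performs the equivalent computation):

    n = 4;                                   % polynomial degree used in the article
    M = [];                                  % N-by-35 design matrix of monomials
    for i = 0:n
        for j = 0:(n - i)
            for k = 0:(n - i - j)
                M = [M, reported(:,1).^i .* reported(:,2).^j .* reported(:,3).^k]; %#ok<AGROW>
            end
        end
    end
    coeffLoc = M \ (actual - reported);      % 35-by-3 least-squares solution: a, b, c of Eq. 1

The orientation coefficients of Equation 2 follow from the same design matrix, with Euler-angle differences on the right-hand side.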




In our walking simulator environment, we constructed a system of pegboards and stands to provide a set of known locations for placement of the magnetic sensor during this procedure. The pegboard system, illustrated in Figure 4, was constructed from 24 × 48 in sheets of predrilled 0.25-in-thick pegboard mounted on a wooden frame. Screws used in the construction of the frame were replaced with glue, since we found them to be causing small local distortions in magnetic tracker readings. The sensor stands were constructed from sections of 4-in-diameter PVC pipe with flat pegboard caps on each end. Each stand held the magnetic tracker sensor rigidly on one end cap and had pegs for attachment to the pegboard on the other. The sensor was positioned so that, when the stands were engaged with the pegboard, the sensor's x-axis (see Figure 4) pointed toward the screen and its z-axis pointed toward the floor. In this orientation, the magnetic tracker reported azimuth, elevation, and roll of the sensor (θ_o, φ_o, and ψ_o) to all be near zero (these angles deviated from zero by a few degrees due to distortion). Several pipe lengths were used for measurements at varying heights (0, 6, 12, and 18 in) above the pegboard plane. Other devices for placing a sensor at known locations have also been developed, including a fixture made entirely of Plexiglas (Caldwell et al., 2000) and an optical tracking system (Ikits, Brederson, Hansen, & Hollerbach, 2001).

The pegboard frame was placed in our tracking environment and was mechanically squared with the case of the magnetic transmitter. The location of the pegboard grid with respect to the magnetic transmitter was then measured (Note 2). Readings from the magnetic tracker were taken over both a sparse three-dimensional grid of 64 locations spanning 45 × 52 × 18 in (see Figure 5) and a denser, smaller grid of 128 locations covering the central 32 × 40 × 18 in of the same space. The magnetic tracker readings from both grids were combined to form the data set of reported locations and orientations. Using Matlab, we performed linear least-squares fits on the differences between the reported and pegboard-measured data sets to find the 210 coefficients for the distortion correction polynomials (Note 3). An illustration of the magnitude of location distortion for our tracker is provided in Figure 5.

Geometric calibration. The second calibration step computes the geometric model parameters describing the spatial relationships T_{O←B} and T_{S←E}. These parameters, corresponding to the alignment of the screen with the tracker transmitter (T_{O←B}) and the position of the magnetic tracker sensor relative to the eye (T_{S←E}), are difficult to measure directly, and T_{S←E} changes from participant to participant, or whenever the headgear is adjusted. For these reasons, we developed a calibration procedure to quickly ascertain these model parameters.

Geometric calibration of the LoPG-tracking system is performed by taking magnetic tracker recordings while participants perform a series of constrained gazes. For each gaze, the participant, using his or her left eye, aligns the head-mounted sight with a target dot projected on the screen. The participant repeats this procedure from a series of standing locations within the desired usage space.

Figure 4. Pegboard apparatus for measuring distortion in magnetic tracker measurements. Two such pegboard panels were mounted on a wooden frame to calibrate a volume measuring 45 × 52 × 18 in. Stands of various heights were placed at different locations across the base pegboard to provide wide coverage of the volume.


After a number of such gazes, a pair of T_{O←B} and T_{S←E} parameter sets is computed by the Gauss–Newton nonlinear least-squares fitting method, as described in the next section.

Geometric calibration algorithm. The fitting procedure seeks the set of model parameters that, when combined with the geometric model in our LoPG-tracking algorithm, best predicts on-screen points of gaze from distortion-compensated magnetic tracker readings taken over a set of directed gazes. Our model computes Int, the two-dimensional on-screen point of gaze, expressed as

$$Int = F(P,\, M_{B \leftarrow S}) \tag{7}$$

where F is the function that computes the point of gaze from a given (corrected) magnetic tracker reading and a set of model parameters (Equations 5 and 6), P is the set of 12 model parameters characterizing, via Equation 4, the spatial relationships T_{O←B} and T_{S←E} (for each of the two transformations, the 12 parameters describe a three-dimensional displacement between coordinate frames and three Euler angles describing the relative orientations of the frames), and M_{B←S} is the magnetic tracker measurement (three distances and three angles) corrected for distortion by Equations 1 and 2. We assess the quality of the fit found by the geometric calibration procedure by the size of the average error across all gazes in a calibration. This mean gaze point error is expressed as the sum, across all calibration gazes, of the distances along the screen between the actual calibration targets and the model-predicted points of gaze. The fitting procedure seeks model parameters P that minimize this sum, given a set of magnetic tracker measurements M_{B←S}, a set of corresponding screen intersections Int, and an initial "guess" for P. Starting with the given guess for P, the fitting procedure iteratively adjusts these parameters until a local minimum in mean gaze point error is found (Note 4). Minimization is performed in Matlab using the nlinfit function (from the Statistics Toolbox, an implementation of the Gauss–Newton fitting method). In our implementation, this technique finds a P corresponding to a given set of 20 measurements of Int and M_{B←S} in a few seconds on a 700-MHz PC.
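A sketch of this fitting step, built on the euler_to_mtx4 and screen-intersection sketches above. The names fit_geometry, predict, Mc, and targets are our own, and lsqnonlin stands in for the nlinfit call used in the article; both solve the same nonlinear least-squares problem.

    % Fit the 12 geometric parameters from a set of directed gazes.
    % Mc:      N-by-6 corrected tracker readings, one row per gaze
    % targets: N-by-2 on-screen target locations
    % P0:      initial guess, [x y z theta phi psi] for T_OB then T_SE
    function P_fit = fit_geometry(Mc, targets, P0)
        P_fit = lsqnonlin(@(P) predict(P, Mc) - targets, P0);
    end

    function Int = predict(P, Mc)
        T_OB = euler_to_mtx4(P(1), P(2), P(3), P(4), P(5), P(6));
        T_SE = euler_to_mtx4(P(7), P(8), P(9), P(10), P(11), P(12));
        Int = zeros(size(Mc, 1), 2);
        for g = 1:size(Mc, 1)                          % one row per directed gaze
            T_BS = euler_to_mtx4(Mc(g,1), Mc(g,2), Mc(g,3), Mc(g,4), Mc(g,5), Mc(g,6));
            T_OE = T_OB * T_BS * T_SE;                 % Equation 5
            o = T_OE * [0; 0; 0; 1];                   % eye origin in frame O
            d = T_OE * [1; 0; 0; 0];                   % LoPG direction in frame O
            lambda = -o(3) / d(3);                     % intersect the screen plane z = 0
            p = o + lambda * d;
            Int(g, :) = p(1:2)';                       % predicted point of gaze (Eq. 6)
        end
    end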

Figure 5. Distortion measured while performing the distortion compensation calibration of the magnetic tracker. Axis units are inches in the coordinate system of the magnetic tracker transmitter, which is plotted along with the projection screen, the actual shape of the measured space, and the reported (distorted) shape of the measured space. Dark bars on the axes indicate the ranges of physical locations sampled in the calibration procedure. Shading of the surface indicates the magnitude of distortion in location measurements from the magnetic tracker; the darkest areas indicate distortion in location measurement of over 13 in.

Experimental Validation
To test the calibration and LoPG-tracking techniques presented in this article, we first calibrated the distortion compensation system as described above and then performed two target-sighting experiments. In the first experiment, we performed the geometric calibration procedure using the beam of a laser pointer to represent the line of gaze of the human participant. Doing so allowed the simulated LoPG to be precisely aligned with calibration targets projected on the screen and allowed testing without the variability of human participants' ability to maintain head position. To fix the relative placement of the laser and sensor, both were attached to a small wooden block. The sensor was placed on the block with its x-axis approximately parallel to, and 2 in above, the laser beam. This alignment enabled us to easily measure the actual spatial relationship between the sensor and the laser for validation of the parameters found by the geometric calibration procedure. (We found it much more difficult to measure the equivalent relationship, T_{S←E}, on a human participant.) The wood block holding the sensor and laser was fastened to an aluminum and plastic tripod raised to the approximate height of a human participant's head. To gather data for testing the geometric calibration procedure, the tripod was placed at 10 different locations. These locations were chosen within the range of possible standing positions of our study participants (an area about 36 × 36 in). For each of the 10 tripod locations, readings from the magnetic tracker were taken while panning and tilting the tripod head so that the laser beam fell on each calibration target. Four of the targets were just inside the corners of the screen, one was placed in the center, and five other locations were arbitrarily chosen. The targets remained in the same locations throughout each experiment. Due to the distance between the sensor and the pivot point of the tripod head, for a given tripod location the actual magnetic sensor locations varied by up to 8 in as the tripod head was tilted at different angles to align the laser.

In a second experiment, a similar set of 100 directed gazes was performed by a human participant. The participant used the head-mounted sight (Figure 2) to align his LoPG with each of the 10 targets from each of 10 standing locations. The 5 calibration targets that were arbitrarily placed in the first experiment were in different locations for calibration of the human participant.

Data from both experiments were independently used to calibrate the LoPG-tracking system, and the benefit of the distortion compensation and geometric calibration procedures was examined. For the distortion compensation step, we used the polynomial coefficients computed from the 192 pegboard locations, as described above. In all cases, we used as an initial guess for the geometric calibration procedure a set of hand-measured parameters that approximated the actual geometry of the system.

RESULTS

For each of the two experiments, the collected data consisted of a set of 100 six-variable tracker readings (each reading taken when the laser/LoPG was aligned to fall at the center of an on-screen calibration target), as well as the 100 two-dimensional screen locations of the corresponding calibration targets (10 repetitions of 10 targets). Location and orientation data from the magnetic tracker were passed through the distortion compensation functions (Equations 1 and 2), yielding a second set of 100 corrected tracker readings for each experiment. We used subsets of this body of distortion-corrected data to derive the transformation matrices T_{O←B} and T_{S←E} using the geometric calibration procedure described above and tested these fits against the remaining data. We compared known screen target locations to point-of-gaze computations made with and without distortion compensation or geometric calibration. We also investigated the behavior of the error as a function of the number of included calibration points by varying the size of the calibration data sets used to compute T_{O←B} and T_{S←E}.

Accuracy of Predicted Point of Gaze
Using the distortion-corrected location and orientation measurements from the entire set of 100 laser–target alignments from the first experiment and the geometric calibration procedure, the T_{O←B} and T_{S←E} matrices were computed. Equations 5 and 6 were then used to predict the point of gaze for each of the 100 laser–target alignments, and mean gaze point error (mean distance along the screen between the calibration target and the point of gaze found with the LoPG-tracking algorithm) was computed. For the data set from the first experiment, mean error was 1.50 in (Figure 6D). Error can also be described as the angle between the screen target and the computed point of gaze when viewed from the location of the magnetic sensor. Measured in this way, mean angular error was 1.55°. For reasons described in the Discussion section, we report the remainder of our mean error results in inches. At our screen–eye distances, 1° corresponds approximately to 1 in of error.
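For reference, the linear-to-angular conversion is approximately

$$\delta \approx 2\arctan\!\left(\frac{e/2}{d}\right) \approx \frac{e}{d}\ \text{rad},$$

for an on-screen error of e inches viewed from distance d. The stated 1 in ≈ 1° correspondence therefore implies an effective eye–screen distance of roughly 57 in (our inference, not a value reported in the article), since 57 in × tan(1°) ≈ 1.0 in.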

To examine the effect of distortion compensation on mean gaze point error, we also performed the geometric calibration procedure using raw magnetic tracker measurements not processed by the distortion correction polynomials. As seen in Figure 6B, without distortion compensation mean gaze point error was significantly larger than with compensation (Wilcoxon signed ranks test, Z = 3.66, p < .001), at 1.93 in when calibrated using data from the same experiment (an increase of 29%). Without distortion compensation, clear outliers from the clusters of points of gaze can be seen for most of the screen targets in Figure 6B. These outlying points of gaze were all from the tripod location most distant from the magnetic transmitter, where distortion was largest. Distortion compensation provided the most benefit for this tripod location.

To examine the effect of the geometric calibration procedure on the accuracy of predicted LoPG, we also computed mean gaze point error across the collected data set using only a hand-measured estimate of T_{O←B} and T_{S←E}. Distortion correction was included, but no geometric calibration was performed. As one might expect, without calibration errors were larger, and this difference was significant (Wilcoxon signed ranks test, Z = 8.09, p < .001). Mean gaze point error using an uncalibrated model was 3.96 in, as can be seen in Figure 6C.

We also examined the ability of the geometric model to compute points of gaze without both distortion compensation and geometric calibration. The hand-measured T_{O←B} and T_{S←E} and raw magnetic tracker readings were used in Equations 5 and 6. Mean gaze point error was 5.66 in (Figure 6A). The addition of either geometric calibration or distortion compensation yielded a significant reduction in error (Wilcoxon signed ranks test, Z = 6.1, p < .001).

Conducting the same analysis for the data from the second experiment, we found mean gaze point error to be lower than in the first experiment when geometric calibration was performed. When both distortion compensation and geometric calibration were used, mean error was 0.72 in. When geometric calibration was performed on data not corrected for distortion, mean error was 1.06 in (Wilcoxon signed ranks test, Z = 5.20, p < .001). Reasons for the reduced errors in the second experiment are proposed in the Discussion section. Without geometric calibration, error was much larger (Wilcoxon signed ranks test, Z = 8.1, p < .001), since we were unable to measure T_{S←E} with the same accuracy for the second experiment. With and without distortion correction, mean errors were 3.98 and 4.26 in, respectively.

Effect of sensor location on error. Using the raw data (not compensated for distortion) from the first experiment and computing the point of gaze without performing geometric calibration, the farther the sensor was from the transmitter, the worse the error (Spearman correlation, r = .46, p < .001), and that error was manifest as an increasing error in the y-dimension (Spearman correlation, r = .50, p < .001; Figure 7A). When both the distortion correction and the geometric calibration were implemented, the errors were more evenly distributed (Figure 7B), resulting in no significant correlation with distance from the transmitter (Spearman correlation, r = .04, p = .71). There was, however, a significant tendency for the errors to increase toward the screen (i.e., in the x-dimension; Spearman correlation, r = .22, p = .003).

Figure 6. Effects of distortion compensation (panels C and D) and geometric calibration (panels B and D) on computation of points of gaze. Data were collected with a tripod-mounted laser pointer used to simulate the line of primary gaze (LoPG). Circular marks indicate screen locations of calibration targets. Crosses are laser screen intersections (points of gaze) computed by the LoPG-tracking algorithm. Within each cluster, each cross represents a measurement from a different tripod location. The rectangular outline shows the location of the projection screen. Screen outline and targets appear at the hand-measured location (panels A and C) and at the location found by geometric calibration (panels B and D).


This may be because, for tripod locations closer to the screen, an angular error (due to distortion, for example) of the same size results in a larger error distance along the screen.

Short calibration. In order to use our LoPG-tracking system as a tool for gathering data from untrained study participants, it is advantageous to calibrate the system using as few data points as necessary (e.g., to reduce fatigue). To find the number of calibration data points needed to accurately predict the point of gaze, we created calibration sets from subsets of the 100 measurements collected in each experiment. We used these small calibration sets to compute T_{O←B} and T_{S←E} and then used those computed parameters to calculate screen intersections across the full set of 100 measurements. We then compared these computed intersections with the actual locations of the on-screen targets. We performed this analysis for calibration sets of varying sizes, across 100 random orderings of the 100 data points collected in each experiment. For the first experiment, we found that after about 20 points, the geometric calibration system converges to a point of gaze with a mean gaze point error of less than 2 in, with only small improvements attained by including additional points in the calibration set (Figure 8).

For the second experiment, overall gaze point error was lower when calibration was performed with the entire data set. For this data set, 2 in of error could be reached with about 12 data points (Figure 9). Calibrations performed with more than 15 data points provided only a small additional benefit.
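A sketch of this subset analysis, reusing the fit_geometry and predict sketches above (the archive's findmodelparams.m and intersecterror.m fill the same roles in the authors' code):

    setSizes = 2:50;                                  % calibration set sizes, as in Figures 8 and 9
    meanErr = zeros(100, numel(setSizes));
    for rep = 1:100
        order = randperm(100);                        % one random ordering of the 100 gazes
        for si = 1:numel(setSizes)
            idx = order(1:setSizes(si));              % calibrate on a subset
            P = fit_geometry(Mc(idx,:), targets(idx,:), P0);
            d = predict(P, Mc) - targets;             % errors over all 100 gazes
            meanErr(rep, si) = mean(sqrt(sum(d.^2, 2)));
        end
    end
    plot(setSizes, mean(meanErr));                    % compare with Figures 8 and 9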

In the processing of calibration data from both experiments, geometric calibration occasionally failed completely, producing larger mean gaze point error than if geometric calibration had not been performed at all. These failures occurred with very small calibration sets, when distortion-related errors provided inconsistent constraints for the model. Failures also occurred when sets included targets from only one part of the screen, insufficiently constraining the model. Fitting failures never occurred with sets of 10 points or more, and never occurred when distortion compensation was performed. In all the cases in which calibration failed, the addition of one or two calibration points to the set was enough to reverse the failure. In general, better fits were obtained from calibration sets with greater variety in sampled tracker orientations and screen locations.

Figure 7. The mean gaze point error at each of the 10 tripod locations: on the left, using the raw data from the magnetic tracker collected in the first experiment (see Figure 6A); on the right, using the same data corrected with both the distortion compensation and the geometric calibration (Figure 6D). The tripod and laser pointer were directed toward the projection screen, which is shown as the tall white rectangle on the right of each panel. The tracker transmitter is shown as the black square. Each circle represents one of the tripod locations, with radius equal to the mean gaze point error from that location; the mean gaze point error for each location is also shown with the corresponding circle.

DISCUSSION

The LoPG-tracking model and calibration techniques presented here succeed at accurately predicting the on-screen point of gaze while participants look through a head-mounted sight. The validation experiments show that, in agreement with Caldwell et al. (2000), the use of polynomial distortion compensation can improve the accuracy of gaze point calculations (Figures 6C and 6D). In addition, benefit was also seen from geometric calibration. The techniques presented for tracking LoPG provide a foundation for future gaze-tracking systems.

An interesting result of performing calibration both with and without distortion compensation was that, although the point of gaze was predicted more accurately with distortion compensation, the tracking system still performed relatively well in the presence of uncompensated distortion, provided that geometric calibration was performed. In the presence of measurement errors caused by distortion, the minimization procedure gives model parameters that may not correspond well to the actual spatial relationships being modeled but that interact with the measurement distortion to produce reasonably accurate screen intersection predictions. If somewhat larger LoPG-tracking errors can be tolerated, one might be able to achieve reasonable results by omitting distortion compensation altogether. Omission of distortion compensation would likely result in much greater error, however, for head positions or orientations outside the range of calibration.

Possible sources of the gaze point errors remaining when both geometric calibration and distortion compensation were performed include measurement noise, inaccuracies in magnetic tracker measurements not corrected by the distortion compensation system, inaccuracy of the model parameters found by the geometric calibration system, or some combination of these. Tests of stability in the tracker measurements showed that readings from the tracker taken milliseconds or hours apart agreed; both differed from reference measurements by only a few hundredths of an inch and by a few tenths of a degree. This small measurement noise is clearly not enough to explain the gaze point errors, which are orders of magnitude greater.

Our method makes the simplifying assumption that the distortions in magnetic tracker orientation readings depend on the location of the magnetic sensor only, and not on the interaction between sensor location and orientation. Therefore, it is likely that our method did not completely correct for the magnetic distortion. When the magnetic sensor is kept in a similar orientation in both distortion compensation calibration and actual experimental use, this approach is more likely to produce good results. In the first experiment, our method provided an improvement over omitting orientation distortion compensation entirely, since sensor alignment was similar in calibration and experiment. It is likely, however, that distortion still plays a role in gaze point errors. In the second experiment, we found even lower (than that reported in the Results section) mean gaze point error when magnetic tracker measurements were corrected for location distortion but not for orientation distortion. As the magnetic tracker sensor was rotated on its side when attached to the headgear of the head-mounted sight (Figure 2), it was no longer in the relative orientation needed to see a benefit from our compensation for orientation distortion. Expanding the distortion compensation calibration procedure to measure the sensor at multiple orientations for each grid location sampled could further reduce error due to distortion. This change would increase both the number of data points required and the time needed to perform the initial data collection procedure, but it would not impact per-participant calibration demands.

Figure 8. Error in computed point of gaze when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the first experiment (using a laser pointer). Mean gaze point error was reduced to within 2 in (about 2°) after about 23 directed gazes. Plotted: mean across all orderings, with 95% confidence interval; x-axis, number of data points in the calibration set; y-axis, mean gaze point error across all 100 data points (inches).

The geometric calibration algorithm seeks a T_{O←B} and T_{S←E} yielding a local minimum in gaze point error, starting from the initial guess derived from hand measurements of these parameters. Control experiments were performed to verify that no other, potentially lower, minima could be found near the minima arrived at by calibration. Although the minimization procedure failed to converge for intentionally inconsistent initial guesses, tests conducted using parameters displaced from hand-measured values by several inches in all directions still converged on the same LoPG. Thus, uncorrected distortion in magnetic tracker measurements was most likely the leading cause of gaze point error.

Comparison between points of gaze computed with (Figure 6D) and without (Figure 6C) the geometric calibration step shows that attempts to hand measure the spatial relationships T_{O←B} and T_{S←E} (i.e., without geometric calibration) lead to relatively large gaze point errors. Application of the geometric fitting procedure substantially improves the ability of the model to compute the LoPG. Although it is not surprising that better results can be obtained when calibration is performed, it is interesting to note that the parameters resulting from calibration deviated considerably from the hand-measured parameters. In fact, the optimized parameters comprising T_{S←E} indicated spatial relationships that were clearly inconsistent with the actual configuration of the laser and magnetic sensor. For example, parameters producing the least mean gaze point error contained values for the distance between the magnetic sensor and the LoPG that were several inches greater than physical measurements indicated. Although there is some imprecision (we estimate about a quarter of an inch) in measuring the straight-line distance between eye and magnetic sensor, this imprecision alone cannot explain the difference. Since calibration performed with data from the human participant seemed less prone to this inconsistency, we suspect that the constraints imposed on the movement of the laser and sensor by the tripod head may be responsible. In positioning the laser to align it with calibration targets, the sensor remained within 47° of level, and "roll" of the sensor could not be performed. These constraints led to less variation in the calibration data and, in turn, may have underconstrained the fitting procedure used in geometric calibration.

Figure 9. Error in computed point of gaze when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the second experiment (with a human participant). Mean gaze point error was reduced to less than 2 in (about 2°) in about 12 directed gazes. Plotted: mean across all orderings, with 95% confidence interval; x-axis, number of data points in the calibration set; y-axis, mean gaze point error across all 100 data points (inches).

A limitation of the geometric calibration procedure described here is its inability to locate the eye along the LoPG. Although this limitation does not impact the ability of the system to predict points of gaze on the screen, it does impact the veracity of the model parameters. Although the geometric model used specifies the eye as a coordinate frame with an origin located some distance from the magnetic tracker sensor, the geometric calibration procedure constrains only the x-axis of this coordinate frame. This results in an infinite number of possible sets of model parameters that still predict the same point of gaze on the screen (i.e., the screen intersection remains the same even if the eye were to slide forward or back along the LoPG or to rotate about its x-axis). This means that the parameters making up T_{S←E} found by the fitting procedure correspond to one possible location of the eye along the LoPG. Control experiments showed that the fitted location of the eye along the LoPG depended on the initial guess used for the fitting procedure, even though these alternate initial guesses still yielded the same points of gaze (Note 5). If the intermediate results of the model are to be used, for example, for computing the location of the eye relative to the screen for placement of a camera in a virtual environment, the inability of this system to precisely locate the eye might be overcome by performing an additional calibration. For example, a calibration similar to the one used in this article, but requiring the participant to place the eye in other, nonprimary positions of gaze, could be used to find the actual eye location. This technique is currently being investigated.

REFERENCES

Allison, R. S., Eizenman, M., & Cheung, B. S. K. (1996). Combined head and eye tracking system for dynamic testing of the vestibular system. IEEE Transactions on Biomedical Engineering, 43, 1073-1082.
Asai, K., Osawa, N., Takahashi, H., & Sugimoto, Y. Y. (2000, March). Eye mark pointer in immersive projection display. Paper presented at the IEEE Virtual Reality 2000 Conference, New Brunswick, NJ.
Atchison, D. A., & Smith, G. (2000). Optics of the human eye. Oxford: Butterworth-Heinemann.
Barabas, J., Giorgi, R. G., Goldstein, R. B., Apfelbaum, H., Woods, R. L., & Peli, E. (2003, May). Wide field 3D gaze tracking system. Paper presented at the Association for Research in Vision and Ophthalmology Meeting, Fort Lauderdale, FL.
Blood, E. (1990). U.S. Patent No. 4,945,305. Washington, DC: U.S. Patent & Trademark Office.
Bryson, S. (1992, February). Measurement and calibration of static distortion of position data from 3D trackers. Paper presented at SPIE Stereoscopic Displays & Applications III, San Jose.
Caldwell, R. T., Berbaum, K. S., & Borah, J. (2000). Correcting errors in eye-position data arising from the distortion of magnetic fields by display devices. Behavior Research Methods, Instruments, & Computers, 32, 572-578.
Day, J. S., Dumas, G. A., & Murdoch, D. J. (1998). Evaluation of a long-range transmitter for use with a magnetic tracking device in motion analysis. Journal of Biomechanics, 31, 957-961.
Duchowski, A., Medlin, E., Cournia, N., Murphy, H., Gramopadhye, A., Nair, S., Vorah, J., & Melloy, B. (2002). 3-D eye movement analysis. Behavior Research Methods, Instruments, & Computers, 34, 573-591.
Ikits, M., Brederson, J. D., Hansen, C. D., & Hollerbach, J. M. (2001, March). An improved calibration framework for electromagnetic tracking devices. Paper presented at the Virtual Reality 2001 Proceedings, IEEE, Yokohama.
Janin, A., Mizell, D., & Caudell, T. (1993, September). Calibration of head-mounted displays for augmented reality applications. Paper presented at the Virtual Reality Annual International Symposium, Seattle.
Kindratenko, V. (1999). Calibration of electromagnetic tracking devices. Virtual Reality: Research, Development, & Applications, 4, 139-150.
Kindratenko, V. (2000). A survey of electromagnetic position tracker calibration techniques. Virtual Reality: Research, Development, & Applications, 5, 169-182.
King, P. (1995). Integration of helmet-mounted displays into tactical aircraft. Proceedings of the Society for Information Display, 26, 663-668.
Kuipers, J. B. (1999). Quaternions and rotation sequences: A primer with applications to orbits, aerospace, and virtual reality. Princeton, NJ: Princeton University Press.
Leigh, R. J., & Zee, D. S. (1983). The neurology of eye movements. Philadelphia: Davis.
Livingston, M. A., & State, A. (1997). Magnetic tracker calibration for improved augmented reality registration. Presence: Teleoperators & Virtual Environments, 6, 532-546.
Nixon, M. A., McCallum, B. C., Fright, W. R., & Price, N. B. (1998). The effects of metals and interfering fields on electromagnetic trackers. Presence, 7, 204-218.
Southard, D. A. (1995). Viewing model for virtual environment displays. Journal of Electronic Imaging, 4, 413-420.
Young, L. R., & Sheena, D. (1975). Survey of eye movement recording methods. Behavior Research Methods & Instrumentation, 7, 397-429.

NOTES

1. This position is used in projected virtual reality displays to determine how to position the virtual camera when generating an image of a scene.

2. Even if the tracker is not aligned exactly with the case of the transmitter, the correction polynomials (Equations 1 and 2) we used include terms to account for such an offset. These offsets are contained in the constant (i = 0) and linear (i = 1) terms of the polynomials.

3. Two linear systems were constructed from 192 six-variable measurements. Each of the two systems contained 192 × 3 = 576 knowns and 105 unknowns.

4. It should be noted that the calibration procedure described here does not completely constrain the model parameters used. Since no measurement of the orientation of the point of gaze on the screen, or of the distance from eye to screen during a sighting, is made, neither of these can be predicted by the model. As a result, the best-fitting model parameters describing T_{S←E} may converge on any one of a line of solutions corresponding to possible locations of the eye along the LoPG. Each of these eye locations predicts the same LoPG location on the screen. It should be possible to compute the correct eye position with a calibration procedure that includes eye-in-head rotation.

5. For this reason, we have chosen to report errors in inches along the screen surface, since angular measures of error are more sensitive to the location of the eye along the LoPG. Where angular error is reported, we estimated the eye–screen distance to be the distance between the magnetic sensor and the screen.


ARCHIVED MATERIALS

The following materials associated with this article are retrievable from the Psychonomic Society's Norms, Stimuli, and Data archive, http://www.psychonomic.org/archive.

To access the above files or links, search the archive for this article using the journal (Behavior Research Methods, Instruments, & Computers), the first author's name (Barabas), and the publication year (2004).

FILE: Barabas-BRMIC-2004.zip
DESCRIPTION: The compressed archive file contains seven files. The files are:

README.txt, containing a description of the scripts and data included in the archive (a 3K text file). This file can be renamed to readme.m and executed as a Matlab script to reproduce some of the results reported in this manuscript.

barabas2004LoPGmat a Matlab workspace file containing mag-netic tracker data collected in the verification experiments described inthis manuscript (a 32K binary file)

findcorrectioncoeffsm a Matlab function for finding the distortioncompensation coefficients for Equations 1 and 2 (a 5K Matlab M-file)

correctdistortionm a Matlab function for performing distortioncompensation on magnetic tracker data using Equations 1 and 2 (a 4KMatlab M-file)

findmodelparamsm a Matlab function for finding the parameters ofthe LoPG model from data collected during a series of directed gazes (a3K Matlab M-file)

intersecterrorm a Matlab function for finding gaze point error for aset of directed gazes (a 5K Matlab M-file)

trackereulerstomtx4m a Matlab function that computes a transfor-mation matrix representation of a relationship between coordinateframes via Equation 4 (a 3K Matlab M-file)

AUTHORrsquoS WEB SITE httpwwweriharvardedufacultypeliindexhtml

AUTHORrsquoS E-MAIL ADDRESS barabasmitedu

(Manuscript received July 18 2003revision accepted for publication July 23 2004)

TRACKING THE LINE OF PRIMARY GAZE 759

In this article, we first will describe our algorithms for compensating for magnetic distortion and for finding the LoPG from compensated tracker measurements. Then we will describe our calibration methodology for finding parameters for the above procedures. Finally, we will describe and present the results of two experiments used to validate our method.

METHOD

Apparatus

The LoPG tracking system we used to validate our tracking algorithm and calibration techniques was built as a part of our walking simulator. It consisted of the following.

Projection screen. A 67 x 50 in rear projection screen was used for presentation of calibration targets. The bottom of the screen was elevated 30 in above our walking platform (see Figure 1) to place its center near eye level for walking participants.

Magnetic tracker. An Ascension Flock-of-Birds magnetic tracker with an extended range transmitter (Ascension Technology Corporation, Burlington, VT) was used to monitor head position. This system reports location and orientation of a small magnetic sensor in relation to a transmitter. Each measurement consists of six variables: three representing location in three dimensions and three Euler angles representing orientation. The tracker transmitter was mounted on a 48-in-tall wood stand and was placed 52 in to the right and 36 in in front of the projection screen center. The tracker sensor was mounted on adjustable headgear worn by the participant (Figure 2). The headgear also carried the head-mounted sight.

Head-mounted sight. A head-mounted sight was used to allow the participants to consistently direct their LoPG at projected screen targets. The sight was a 1/8-in hollow square mark on a clear plastic panel, which was in turn mounted on a 10-in lightweight brass boom extending from the headgear.

Computer. A Pentium computer running Microsoft Windows 2000, Matlab 6.5 (The MathWorks, Inc., Natick, MA), and custom software was used to generate screen images of calibration targets, to control and record data from the magnetic tracker, and to perform calibration and gaze computations. Calibration software was written using Matlab and is available for download from the Psychonomic Society Web archive.

LoPG Tracking Algorithm

We developed an algorithm to transform raw magnetic tracker measurements into experimental variables of interest for behavior research. The algorithm consists of three steps: distortion compensation, geometric transformation, and computation of the point of gaze. Distortion compensation consists of correcting raw magnetic tracker readings to compensate for the spatial distortions that are characteristic of these devices. In this step we attempt to compute the true location and orientation of the magnetic sensor from its reported location and orientation. In the second step (geometric transformation), distortion-compensated measurements are used to find the LoPG relative to the projection screen. Finally, the LoPG is used to compute the point on the screen aligned with the sighting mark. We call this on-screen location the point of gaze. The first and second steps of this algorithm rely on parameters that are obtained through a pair of calibration processes, described below in the Calibration section.

Distortion compensation. Examination of readings from our magnetic tracker confirmed other research (Bryson, 1992) that has shown that distortion in both the location and the orientation components of measurements varied dramatically depending on location of the magnetic sensor. To account for these distortions, we use a compensation function that adds a correction factor to measurements. This correction depends on the location portion of magnetic sensor readings. Specifically, to compensate for distortion, the translational components of each raw measurement from the magnetic tracker ($x_o$, $y_o$, and $z_o$, representing the location in three dimensions of the magnetic sensor relative to the transmitter) are transformed with a polynomial distortion correction function (Kindratenko, 1999). The correction consists of three degree-n polynomials in three variables. Each of these three polynomials contains terms consisting of the product of $x_o$, $y_o$, and $z_o$ raised to powers in all combinations where the sum of the exponents is n or fewer:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = \begin{bmatrix} x_o \\ y_o \\ z_o \end{bmatrix} + \sum_{i=0}^{n}\sum_{j=0}^{i}\sum_{k=0}^{i-j} \begin{bmatrix} a_{ijk} \\ b_{ijk} \\ c_{ijk} \end{bmatrix} x_o^{\,j}\, y_o^{\,k}\, z_o^{\,i-j-k} \qquad (1)$$

where $a_{ijk}$, $b_{ijk}$, and $c_{ijk}$ are the $t = (n+1)(n+2)(n+3)/6$ polynomial coefficients (obtained in the calibration process), and $x_c$, $y_c$, and $z_c$ are the coordinates of the corrected measurement. Orientation of the sensor, reported by the magnetic tracker as Euler angles $\theta_o$, $\phi_o$, and $\psi_o$, is corrected using a similar set of polynomials:

$$\begin{bmatrix} \theta_c \\ \phi_c \\ \psi_c \end{bmatrix} = \begin{bmatrix} \theta_o \\ \phi_o \\ \psi_o \end{bmatrix} + \sum_{i=0}^{n}\sum_{j=0}^{i}\sum_{k=0}^{i-j} \begin{bmatrix} d_{ijk} \\ e_{ijk} \\ f_{ijk} \end{bmatrix} x_o^{\,j}\, y_o^{\,k}\, z_o^{\,i-j-k} \qquad (2)$$

where $d_{ijk}$, $e_{ijk}$, and $f_{ijk}$ are $t$ polynomial coefficients and $\theta_c$, $\phi_c$, and $\psi_c$ are corrected orientation angles. Correcting for distortion in this way assumes that the distortion in orientation measurements depends only on the location of the magnetic sensor and not at all on orientation. Although distortion in orientation measurements does indeed vary with sensor orientation (Livingston & State, 1997), this form of distortion compensation is still effective if the sensor remains in the same relative orientation in both calibration and experimental use. Polynomial coefficients in Equations 1 and 2 correspond to the shape of the specific distortion in a given experimental configuration and can be computed using the calibration procedure described below. In agreement with the findings of Kindratenko (2000), we found that a polynomial of degree n = 4 modeled the shape of the distortion well. A fourth-degree polynomial required t = 35 coefficients for each tracker variable (210 in total).
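As an illustration of how a correction of this form can be applied, the following Matlab sketch evaluates Equation 1 for a single raw location reading. It is a minimal example with our own variable names, not the archived correctdistortion.m; it assumes the coefficient matrix A is ordered the same way the monomials are generated in the loop.

    % Apply a degree-n polynomial correction (Equation 1) to one raw
    % location reading po = [xo yo zo]. A is a t-by-3 matrix of coefficients
    % [a_ijk b_ijk c_ijk], ordered to match the loop below.
    function pc = correct_location(po, A, n)
        xo = po(1); yo = po(2); zo = po(3);
        terms = zeros(size(A, 1), 1);
        row = 0;
        for i = 0:n
            for j = 0:i
                for k = 0:(i - j)
                    row = row + 1;
                    terms(row) = xo^j * yo^k * zo^(i - j - k);  % monomial for this term
                end
            end
        end
        pc = po(:) + A' * terms;   % corrected location [xc; yc; zc]
    end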

Geometric transformation. The second computational step of the LoPG tracking algorithm transforms the distortion-compensated magnetic tracker readings to locate the LoPG in relation to the screen. This transformation employs a geometric model of the spatial relationship between the eye and the magnetic sensor and of the relationship between the screen and the magnetic transmitter (Figure 3). The set of 12 model parameters computed in the geometric calibration procedure (described below) characterizes these two spatial relationships.

The model takes each of the following locations within the tracking environment to be the origin of a three-dimensional coordinate frame: the upper left corner of the screen (coordinate frame O), the measurement origin of the magnetic transmitter (B), the measurement center of the magnetic sensor mounted on the head (S), and the center of rotation of the eye (E). The x, y, and z axes of each coordinate frame are oriented as shown in Figure 3.

We represent the spatial relationships between pairs of these coordinate frames as 4 x 4 homogeneous transformation matrices. Each such matrix can describe the relative locations and orientations of two coordinate frames and allows measurements in one frame to be transformed to the other by matrix multiplication. For each such matrix

$$T_{b \leftarrow a} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & x_{ba} \\ r_{21} & r_{22} & r_{23} & y_{ba} \\ r_{31} & r_{32} & r_{33} & z_{ba} \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad (3)$$

representing the transformation from some frame a to some frame b, $x_{ba}$, $y_{ba}$, and $z_{ba}$ describe the three-dimensional location of the origin of a in the coordinate system of b. The r variables are the lengths of the projections of unit vectors along each of the three coordinate axes of frame a onto the axes of frame b. These projections form the rotational component of the transformation.

Figure 3. Geometric model for finding the line of primary gaze by combining (via matrix multiplication) the spatial relationship measured by the magnetic tracker ($T_{B \leftarrow S}$, corrected for magnetic distortion) with those derived through geometric calibration ($T_{O \leftarrow B}$ and $T_{S \leftarrow E}$). B is the tracker transmitter, S is the tracker sensor, E is the eye, and O is the projection screen.


In addition to the matrix representation of the relationship between two coordinate frames, it is often convenient to use a translation and Euler angle representation. In this way, a transformation can be described compactly with 6 variables instead of the 12 used in the matrix form. The three coordinates of translation and the three Euler angles that specify a transformation $M_{b \leftarrow a} = (x_{ba}, y_{ba}, z_{ba}, \theta_{ba}, \phi_{ba}, \psi_{ba})$ can be converted to matrix form by a series of three rotations and a displacement, $T_{b \leftarrow a}$, as shown in Equation 4 (Kuipers, 1999):

$$T_{b \leftarrow a} = \begin{bmatrix} 1 & 0 & 0 & x_{ba} \\ 0 & 1 & 0 & y_{ba} \\ 0 & 0 & 1 & z_{ba} \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} \cos\theta_{ba} & -\sin\theta_{ba} & 0 & 0 \\ \sin\theta_{ba} & \cos\theta_{ba} & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} \cos\phi_{ba} & 0 & \sin\phi_{ba} & 0 \\ 0 & 1 & 0 & 0 \\ -\sin\phi_{ba} & 0 & \cos\phi_{ba} & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & \cos\psi_{ba} & -\sin\psi_{ba} & 0 \\ 0 & \sin\psi_{ba} & \cos\psi_{ba} & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad (4)$$
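The conversion of Equation 4 can be written as a short Matlab function. The sketch below follows the rotation order and the sign conventions used in our reconstruction of Equation 4 above; it is illustrative only and is not the archived trackereulerstomtx4.m.

    % Build the homogeneous transform T (Equation 4) from a translation
    % [x y z] and Euler angles theta, phi, psi (azimuth, elevation, roll).
    function T = euler_to_mtx4(x, y, z, theta, phi, psi)
        Tr = [1 0 0 x; 0 1 0 y; 0 0 1 z; 0 0 0 1];                              % displacement
        Rz = [cos(theta) -sin(theta) 0 0; sin(theta) cos(theta) 0 0; 0 0 1 0; 0 0 0 1];
        Ry = [cos(phi) 0 sin(phi) 0; 0 1 0 0; -sin(phi) 0 cos(phi) 0; 0 0 0 1];
        Rx = [1 0 0 0; 0 cos(psi) -sin(psi) 0; 0 sin(psi) cos(psi) 0; 0 0 0 1];
        T = Tr * Rz * Ry * Rx;                                                   % Equation 4
    end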

Using these conversions, the (corrected) measurements provided by the magnetic tracker can be used to derive the other spatial relationships described in Figure 3. The computations of the various transformations are described next, followed by a description of the calibration process.

Since the LoPG lies along the x-axis of the eye frame E, to find the LoPG relative to the screen we find the location and orientation of this frame with respect to the screen frame O. To find this relationship, we compose the matrix representations of the spatial relationships measured by the magnetic tracker with matrices computed through calibration. The position of the magnetic transmitter with respect to the screen, $T_{O \leftarrow B}$ (transformation from transmitter frame to screen frame), and that of the eye with respect to the sensor, $T_{S \leftarrow E}$, remain fixed for the duration of an experiment. The transformation $T_{B \leftarrow S}$ is constructed using Equation 4 from the six variables $M_{B \leftarrow S} = (x_c, y_c, z_c, \theta_c, \phi_c, \psi_c)$ resulting from the distortion compensation step (Equations 1 and 2). The coordinates $x_c$, $y_c$, and $z_c$ represent the corrected three-dimensional displacement between the origins of B and S, while $\theta_c$, $\phi_c$, and $\psi_c$ are the corrected Euler angles describing the orientation of S in the coordinate system of B.

Since composite transformations can be represented as the product of transformation matrices, we can find the location and orientation of the eye relative to the screen, $T_{O \leftarrow E}$, as follows:

$$T_{O \leftarrow E} = T_{O \leftarrow B}\, T_{B \leftarrow S}\, T_{S \leftarrow E} \qquad (5)$$

Once $T_{O \leftarrow E}$ is computed, we can find both the eye location with respect to the screen (simply the last column of $T_{O \leftarrow E}$; see Note 1) and the point of intersection between the LoPG and the screen.

Point of gaze computation. The point on the screen that intersects the participant's LoPG can be found from $T_{O \leftarrow E}$, the result of the geometric transformation portion of the gaze-tracking algorithm. In our model, the LoPG falls along the x-axis of the eye coordinate frame E. We solve for a point that is on the x-axis of the eye frame and also falls in the x-y plane of the screen coordinate frame O, the surface of the screen. This intersection can be formulated as a linear system consisting of a point on the x-axis of frame E, $(x_o, 0, 0)$, transformed by $T_{O \leftarrow E}$ and set equal to a point in the x-y plane of frame O, $(Int_x, Int_y, 0)$:

$$T_{O \leftarrow E} \begin{bmatrix} x_o \\ 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} Int_x \\ Int_y \\ 0 \\ 1 \end{bmatrix} \qquad (6)$$

Solving this system for $Int_x$ and $Int_y$ gives us the intersection between the LoPG and the plane of the screen, expressed in the screen coordinate frame.
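For illustration, the composition of Equation 5 and the intersection of Equation 6 can be computed in a few lines of Matlab. The sketch assumes that T_OB, T_BS, and T_SE are already available as 4 x 4 homogeneous matrices:

    % Compose the eye-to-screen transform (Equation 5) and intersect the
    % LoPG (the x-axis of frame E) with the screen plane z = 0 of frame O
    % (Equation 6).
    T_OE = T_OB * T_BS * T_SE;            % Equation 5
    eye_in_O = T_OE(1:3, 4);              % eye location in the screen frame
    origin = T_OE * [0; 0; 0; 1];         % a point on the LoPG (the eye)
    along  = T_OE * [1; 0; 0; 0];         % LoPG direction (eye-frame x-axis)
    xo  = -origin(3) / along(3);          % distance along the LoPG to the plane z = 0
    Int = origin(1:2) + xo * along(1:2);  % [Int_x; Int_y], the point of gaze on the screen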

Calibration

Parameters used in the LoPG tracking algorithm are computed via two calibration steps. First, the distortion in magnetic tracker readings is measured and coefficients for the compensation polynomials (Equations 1 and 2) are computed. This distortion compensation calibration process was carried out when our tracking system was installed. Computed coefficients can be reused, assuming that there have been no changes to the relative placement of the tracker transmitter and the distortion-causing (large metal) objects. The second, geometric calibration step computes the spatial relationship between the magnetic tracker sensor and the eye, along with the relationship between the tracker transmitter and the screen image. This second step is performed once for each participant before an experimental run, since an adjustment of the headgear or of the image projected on the screen alters the spatial relationships measured in this calibration.

Distortion compensation calibration. To compute coefficients $a_{ijk}$, $b_{ijk}$, $c_{ijk}$, $d_{ijk}$, $e_{ijk}$, and $f_{ijk}$ of the distortion compensation polynomials (Equations 1 and 2), we compare a set of tracker readings taken over a grid of known points within the tracking volume to the hand-measured locations of those points. The differences between the measured and the reported locations form a set of vectors mapping points in reported space to actual physical locations (these difference vectors were small near the transmitter and became large, see Figure 5, near the outer third of the nominal volume of our tracker's range). We then solve for the set of polynomial coefficients that, when used in Equation 1, minimizes the sum of the squared lengths of these error vectors. An identical procedure is performed on orientation data, finding coefficients that minimize the sum of the squared errors in Euler angles reported by the tracker.
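A minimal Matlab sketch of this least-squares fit for the location coefficients is shown below. It assumes two m x 3 arrays, reported and actual, holding the raw tracker locations and the corresponding pegboard-measured locations; it is not the archived findcorrectioncoeffs.m, and the monomial ordering must match whatever code later applies the correction.

    % Fit location-correction coefficients (Equation 1) by linear least squares.
    n = 4;                                   % polynomial degree
    m = size(reported, 1);
    t = (n + 1) * (n + 2) * (n + 3) / 6;     % 35 terms for n = 4
    M = zeros(m, t);                         % design matrix of monomials
    for r = 1:m
        col = 0;
        for i = 0:n
            for j = 0:i
                for k = 0:(i - j)
                    col = col + 1;
                    M(r, col) = reported(r,1)^j * reported(r,2)^k * reported(r,3)^(i - j - k);
                end
            end
        end
    end
    A = M \ (actual - reported);             % minimizes squared length of the error vectors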



In our walking simulator environment, we constructed a system of pegboards and stands to provide a set of known locations for placement of the magnetic sensor during this procedure. The pegboard system illustrated in Figure 4 was constructed from 24 x 48 in sheets of predrilled 0.25-in-thick pegboard mounted on a wooden frame. Screws used in the construction of the frame were replaced with glue, since we found them to be causing small local distortions in magnetic tracker readings. The sensor stands were constructed from sections of 4-in-diameter PVC pipe with flat pegboard caps on each end. Each stand held the magnetic tracker sensor rigidly on one end cap and had pegs for attachment to the pegboard on the other. The sensor was positioned so that, when the stands were engaged with the pegboard, the sensor's x-axis (see Figure 4) pointed toward the screen and its z-axis pointed toward the floor. In this orientation, the magnetic tracker reported azimuth, elevation, and roll of the sensor ($\theta_o$, $\phi_o$, and $\psi_o$) to all be near zero (these angles deviated from zero by a few degrees due to distortion). Several pipe lengths were used for measurements at varying heights (0, 6, 12, and 18 in) above the pegboard plane. Other devices for placing a sensor at known locations have also been developed, including a fixture made entirely of Plexiglas (Caldwell et al., 2000) and an optical tracking system (Ikits, Brederson, Hansen, & Hollerbach, 2001).

The pegboard frame was placed in our tracking environment and was mechanically squared with the case of the magnetic transmitter. The location of the pegboard grid with respect to the magnetic transmitter was then measured (Note 2). Readings from the magnetic tracker were taken over both a sparse three-dimensional grid of 64 locations spanning 45 x 52 x 18 in (see Figure 5) and a denser, smaller grid of 128 locations covering the central 32 x 40 x 18 in of the same space. The magnetic tracker readings from both grids were combined to form the data set of reported locations and orientations. Using Matlab, we performed linear least-squares fits on the differences between the reported and pegboard-measured data sets to find the 210 coefficients for the distortion correction polynomials (Note 3). An illustration of the magnitude of location distortion for our tracker is provided in Figure 5.

Geometric calibration. The second calibration step computes the geometric model parameters describing the spatial relationships $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$. These parameters, corresponding to the alignment of the screen with the tracker transmitter ($T_{O \leftarrow B}$) and the position of the magnetic tracker sensor relative to the eye ($T_{S \leftarrow E}$), are difficult to measure directly, and $T_{S \leftarrow E}$ changes from participant to participant or whenever the headgear is adjusted. For these reasons, we developed a calibration procedure to quickly ascertain these model parameters.

Geometric calibration of the LoPG-tracking system is performed by taking magnetic tracker recordings while participants perform a series of constrained gazes. For each gaze, the participant, using his or her left eye, aligns the head-mounted sight with a target dot projected on the screen. The participant repeats this procedure from a series of standing locations within the desired usage space.

Figure 4. Pegboard apparatus for measuring distortion in magnetic tracker measurements. Two such pegboard panels were mounted on a wooden frame to calibrate a volume measuring 45 x 52 x 18 in. Stands of various heights were placed at different locations across the base pegboard to provide wide coverage of the volume.


After a number of such gazes, a pair of $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$ parameter sets is computed by the Gauss-Newton nonlinear least-squares fitting method, as will be described in the next section.

Geometric calibration algorithm. The fitting procedure seeks the set of model parameters that, when combined with the geometric model in our LoPG-tracking algorithm, best predict on-screen points of gaze from distortion-compensated magnetic tracker readings taken over a set of directed gazes. Our model computes Int, the two-dimensional on-screen point of gaze, expressed as

$$Int = F(P, M_{B \leftarrow S}) \qquad (7)$$

where F is the function that computes point of gaze from a given (corrected) magnetic tracker reading and a set of model parameters (Equations 5 and 6). P is the set of 12 model parameters characterizing, via Equation 4, the spatial relationships $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$ (the 12 parameters describe, for each of the two transformations, a three-dimensional displacement between coordinate frames and three Euler angles describing the relative orientations of the coordinate frames). $M_{B \leftarrow S}$ is the magnetic tracker measurement (three distances and three angles) corrected for distortion by Equations 1 and 2. We assess the quality of the fit found by the geometric calibration procedure by the size of the average error across all gazes in a calibration. This mean gaze point error is expressed as the sum, across all calibration gazes, of the distances along the screen between the actual calibration targets and the model-predicted points of gaze. The fitting procedure seeks model parameters P that minimize this sum, given a set of magnetic tracker measurements $M_{B \leftarrow S}$, a set of corresponding screen intersections Int, and an initial "guess" for P. Starting with the given guess for P, the fitting procedure iteratively adjusts these parameters until a local minimum in mean gaze point error is found (Note 4). Minimization is performed in Matlab using the nlinfit function (from the Statistics Toolbox, an implementation of the Gauss-Newton fitting method). In our implementation, this technique finds a P corresponding to a given set of 20 measurements of Int and $M_{B \leftarrow S}$ in a few seconds on a 700-MHz PC.
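Setting up this fit with nlinfit can look roughly like the Matlab sketch below. Here predict_gaze is a hypothetical helper (not the archived findmodelparams.m) that applies Equations 4-6 to every row of the corrected measurements M and returns an N x 2 matrix of predicted screen intersections; targets holds the corresponding on-screen target locations, and P0 is the hand-measured initial guess.

    % Fit the 12 geometric model parameters P (6 for T_OB, 6 for T_SE) so that
    % predicted points of gaze match the on-screen targets.
    model = @(P, M) reshape(predict_gaze(P, M), [], 1);   % stack x and y into one vector
    P_fit = nlinfit(M, reshape(targets, [], 1), model, P0);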

Experimental Validation

To test the calibration and LoPG-tracking techniques presented in this article, we first calibrated the distortion

Figure 5. Distortion measured in performing distortion compensation calibration of the magnetic tracker. The plot contrasts the actual shape of the measured space with the shape reported by the tracker due to distortion, in the coordinate system of the magnetic tracker transmitter (axis units are inches); the locations of the projection screen and the tracker transmitter are indicated. Dark bars on the axes indicate the ranges of physical locations sampled in the calibration procedure. Shade of the surface indicates the magnitude of distortion in location measurements from the magnetic tracker; the darkest areas indicate distortion in location measurement of over 13 in.


compensation system as described above and then performed two target-sighting experiments. In the first experiment, we performed the geometric calibration procedure using the beam of a laser pointer to represent the line of gaze of the human participant. Doing so allowed the simulated LoPG to be precisely aligned with calibration targets projected on the screen and allowed testing without the variability of human participants' ability to maintain head position. To fix the relative placement of the laser and sensor, both were attached to a small wooden block. The sensor was placed on the block with its x-axis approximately parallel to and 2 in above the laser beam. This alignment enabled us to easily measure the actual spatial relationship between the sensor and the laser for validation of the parameters found by the geometric calibration procedure. (We found it much more difficult to measure the equivalent relationship, $T_{S \leftarrow E}$, on a human participant.) The wood block holding the sensor and laser was fastened to an aluminum and plastic tripod raised to the approximate height of a human participant's head. To gather data for testing the geometric calibration procedure, the tripod was placed at 10 different locations. These locations were chosen within the range of possible standing positions of our study participants (an area about 36 x 36 in). For each of the 10 tripod locations, readings from the magnetic tracker were taken, panning and tilting the tripod head so that the laser beam fell on each calibration target. Four of the targets were just inside the corners of the screen, one was placed in the center, and five other locations were arbitrarily chosen. The targets remained in the same locations throughout each experiment. Due to the distance between the sensor and the pivot point of the tripod head, for a given tripod location the actual magnetic sensor locations varied by up to 8 in as the tripod head was tilted at different angles to align the laser.

In a second experiment, a similar set of 100 directed gazes was performed by a human participant. The participant used the head-mounted sight (Figure 2) to align his LoPG with each of the 10 targets from each of 10 standing locations. The 5 calibration targets that were arbitrarily placed in the first experiment were in different locations for calibration of the human participant.

Data from both experiments were independently used to calibrate the LoPG-tracking system, and the benefits of the distortion compensation and geometric calibration procedures were examined. For the distortion compensation step, we used polynomial coefficients computed from the 192 pegboard locations as described above. In all cases, we used as an initial guess for the geometric calibration procedure a set of hand-measured parameters that approximated the actual geometry of the system.

RESULTS

For each of the two experiments, collected data consisted of a set of 100 six-variable tracker readings (each reading taken when the laser/LoPG was aligned to fall at the center of an on-screen calibration target) as well as the 100 two-dimensional screen locations of the corresponding calibration targets (10 repetitions of 10 targets). Location and orientation data from the magnetic tracker were passed through the distortion compensation functions (Equations 1 and 2), yielding a second set of 100 corrected tracker readings for each experiment. We used subsets of this body of distortion-corrected data to derive the transformation matrices $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$ using the geometric calibration procedure described above and tested these fits against the remaining data. We compared known screen-target locations to point-of-gaze computations made with and without distortion compensation or geometric calibration. We also investigated the behavior of the error as a function of the number of included calibration points by varying the size of the calibration data sets used to compute $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$.

Accuracy of Predicted Point of Gaze

Using the distortion-corrected location and orientation measurements from the entire set of 100 laser-target alignments from the first experiment and the geometric calibration procedure, the $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$ matrices were computed. Equations 5 and 6 were then used to predict the point of gaze for each of the 100 laser-target alignments, and mean gaze point error (mean distance along the screen between the calibration target and the point of gaze found with the LoPG-tracking algorithm) was computed. For the data set from the first experiment, mean error was 1.50 in (Figure 6D). Error can also be described as the angle between the screen target and the computed point of gaze when viewed from the location of the magnetic sensor. Measured in this way, mean angular error was 1.55 deg. For reasons described in the Discussion section, we will report the remainder of our mean error results in inches. At our screen-eye distances, 1 deg corresponds approximately to 1 in of error.

To examine the effect of distortion compensation on mean gaze point error, we also performed the geometric calibration procedure using raw magnetic tracker measurements not processed by the distortion correction polynomials. As seen in Figure 6B, without distortion compensation mean gaze point error was significantly larger than with compensation (Wilcoxon signed ranks test, Z = 3.66, p < .001), at 1.93 in when calibrated using data from the same experiment (an increase of 29%). Without distortion compensation, clear outliers from the clusters of points of gaze can be seen for most of the screen targets in Figure 6B. These outlying points of gaze were all from the tripod location most distant from the magnetic transmitter, where distortion was largest. Distortion compensation provided the most benefit for this tripod location.

To examine the effect of the geometric calibration procedure on accuracy of the predicted LoPG, we also computed mean gaze point error across the collected data set using only a hand-measured estimate of $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$. Distortion correction was included, but no geometric calibration was performed. As one might expect, without calibration, errors were larger, and this difference was significant (Wilcoxon signed ranks test, Z = 8.09, p < .001). Mean gaze point error using an uncalibrated model was 3.96 in, as can be seen in Figure 6C.

We also examined the ability of the geometric model to compute points of gaze without both distortion compensation and geometric calibration. The hand-measured $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$ and raw magnetic tracker readings were used in Equations 5 and 6. Mean gaze point error was 5.66 in (Figure 6A). The addition of either geometric calibration or distortion compensation yielded a significant reduction in error (Wilcoxon signed ranks test, Z = 6.1, p < .001).

Conducting the same analysis for the data from the second experiment, we found mean gaze point error to be lower than that in the first experiment when geometric calibration was performed. When both distortion compensation and geometric calibration were used, mean error was 0.72 in. When geometric calibration was performed on data not corrected for distortion, mean error was 1.06 in (Wilcoxon signed rank test, Z = 5.20, p < .001). Reasons for the reduced errors in the second experiment are proposed in the Discussion section. Without geometric calibration, error was much larger (Wilcoxon signed rank test, Z = 8.1, p < .001), since we were unable to measure $T_{S \leftarrow E}$ with the same accuracy for the second experiment. With and without distortion correction, mean errors were 3.98 and 4.26 in, respectively.

Effect of sensor location on error. Using the raw data (not compensated for distortion) from the first experiment and computing the point of gaze without performing geometric calibration, the farther the sensor was from the transmitter, the worse the error (Spearman correlation, r = .46, p < .001), and that error was manifest as an increasing error in the y-dimension (Spearman correlation, r = .50, p < .001; Figure 7A). When both the distortion correction and the geometric calibration were implemented, the errors were more evenly distributed (Figure 7B), resulting in no significant correlation with distance from the transmitter (Spearman correlation, r = .04, p = .71). There was, however, a significant tendency for the errors to increase toward the screen (i.e., in the x-dimension; Spearman correlation, r = .22, p = .003).

Figure 6. Effects of distortion compensation (panels C and D) and geometric calibration (panels B and D) on computation of points of gaze. Data were collected with a tripod-mounted laser pointer used to simulate the line of primary gaze (LoPG). Circular marks indicate screen locations of calibration targets. Crosses are laser-screen intersections (points of gaze) computed by the LoPG-tracking algorithm. Within each cluster, each cross represents a measurement from a different tripod location. The rectangular outline shows the location of the projection screen. Screen outline and targets appear at the hand-measured location (panels A and C) and the location found by geometric calibration (panels B and D).


This may be because, for tripod locations closer to the screen, an angular error (due to distortion, for example) of the same size results in a larger error distance along the screen.

Short calibration. In order to use our LoPG-tracking system as a tool for gathering data from untrained study participants, it is advantageous to calibrate the system using as few data points as necessary (e.g., to reduce fatigue). To find the number of calibration data points needed to accurately predict the point of gaze, we created calibration sets from subsets of the 100 measurements collected in each experiment. We used these small calibration sets to compute $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$ and then used those computed parameters to calculate screen intersections across the full set of 100 measurements. We then compared these computed intersections with the actual locations of the on-screen targets. We performed this technique for calibration sets of varying sizes across 100 random orderings of the 100 data points collected in each experiment. For the first experiment, we found that after about 20 points the geometric calibration system converges to a point of gaze with a mean gaze point error of less than 2 in, with only small improvements attained by including additional points in the calibration set (Figure 8).
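The subset analysis can be sketched in Matlab as follows; fit_geometry and gaze_errors are hypothetical stand-ins for the geometric calibration and the per-gaze error computation described above, and M and targets are assumed to hold the 100 corrected readings and target locations.

    % For each calibration-set size, fit the model on a random subset and
    % score it against all 100 gazes, repeating over 100 random orderings.
    sizes = 2:50;
    meanErr = zeros(100, numel(sizes));
    for rep = 1:100
        order = randperm(100);                        % one random ordering
        for s = 1:numel(sizes)
            idx = order(1:sizes(s));                  % first sizes(s) gazes
            P = fit_geometry(M(idx,:), targets(idx,:), P0);
            meanErr(rep, s) = mean(gaze_errors(P, M, targets));  % over all 100
        end
    end
    plot(sizes, mean(meanErr, 1));                    % cf. Figures 8 and 9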

For the second experiment, overall gaze point error was lower when calibration was performed with the entire data set. For this data set, 2 in of error could be reached with about 12 data points (Figure 9). Calibrations performed with more than 15 data points provided only a small additional benefit.

In the processing of calibration data from both experiments, geometric calibration occasionally failed completely, producing larger mean gaze point error than if geometric calibration had not been performed at all. These failures occurred with very small calibration sets, when distortion-related errors provided inconsistent constraints for the model. Failures also occurred when sets included targets from only one part of the screen, insufficiently constraining the model. Fitting failures never occurred with sets of 10 points or more and never occurred when distortion compensation was performed. In all the

Figure 7. The mean gaze point error at each of the 10 tripod locations: on the left, using the raw data from the magnetic tracker collected in the first experiment (see Figure 6A), and on the right, using the same data corrected using both the distortion compensation and the geometric calibration (Figure 6D). The tripod and laser pointer were directed toward the projection screen, which is shown as the tall white rectangle on the right of each panel. The tracker transmitter is shown as the black square. Each circle represents one of the tripod locations, with radius equal to the mean gaze point error from that location. Mean gaze point error for each location is also shown with the corresponding circle.


cases in which calibration failed, the addition of one or two calibration points to the set was enough to reverse the failure. In general, better fits were obtained from calibration sets with greater variety in sampled tracker orientations and screen locations.

DISCUSSION

The LoPG-tracking model and calibration techniques presented here succeed at accurately predicting the on-screen point of gaze while participants look through a head-mounted sight. The validation experiments show that, in agreement with Caldwell et al. (2000), the use of polynomial distortion compensation can improve the accuracy of gaze point calculations (Figures 6C and 6D). In addition, benefit was also seen from geometric calibration. The techniques presented for tracking the LoPG provide a foundation for future gaze-tracking systems.

An interesting result of performing calibration both with and without distortion compensation was that, although the point of gaze was predicted more accurately with distortion compensation, the tracking system still performed relatively well in the presence of uncompensated distortion, provided that geometric calibration was performed. In the presence of measurement errors caused by distortion, the minimization procedure gives model parameters that may not correspond well to the actual spatial relationships being modeled but interact with the measurement distortion to produce reasonably accurate screen intersection predictions. If somewhat larger LoPG-tracking errors can be tolerated, one might be able to achieve reasonable results by omitting distortion compensation altogether. Omission of distortion compensation would likely result in much greater error, however, for head positions or orientations outside the range of calibration.

Possible sources of gaze point errors remaining when both geometric calibration and distortion compensation were performed include measurement noise, inaccuracies in magnetic tracker measurements not corrected by the distortion compensation system, inaccuracy of the model parameters found by the geometric calibration system, or some combination of these. Tests of stability in the tracker measurements showed that readings from the tracker taken milliseconds or hours apart agreed; both differ from reference measurements by only a few hundredths of an inch and by a few tenths of a degree. This small measurement noise is clearly not enough to explain the gaze point errors, which are orders of magnitude greater.

Our method makes the simplifying assumption that the distortions in magnetic tracker orientation readings depend on the location of the magnetic sensor only and not on the interaction between sensor location and orientation. Therefore, it is likely that our method did not completely correct for the magnetic distortion. When the magnetic sensor is kept in a similar orientation in both

Figure 8. Error in computed point of gaze derived when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the first experiment (using a laser pointer). The plot shows mean gaze point error across all 100 data points (in inches) as a function of the number of data points in the calibration set, with the mean across all orderings and its 95% confidence interval. Mean gaze point error was reduced to within 2 in (about 2 deg) after about 23 directed gazes.


distortion compensation calibration and actual experimental use, this approach is more likely to produce good results. In the first experiment, our method provided an improvement over omitting orientation distortion compensation entirely, since sensor alignment was similar in calibration and experiment. It is likely, however, that distortion still plays a role in gaze point errors. In the second experiment, we found even lower (than that reported in the Results section) mean gaze point error when magnetic tracker measurements were corrected for location distortion but not for orientation distortion. As the magnetic tracker sensor was rotated on its side when attached to the headgear of the head-mounted sight (Figure 2), it was no longer in the relative orientation needed to see a benefit from our compensation for orientation distortion. Expanding the distortion compensation calibration procedure to measure the sensor at multiple orientations for each grid location sampled could further reduce error due to distortion. This change would increase both the number of data points required and the time needed to perform the initial data collection procedure, but it would not impact per-participant calibration demands.

The geometric calibration algorithm seeks a $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$ yielding a local minimum in gaze point error, starting from the initial guess derived from hand measurements of these parameters. Control experiments were performed to verify that no other, potentially lower, minima could be found near the minima arrived at by calibration. Although the minimization procedure failed to converge for intentionally inconsistent initial guesses, tests conducted using parameters displaced from hand-measured values by several inches in all directions all still converged on the same LoPG. Thus, uncorrected distortion in magnetic tracker measurements was most likely the leading cause of gaze point error.

Comparison between points of gaze computed with (Figure 6D) and without (Figure 6C) the geometric calibration step shows that attempts to hand measure the spatial relationships $T_{O \leftarrow B}$ and $T_{S \leftarrow E}$ (i.e., without geometric calibration) lead to relatively large gaze point errors. Application of the geometric fitting procedure substantially improves the ability of the model to compute the LoPG. Although it is not surprising that better results can be obtained when calibration is performed, it is interesting to note that the parameters resulting from calibration deviated considerably from the hand-measured parameters. In fact, the optimized parameters comprising $T_{S \leftarrow E}$ indicated spatial relationships that were clearly inconsistent with the actual configuration of the laser and magnetic sensor. For example, parameters producing the least mean gaze point error contained values for the distance between the magnetic sensor and the LoPG that were several inches greater than physical measurements indicated. Although there is some imprecision (we estimate about a quarter of an inch) in measuring the straight-line distance between eye and magnetic sensor, this imprecision alone cannot ex-

Figure 9. Error in computed point of gaze derived when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the second experiment (with a human participant). The plot shows mean gaze point error across all 100 data points (in inches) as a function of the number of data points in the calibration set, with the mean across all orderings and its 95% confidence interval. Mean gaze point error was reduced to less than 2 in (about 2 deg) in about 12 directed gazes.


plain the difference. Since calibration performed with data from the human participant seemed less prone to this inconsistency, we suspect that the constraints imposed on the movement of the laser and sensor by the tripod head may be responsible. In positioning the laser to align it with calibration targets, the sensor remained within 47 deg of level, and "roll" of the sensor could not be performed. These constraints led to less variation in the calibration data and, in turn, may have underconstrained the fitting procedure used in geometric calibration.

A limitation of the geometric calibration procedure described here is the inability of the calibration procedure to locate the eye along the LoPG. Although this limitation does not impact the ability of the system to predict points of gaze on the screen, it does impact the veracity of the model parameters. Although the geometric model used specifies the eye as a coordinate frame with an origin located some distance from the magnetic tracker sensor, the geometric calibration procedure constrains only the x-axis of this coordinate frame. This results in an infinite number of possible sets of model parameters that still predict the same point of gaze on the screen (i.e., the screen intersection remains the same even if the eye were to slide forward or back along the LoPG or to rotate about its x-axis). This means that the parameters making up $T_{S \leftarrow E}$ found by the fitting procedure correspond to one possible location of the eye along the LoPG. Control experiments showed that the fitted location of the eye along the LoPG depended on the initial guess used for the fitting procedure, even though these alternate initial guesses still yielded the same points of gaze (Note 5). If the intermediate results of the model are to be used, for example, for computing the location of the eye relative to the screen for placement of a camera in a virtual environment, the inability of this system to precisely locate the eye might be overcome by performing an additional calibration. For example, a calibration similar to the one used in this article, but requiring the participant to place the eye in other, nonprimary positions of gaze, could be used to find the actual eye location. This technique is currently being investigated.

REFERENCES

Allison, R. S., Eizenman, M., & Cheung, B. S. K. (1996). Combined head and eye tracking system for dynamic testing of the vestibular system. IEEE Transactions on Biomedical Engineering, 43, 1073-1082.

Asai, K., Osawa, N., Takahashi, H., & Sugimoto, Y. Y. (2000, March). Eye mark pointer in immersive projection display. Paper presented at the IEEE Virtual Reality 2000 Conference, New Brunswick, NJ.

Atchison, D. A., & Smith, G. (2000). Optics of the human eye. Oxford: Butterworth-Heinemann.

Barabas, J., Giorgi, R. G., Goldstein, R. B., Apfelbaum, H., Woods, R. L., & Peli, E. (2003, May). Wide field 3D gaze tracking system. Paper presented at the Association for Research in Vision and Ophthalmology Meeting, Fort Lauderdale, FL.

Blood, E. (1990). U.S. Patent No. 4,945,305. Washington, DC: U.S. Patent & Trademark Office.

Bryson, S. (1992, February). Measurement and calibration of static distortion of position data from 3D trackers. Paper presented at SPIE Stereoscopic Displays & Applications III, San Jose.

Caldwell, R. T., Berbaum, K. S., & Borah, J. (2000). Correcting errors in eye-position data arising from the distortion of magnetic fields by display devices. Behavior Research Methods, Instruments, & Computers, 32, 572-578.

Day, J. S., Dumas, G. A., & Murdoch, D. J. (1998). Evaluation of a long-range transmitter for use with a magnetic tracking device in motion analysis. Journal of Biomechanics, 31, 957-961.

Duchowski, A., Medlin, E., Cournia, N., Murphy, H., Gramopadhye, A., Nair, S., Vorah, J., & Melloy, B. (2002). 3-D eye movement analysis. Behavior Research Methods, Instruments, & Computers, 34, 573-591.

Ikits, M., Brederson, J. D., Hansen, C. D., & Hollerbach, J. M. (2001, March). An improved calibration framework for electromagnetic tracking devices. Paper presented at the IEEE Virtual Reality 2001 Conference, Yokohama.

Janin, A., Mizell, D., & Caudell, T. (1993, September). Calibration of head-mounted displays for augmented reality applications. Paper presented at the Virtual Reality Annual International Symposium, Seattle.

Kindratenko, V. (1999). Calibration of electromagnetic tracking devices. Virtual Reality: Research, Development & Applications, 4, 139-150.

Kindratenko, V. (2000). A survey of electromagnetic position tracker calibration techniques. Virtual Reality: Research, Development & Applications, 5, 169-182.

King, P. (1995). Integration of helmet-mounted displays into tactical aircraft. Proceedings of the Society for Information Display, 26, 663-668.

Kuipers, J. B. (1999). Quaternions and rotation sequences: A primer with applications to orbits, aerospace, and virtual reality. Princeton, NJ: Princeton University Press.

Leigh, R. J., & Zee, D. S. (1983). The neurology of eye movements. Philadelphia: Davis.

Livingston, M. A., & State, A. (1997). Magnetic tracker calibration for improved augmented reality registration. Presence: Teleoperators & Virtual Environments, 6, 532-546.

Nixon, M. A., McCallum, B. C., Fright, W. R., & Price, N. B. (1998). The effects of metals and interfering fields on electromagnetic trackers. Presence, 7, 204-218.

Southard, D. A. (1995). Viewing model for virtual environment displays. Journal of Electronic Imaging, 4, 413-420.

Young, L. R., & Sheena, D. (1975). Survey of eye movement recording methods. Behavior Research Methods & Instrumentation, 7, 397-429.

NOTES

1. This position is used in projected virtual reality displays to determine how to position the virtual camera when generating an image of a scene.

2. Even if the tracker is not aligned exactly with the case of the transmitter, the correction polynomials (Equations 1 and 2) we used include terms to account for such an offset. These offsets are contained in the constant (i = 0) and linear (i = 1) terms of the polynomials.

3. Two linear systems were constructed from 192 six-variable measurements. Each of the two systems contained 192 x 3 = 576 knowns and 105 unknowns.

4. It should be noted that the calibration procedure described here does not completely constrain the model parameters used. Since no measurement of the orientation on the screen of the point of gaze, or of the distance from eye to screen during a sighting, is made, neither of these can be predicted by the model. As a result, the best-fitting model parameters describing $T_{S \leftarrow E}$ may converge on any one of a line of solutions corresponding to possible locations of the eye along the LoPG. Each of these eye locations predicts the same LoPG location on the screen. It should be possible to compute correct eye position with a calibration procedure that includes eye-in-head rotation.

5. For this reason, we have chosen to report errors in inches along the screen surface, since angular measures of error are more sensitive to the location of the eye along the LoPG. Where angular error is reported, we estimated the eye-to-screen distance to be the distance between the magnetic sensor and the screen.

770 BARABAS ET AL

ARCHIVED MATERIALS

The following materials associated with this article are retrievable from the Psychonomic Society's Norms, Stimuli, and Data archive, http://www.psychonomic.org/archive/.

To access the above files or links, search the archive for this article using the journal (Behavior Research Methods, Instruments, & Computers), the first author's name (Barabas), and the publication year (2004).

FILE: Barabas-BRMIC-2004.zip
DESCRIPTION: The compressed archive file contains seven files. The files are:

README.txt, containing a description of the scripts and data included in the archive (a 3K text file). This file can be renamed to readme.m and executed as a Matlab script to reproduce some of the results reported in this manuscript.

barabas2004LoPG.mat, a Matlab workspace file containing magnetic tracker data collected in the verification experiments described in this manuscript (a 32K binary file).

findcorrectioncoeffs.m, a Matlab function for finding the distortion compensation coefficients for Equations 1 and 2 (a 5K Matlab M-file).

correctdistortion.m, a Matlab function for performing distortion compensation on magnetic tracker data using Equations 1 and 2 (a 4K Matlab M-file).

findmodelparams.m, a Matlab function for finding the parameters of the LoPG model from data collected during a series of directed gazes (a 3K Matlab M-file).

intersecterror.m, a Matlab function for finding gaze point error for a set of directed gazes (a 5K Matlab M-file).

trackereulerstomtx4.m, a Matlab function that computes a transformation matrix representation of a relationship between coordinate frames via Equation 4 (a 3K Matlab M-file).

AUTHOR'S WEB SITE: http://www.eri.harvard.edu/faculty/peli/index.html

AUTHOR'S E-MAIL ADDRESS: barabas@mit.edu

(Manuscript received July 18, 2003; revision accepted for publication July 23, 2004.)

760 BARABAS ET AL

configuration and can be computed using the calibrationprocedure described below In agreement with the find-ings of Kindratenko (2000) we found that a polynomialof degree n 4 modeled the shape of the distortion wellA fourth-degree polynomial required t 35 coefficientsfor each tracker variable (210 in total)

Geometric transformation The second computa-tional step of the LoPG tracking algorithm transformsthe distortion-compensated magnetic tracker readings tolocate LoPG in relation to the screen This transforma-tion employs a geometric model of the spatial relation-ship between the eye and the magnetic sensor and of therelationship between the screen and the magnetic trans-mitter (Figure 3) The set of 12 model parameters com-puted in the geometric calibration procedure (describedbelow) characterizes these two spatial relationships

The model takes each of the following locations withinthe tracking environment to be the origin of a three-dimensional coordinate frame the upper left corner ofthe screen (coordinate frame O) the measurement originof the magnetic transmitter (B) the measurement centerof the magnetic sensor mounted on the head (S) and the

center of rotation of the eye (E) The x y and z axes ofeach coordinate frame are oriented as shown in Figure 3

We represent the spatial relationships between pairs ofthese coordinate frames as 4 4 homogeneous trans-formation matrices Each such matrix can describe therelative locations and orientations of two coordinateframes and allows measurements in one frame to betransformed to the other by matrix multiplication Foreach such matrix

(3)

representing the transformation from some frame a to someframe b xba yba and zba describe the three-dimensionallocation of the origin of a in the coordinate system of bThe r variables are the lengths of the projections of unitvectors along each of the three coordinate axes of framea onto the axes of frame b These projections form therotational component of the transformation

tb auml

=

Egrave

Icirc

IacuteIacuteIacuteIacute

˘

˚

˙˙˙˙

r r r x

r r r y

r r r z

ba

ba

ba

11 12 13

21 22 23

31 32 33

0 0 0 1

Figure 3 Geometric model for finding line of primary gaze by combining (via matrix multiplication) thespatial relationship measured by the magnetic tracker (ttBumlS corrected for magnetic distortion) with thosederived through geometric calibration (ttOumlB and ttSumlE) B is the tracker transmitter S is the tracker sen-sor E is the eye and O is the projection screen

TRACKING THE LINE OF PRIMARY GAZE 761

In addition to the matrix representation of the relation-ship between two coordinate frames it is often conve-nient to use a translation and Euler angle representationIn this way a transformation can be described compactlywith 6 variables instead of the 12 used in the matrix formThe three coordinates of translation and the three Eulerangles that specify a transformation Mbumla (xba yba zbaqba fba yba) can be converted to matrix form by a se-ries of three rotations and a displacement tbumla as shownin Equation 4 at the bottom of the page (Kuipers 1999)

Using these conversions the (corrected) measure-ments provided by the magnetic tracker can be used toderive the other spatial relationships described in Fig-ure 3 The computations of the various transformationsare described next followed by a description of the cal-ibration process

Since the LoPG lies along the x-axis of the eye frameE to find the LoPG relative to the screen we find the lo-cation and orientation of this frame with respect to thescreen frame O To find this relationship we composethe matrix representations of the spatial relationshipsmeasured by the magnetic tracker with matrices com-puted through calibration The position of the magnetictransmitter with respect to the screen tOumlB (transforma-tion from transmitter frame to screen frame) and that ofthe eye with respect to the sensor tSumlE remain fixed forthe duration of an experiment The transformation tBumlSis constructed using Equation 4 from the six variablesMBumlS (xc yc zc qc fc yc) resulting from the distortioncompensation step (Equations 1 and 2) The coordinatesxc yc and zc represent the corrected three-dimensionaldisplacement between the origins of B and S while qcfc and yc are the corrected Euler angles describing theorientation of S in the coordinate system of B

Since composite transformations can be representedas the product of transformation matrices we can findthe location and orientation of the eye relative to thescreen tOumlE as follows

(5)

Once tOumlE is computed we can find both the eye loca-tion with respect to the screen1 (simply the last columnof tOumlE) and the point of intersection between the LoPGand the screen

Point of gaze computation The point on the screenthat intersects the participantrsquos LoPG can be found fromtOumlE the result of the geometric transformation portionof the gaze-tracking algorithm In our model the LoPGfalls along the x-axis of the eye coordinate frame E Wesolve for a point that is on the x-axis of the eye frame and

also falls in the xndashy plane of the screen coordinate frameOmdashthe surface of the screen This intersection can beformulated as a linear system consisting of a point on thex-axis of frame E (xo 0 0) transformed by tOumlE and setequal to a point in the xndashy plane of frame O (Intx Inty 0)

(6)

Solving this system for Intx and Inty gives us the inter-section between the LoPG and the plane of the screenexpressed in the screen coordinate frame

CalibrationParameters used in the LoPG tracking algorithm are

computed via two calibration steps First the distortionin magnetic tracker readings is measured and coeffi-cients for the compensation polynomials (Equations 1and 2) are computed This distortion compensation cal-ibration process was carried out when our tracking sys-tem was installed Computed coefficients can be reusedassuming that there have been no changes to the relativeplacement of the tracker transmitter and the distortion-causing (large metal) objects The second geometric cal-ibration step computes the spatial relationship betweenthe magnetic tracker sensor and the eye along with therelationship between the tracker transmitter and the screenimage This second step is performed once for each par-ticipant before an experimental run since an adjustment ofthe headgear or of the image projected on the screen al-ters the spatial relationships measured in this calibration

Distortion compensation calibration To computecoefficients aijk bijk cijk dijk eijk and fijk of the distortioncompensation polynomials (Equations 1 and 2) we com-pare a set of tracker readings taken over a grid of knownpoints within the tracking volume to the hand-measuredlocations of those points The differences between themeasured and the reported locations form a set of vectorsmapping points in reported space to actual physical lo-cations (these difference vectors were small near thetransmitter and became largemdashsee Figure 5mdashnear theouter third of the nominal volume of our trackerrsquos range)We then solve for the set of polynomial coefficients thatwhen used in Equation 1 minimizes the sum of the squaredlengths of these error vectors An identical procedure isperformed on orientation data finding coefficients thatminimize the sum of the squared errors in Euler anglesreported by the tracker

Equation 4, referenced above, expands a six-variable transformation M_b←a = (x_ba, y_ba, z_ba, θ_ba, φ_ba, ψ_ba) into its 4 × 4 matrix form, a displacement followed by rotations about the z-, y-, and x-axes:

$$
t_{b \leftarrow a} =
\begin{bmatrix}
1 & 0 & 0 & x_{ba}\\
0 & 1 & 0 & y_{ba}\\
0 & 0 & 1 & z_{ba}\\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
\cos\theta_{ba} & -\sin\theta_{ba} & 0 & 0\\
\sin\theta_{ba} & \cos\theta_{ba} & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
\cos\phi_{ba} & 0 & \sin\phi_{ba} & 0\\
0 & 1 & 0 & 0\\
-\sin\phi_{ba} & 0 & \cos\phi_{ba} & 0\\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
1 & 0 & 0 & 0\\
0 & \cos\psi_{ba} & -\sin\psi_{ba} & 0\\
0 & \sin\psi_{ba} & \cos\psi_{ba} & 0\\
0 & 0 & 0 & 1
\end{bmatrix}
\quad (4)
$$


In our walking simulator environment, we constructed a system of pegboards and stands to provide a set of known locations for placement of the magnetic sensor during this procedure. The pegboard system illustrated in Figure 4 was constructed from 24 × 48 in. sheets of predrilled, 0.25-in.-thick pegboard mounted on a wooden frame. Screws used in the construction of the frame were replaced with glue, since we found them to be causing small local distortions in magnetic tracker readings. The sensor stands were constructed from sections of 4-in.-diameter PVC pipe with flat pegboard caps on each end. Each stand held the magnetic tracker sensor rigidly on one end cap and had pegs for attachment to the pegboard on the other. The sensor was positioned so that, when the stands were engaged with the pegboard, the sensor's x-axis (see Figure 4) pointed toward the screen and its z-axis pointed toward the floor. In this orientation, the magnetic tracker reported azimuth, elevation, and roll of the sensor (θ_o, φ_o, and ψ_o) to all be near zero (these angles deviated from zero by a few degrees due to distortion). Several pipe lengths were used for measurements at varying heights (0, 6, 12, and 18 in.) above the pegboard plane. Other devices for placing a sensor at known locations have also been developed, including a fixture made entirely of Plexiglas (Caldwell et al., 2000) and an optical tracking system (Ikits, Brederson, Hansen, & Hollerbach, 2001).

The pegboard frame was placed in our tracking environment and was mechanically squared with the case of the magnetic transmitter. The location of the pegboard grid with respect to the magnetic transmitter was then measured (note 2). Readings from the magnetic tracker were taken over both a sparse three-dimensional grid of 64 locations spanning 45 × 52 × 18 in. (see Figure 5) and a denser, smaller grid of 128 locations covering the central 32 × 40 × 18 in. of the same space. The magnetic tracker readings from both grids were combined to form the data set of reported locations and orientations. Using Matlab, we performed linear least-squares fits on the differences between reported and pegboard-measured data sets to find the 210 coefficients for the distortion correction polynomials (note 3). An illustration of the magnitude of location distortion for our tracker is provided in Figure 5.

Geometric calibration. The second calibration step computes the geometric model parameters describing the spatial relationships t_O←B and t_S←E. These parameters, corresponding to the alignment of the screen with the tracker transmitter (t_O←B) and the position of the magnetic tracker sensor relative to the eye (t_S←E), are difficult to measure directly, and t_S←E changes from participant to participant or whenever the headgear is adjusted. For these reasons, we developed a calibration procedure to quickly ascertain these model parameters.

Geometric calibration of the LoPG-tracking system is performed by taking magnetic tracker recordings while participants perform a series of constrained gazes. For each gaze, the participant, using his or her left eye, aligns the head-mounted sight with a target dot projected on the screen. The participant repeats this procedure from a series of standing locations within the desired usage space.

Figure 4. Pegboard apparatus for measuring distortion in magnetic tracker measurements. Two such pegboard panels were mounted on a wooden frame to calibrate a volume measuring 45 × 52 × 18 in. Stands of various heights were placed at different locations across the base pegboard to provide wide coverage of the volume.


After a number of such gazes, a pair of t_O←B and t_S←E parameters is computed by the Gauss-Newton nonlinear least-squares fitting method, as will be described in the next section.

Geometric calibration algorithm. The fitting procedure seeks the set of model parameters that, when combined with the geometric model in our LoPG-tracking algorithm, best predict on-screen points of gaze from distortion-compensated magnetic tracker readings taken over a set of directed gazes. Our model computes Int, the two-dimensional on-screen point of gaze, expressed as

Int = F(P, M_B←S),    (7)

where F is the function that computes point of gaze from a given (corrected) magnetic tracker reading and a set of model parameters (Equations 5 and 6), P is the set of 12 model parameters characterizing, via Equation 4, the spatial relationships t_O←B and t_S←E (the 12 parameters describe, for each of the two transformations, a three-dimensional displacement between coordinate frames and three Euler angles describing the relative orientations of the coordinate frames), and M_B←S is the magnetic tracker measurement (three distances and three angles) corrected for distortion by Equations 1 and 2. We assess the quality of the fit found by the geometric calibration procedure by the size of the average error across all gazes in a calibration. This mean gaze point error is expressed as the sum across all calibration gazes of the distances along the screen between the actual calibration targets and the model-predicted points of gaze. The fitting procedure seeks model parameters P that minimize this sum, given a set of magnetic tracker measurements M_B←S, a set of corresponding screen intersections Int, and an initial "guess" for P. Starting with the given guess for P, the fitting procedure iteratively adjusts these parameters until a local minimum in mean gaze point error is found (note 4). Minimization is performed in Matlab using the nlinfit function (from the Statistics Toolbox, an implementation of the Gauss-Newton fitting method). In our implementation, this technique finds a P corresponding to a given set of 20 measurements of Int and M_B←S in a few seconds on a 700-MHz PC.
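A minimal sketch of this fit is shown below. It reuses the illustrative euler_to_hmatrix and lopg_screen_intersection functions from the earlier sketches, stacks the predicted intersections into a single vector for nlinfit, and assumes tracker angles have already been converted to radians; findmodelparams.m in the archived materials is the function used in our actual implementation.

    function P = fit_geometric_model(M, Int, P0)
    % Fit the 12 model parameters P = [t_O<-B (6 values); t_S<-E (6 values)]
    % from n corrected tracker readings M (n-by-6) and the corresponding
    % on-screen target locations Int (n-by-2), starting from the guess P0.
    % Requires the Statistics Toolbox for nlinfit.
    P = nlinfit(M, Int(:), @predict_gaze, P0);

    function g = predict_gaze(P, M)
    % Predicted screen intersections for every reading, stacked column-wise
    % ([all x; all y]) to match Int(:).
    T_OB = euler_to_hmatrix(P(1), P(2), P(3), P(4),  P(5),  P(6));
    T_SE = euler_to_hmatrix(P(7), P(8), P(9), P(10), P(11), P(12));
    n = size(M, 1);
    g = zeros(n, 2);
    for i = 1:n
        T_BS = euler_to_hmatrix(M(i,1), M(i,2), M(i,3), M(i,4), M(i,5), M(i,6));
        T_OE = T_OB * T_BS * T_SE;                          % Equation 5
        [g(i,1), g(i,2)] = lopg_screen_intersection(T_OE);  % Equation 6
    end
    g = g(:);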

Experimental Validation

[Figure 5 near here. Surface plot over the x, y, and z coordinates of the magnetic transmitter frame (inches), showing the actual shape of the measured space, the reported shape of the measured space due to distortion, the projection screen, and the magnetic tracker transmitter.]

Figure 5. Distortion measured in performing distortion compensation calibration of the magnetic tracker. Axis units are inches in the coordinate system of the magnetic tracker transmitter. Dark bars on axes indicate the ranges of physical locations sampled in the calibration procedure. Shade of surface indicates magnitude of distortion in location measurements from the magnetic tracker. The darkest areas indicate distortion in location measurement of over 13 in.


To test the calibration and LoPG-tracking techniques presented in this article, we first calibrated the distortion compensation system as described above and then performed two target-sighting experiments. In the first experiment, we performed the geometric calibration procedure using the beam of a laser pointer to represent the line of gaze of the human participant. Doing so allowed the simulated LoPG to be precisely aligned with calibration targets projected on the screen and allowed testing without the variability of human participants' ability to maintain head position. To fix the relative placement of the laser and sensor, both were attached to a small wooden block. The sensor was placed on the block with its x-axis approximately parallel to, and 2 in. above, the laser beam. This alignment enabled us to easily measure the actual spatial relationship between the sensor and the laser for validation of the parameters found by the geometric calibration procedure. (We found it much more difficult to measure the equivalent relationship, t_S←E, on a human participant.) The wood block holding the sensor and laser was fastened to an aluminum and plastic tripod raised to the approximate height of a human participant's head. To gather data for testing the geometric calibration procedure, the tripod was placed at 10 different locations. These locations were chosen within the range of possible standing positions of our study participants (an area about 36 × 36 in.). For each of the 10 tripod locations, readings from the magnetic tracker were taken, panning and tilting the tripod head so that the laser beam fell on each calibration target. Four of the targets were just inside the corners of the screen, one was placed in the center, and five other locations were arbitrarily chosen. The targets remained in the same locations throughout each experiment. Due to the distance between the sensor and the pivot point of the tripod head, for a given tripod location the actual magnetic sensor locations varied by up to 8 in. as the tripod head was tilted at different angles to align the laser.

In a second experiment, a similar set of 100 directed gazes was performed by a human participant. The participant used the head-mounted sight (Figure 2) to align his LoPG with each of the 10 targets from each of 10 standing locations. The 5 calibration targets that were arbitrarily placed in the first experiment were in different locations for calibration of the human participant.

Data from both experiments were independently used to calibrate the LoPG-tracking system, and the benefits of the distortion compensation and geometric calibration procedures were examined. For the distortion compensation step, we used polynomial coefficients computed from the 192 pegboard locations, as described above. In all cases, we used as an initial guess for the geometric calibration procedure a set of hand-measured parameters that approximated the actual geometry of the system.

RESULTS

For each of the two experiments, collected data consisted of a set of 100 six-variable tracker readings (each reading taken when the laser/LoPG was aligned to fall at the center of an on-screen calibration target) as well as the 100 two-dimensional screen locations of the corresponding calibration targets (10 repetitions of 10 targets). Location and orientation data from the magnetic tracker were passed through the distortion compensation functions (Equations 1 and 2), yielding a second set of 100 corrected tracker readings for each experiment. We used subsets of this body of distortion-corrected data to derive the transformation matrices t_O←B and t_S←E using the geometric calibration procedure described above, and we tested these fits against the remaining data. We compared known screen-target locations to point-of-gaze computations made with and without distortion compensation or geometric calibration. We also investigated the behavior of the error as a function of the number of included calibration points by varying the size of the calibration data sets used to compute t_O←B and t_S←E.

Accuracy of Predicted Point of Gaze

Using the distortion-corrected location and orientation measurements from the entire set of 100 laser-target alignments from the first experiment and the geometric calibration procedure, the t_O←B and t_S←E matrices were computed. Equations 5 and 6 were then used to predict point of gaze for each of the 100 laser-target alignments, and mean gaze point error (mean distance along the screen between calibration target and the point of gaze found with the LoPG-tracking algorithm) was computed. For the data set from the first experiment, mean error was 1.50 in. (Figure 6D). Error can also be described as the angle between the screen target and the computed point of gaze when viewed from the location of the magnetic sensor. Measured in this way, mean angular error was 1.55º. For reasons described in the Discussion section, we will report the remainder of our mean error results in inches. At our screen-eye distances, 1º corresponds approximately to 1 in. of error.
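For reference, a sketch of how these two error measures can be computed from the predicted and target screen locations is shown below; the variable names are illustrative, and the eye-screen distance is approximated by the sensor-screen distance, as in note 5.

    % gaze_pred and targets are n-by-2 on-screen locations (inches);
    % sensor_screen_dist holds the sensor-to-screen distance for each gaze (inches).
    err_in   = sqrt(sum((gaze_pred - targets).^2, 2));    % per-gaze error along the screen
    mean_in  = mean(err_in);                              % mean gaze point error (inches)
    mean_deg = mean(atand(err_in ./ sensor_screen_dist)); % approximate angular error (degrees)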

To examine the effect of distortion compensation on mean gaze point error, we also performed the geometric calibration procedure using raw magnetic tracker measurements not processed by the distortion correction polynomials. As seen in Figure 6B, without distortion compensation mean gaze point error was significantly larger than with compensation (Wilcoxon signed ranks test, Z = 3.66, p < .001), at 1.93 in. when calibrated using data from the same experiment (an increase of 29%). Without distortion compensation, clear outliers from the clusters of points of gaze can be seen for most of the screen targets in Figure 6B. These outlying points of gaze were all from the tripod location most distant from the magnetic transmitter, where distortion was largest. Distortion compensation provided most benefit for this tripod location.
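Such paired comparisons of per-gaze errors can be made, for example, with the signrank function from Matlab's Statistics Toolbox (a sketch with illustrative variable names; for samples of this size the function reports the normal-approximation Z statistic):

    % err_without and err_with are the 100 per-gaze errors (inches) computed
    % without and with distortion compensation, respectively.
    [p, h, stats] = signrank(err_without, err_with);
    z = stats.zval;   % Z statistic of the paired Wilcoxon signed ranks test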

To examine the effect of the geometric calibration procedure on the accuracy of predicted LoPG, we also computed mean gaze point error across the collected data set using only a hand-measured estimate of t_O←B and t_S←E. Distortion correction was included, but no geometric calibration was performed. As one might expect, without calibration, errors were larger, and this difference was significant (Wilcoxon signed ranks test, Z = 8.09, p < .001). Mean gaze point error using an uncalibrated model was 3.96 in., as can be seen in Figure 6C.

We also examined the ability of the geometric model to compute points of gaze without both distortion compensation and geometric calibration. The hand-measured t_O←B and t_S←E and raw magnetic tracker readings were used in Equations 5 and 6. Mean gaze point error was 5.66 in. (Figure 6A). The addition of either geometric calibration or distortion compensation yielded a significant reduction in error (Wilcoxon signed ranks test, Z = 6.1, p < .001).

Conducting the same analysis for the data from the second experiment, we found mean gaze point error to be lower than that in the first experiment when geometric calibration was performed. When both distortion compensation and geometric calibration were used, mean error was 0.72 in. When geometric calibration was performed on data not corrected for distortion, mean error was 1.06 in. (Wilcoxon signed rank test, Z = 5.20, p < .001). Reasons for the reduced errors in the second experiment are proposed in the Discussion section. Without geometric calibration, error was much larger (Wilcoxon signed rank test, Z = 8.1, p < .001), since we were unable to measure t_S←E with the same accuracy for the second experiment. With and without distortion correction, mean errors were 3.98 and 4.26 in., respectively.

Effect of sensor location on error. Using the raw data (not compensated for distortion) from the first experiment and computing point of gaze without performing geometric calibration, the farther the sensor was from the transmitter, the worse the error (Spearman correlation, r = .46, p < .001), and that error was manifest as an increasing error in the y-dimension (Spearman correlation, r = .50, p < .001; Figure 7A). When both the distortion correction and the geometric calibration were implemented, the errors were more evenly distributed (Figure 7B), resulting in no significant correlation with distance from the transmitter (Spearman correlation, r = .04, p = .71). There was, however, a significant tendency for the errors to increase toward the screen (i.e., in the x-dimension; Spearman correlation, r = .22, p = .003).
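These correlations can be computed with the corr function from the Statistics Toolbox; a sketch with illustrative variable names:

    % dist_from_transmitter and err_in are per-gaze sensor-transmitter
    % distances and gaze point errors (inches), as column vectors.
    [rho, p] = corr(dist_from_transmitter, err_in, 'type', 'Spearman');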

Figure 6. Effects of distortion compensation (panels C and D) and geometric calibration (panels B and D) on computation of points of gaze. Data were collected with a tripod-mounted laser pointer used to simulate the line of primary gaze (LoPG). Circular marks indicate screen locations of calibration targets. Crosses are laser-screen intersections (points of gaze) computed by the LoPG-tracking algorithm. Within each cluster, each cross represents a measurement from a different tripod location. The rectangular outline shows the location of the projection screen. Screen outline and targets appear at the hand-measured location (panels A and C) and the location found by geometric calibration (panels B and D).


This may be because, for tripod locations closer to the screen, an angular error (due to distortion, for example) of the same size results in a larger error distance along the screen.

Short calibration. In order to use our LoPG-tracking system as a tool for gathering data from untrained study participants, it is advantageous to calibrate the system using as few data points as necessary (e.g., to reduce fatigue). To find the number of calibration data points needed to accurately predict point of gaze, we created calibration sets from subsets of the 100 measurements collected in each experiment. We used these small calibration sets to compute t_O←B and t_S←E, and then used those computed parameters to calculate screen intersections across the full set of 100 measurements. We then compared these computed intersections with the actual locations of the on-screen targets. We performed this technique for calibration sets of varying sizes across 100 random orderings of the 100 data points collected in each experiment. For the first experiment, we found that after about 20 points the geometric calibration system converges to a point of gaze with a mean gaze point error of less than 2 in., with only small improvements attained by including additional points in the calibration set (Figure 8).

For the second experiment, overall gaze point error was lower when calibration was performed with the entire data set. For this data set, 2 in. of error could be reached with about 12 data points (Figure 9). Calibrations performed with more than 15 data points provided only a small additional benefit.
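The evaluation just described can be scripted along the following lines (a sketch; it assumes the illustrative fit_geometric_model and predict_gaze functions from the earlier sketch are available as separate function files, and that M, Int, and the initial guess P0 are in the workspace):

    % For each random ordering, calibrate on the first k gazes and score the
    % resulting parameters against all 100 gazes.
    nOrders  = 100;
    setSizes = 2:50;
    mean_err = zeros(numel(setSizes), nOrders);
    for r = 1:nOrders
        idx = randperm(100);
        for s = 1:numel(setSizes)
            k = setSizes(s);
            P = fit_geometric_model(M(idx(1:k), :), Int(idx(1:k), :), P0);
            g = reshape(predict_gaze(P, M), [], 2);          % predictions for all 100 gazes
            mean_err(s, r) = mean(sqrt(sum((g - Int).^2, 2)));
        end
    end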

In the processing of calibration data from both experiments, geometric calibration occasionally failed completely, producing larger mean gaze point error than if geometric calibration had not been performed at all. These failures occurred with very small calibration sets, when distortion-related errors provided inconsistent constraints for the model. Failures also occurred when sets included targets from only one part of the screen, insufficiently constraining the model. Fitting failures never occurred with sets of 10 points or more, and never occurred when distortion compensation was performed.

Figure 7. The mean gaze point error at each of the 10 tripod locations: on the left, using the raw data from the magnetic tracker collected in the first experiment (see Figure 6A), and on the right, using the same data corrected with both the distortion compensation and the geometric calibration (Figure 6D). The tripod and laser pointer were directed toward the projection screen, which is shown as the tall white rectangle on the right of each panel. The tracker transmitter is shown as the black square. Each circle represents one of the tripod locations, with radius equal to the mean gaze point error from that location. Mean gaze point error for each location is also shown with the corresponding circle.


In all the cases in which calibration failed, the addition of one or two calibration points to the set was enough to reverse the failure. In general, better fits were obtained from calibration sets with greater variety in sampled tracker orientations and screen locations.

DISCUSSION

The LoPG-tracking model and calibration techniques presented here succeed at accurately predicting the on-screen point of gaze while participants look through a head-mounted sight. The validation experiments show that, in agreement with Caldwell et al. (2000), the use of polynomial distortion compensation can improve the accuracy of gaze point calculations (Figures 6C and 6D). In addition, benefit was also seen from geometric calibration. The techniques presented for tracking LoPG provide a foundation for future gaze-tracking systems.

An interesting result of performing calibration both with and without distortion compensation was that, although point of gaze was predicted more accurately with distortion compensation, the tracking system still performed relatively well in the presence of uncompensated distortion, provided that geometric calibration was performed. In the presence of measurement errors caused by distortion, the minimization procedure gives model parameters that may not correspond well to the actual spatial relationships being modeled, but that interact with the measurement distortion to produce reasonably accurate screen intersection predictions. If somewhat larger LoPG-tracking errors can be tolerated, one might be able to achieve reasonable results by omitting distortion compensation altogether. Omission of distortion compensation would likely result in much greater error, however, for head positions or orientations outside the range of calibration.

Possible sources of the gaze point errors remaining when both geometric calibration and distortion compensation were performed include measurement noise, inaccuracies in magnetic tracker measurements not corrected by the distortion compensation system, inaccuracy of the model parameters found by the geometric calibration system, or some combination of these. Tests of stability in the tracker measurements showed that readings from the tracker taken milliseconds or hours apart agreed; both differ from reference measurements by only a few hundredths of an inch and by a few tenths of a degree. This small measurement noise is clearly not enough to explain the gaze point errors, which are orders of magnitude greater.

Our method makes the simplifying assumption that the distortions in magnetic tracker orientation readings depend on the location of the magnetic sensor only, and not on the interaction between sensor location and orientation. Therefore, it is likely that our method did not completely correct for the magnetic distortion.

[Figure 8 near here. Plot of mean gaze point error across all 100 data points (inches) versus the number of data points in the calibration set; the line shows the mean across all orderings, with a 95% confidence interval.]

Figure 8. Error in computed point of gaze derived when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the first experiment (using a laser pointer). Mean gaze point error was reduced to within 2 in. (about 2º) after about 23 directed gazes.


When the magnetic sensor is kept in a similar orientation in both distortion compensation calibration and actual experimental use, this approach is more likely to produce good results. In the first experiment, our method provided an improvement over omitting orientation distortion compensation entirely, since sensor alignment was similar in calibration and experiment. It is likely, however, that distortion still plays a role in gaze point errors. In the second experiment, we found even lower mean gaze point error (than that reported in the Results section) when magnetic tracker measurements were corrected for location distortion but not for orientation distortion. As the magnetic tracker sensor was rotated on its side when attached to the headgear of the head-mounted sight (Figure 2), it was no longer in the relative orientation needed to see a benefit from our compensation for orientation distortion. Expanding the distortion compensation calibration procedure to measure the sensor at multiple orientations for each grid location sampled could further reduce error due to distortion. This change would increase both the number of data points required and the time needed to perform the initial data collection procedure, but it would not impact per-participant calibration demands.

The geometric calibration algorithm seeks a t_O←B and t_S←E yielding a local minimum in gaze point error, starting from the initial guess derived from hand measurements of these parameters. Control experiments were performed to verify that no other, potentially lower, minima could be found near the minima arrived at by calibration. Although the minimization procedure failed to converge for intentionally inconsistent initial guesses, tests conducted using parameters displaced from hand-measured values by several inches in all directions still converged on the same LoPG. Thus, uncorrected distortion in magnetic tracker measurements was most likely the leading cause of gaze point error.

Comparison between points of gaze computed with (Figure 6D) and without (Figure 6C) the geometric calibration step shows that attempts to hand measure the spatial relationships t_O←B and t_S←E (i.e., without geometric calibration) lead to relatively large gaze point errors. Application of the geometric fitting procedure substantially improves the ability of the model to compute LoPG. Although it is not surprising that better results can be obtained when calibration is performed, it is interesting to note that the parameters resulting from calibration deviated considerably from hand-measured parameters. In fact, the optimized parameters comprising t_S←E indicated spatial relationships that were clearly inconsistent with the actual configuration of the laser and magnetic sensor. For example, parameters producing the least mean gaze point error contained values for the distance between the magnetic sensor and the LoPG that were several inches greater than physical measurements indicated.

[Figure 9 near here. Plot of mean gaze point error across all 100 data points (inches) versus the number of data points in the calibration set; the line shows the mean across all orderings, with a 95% confidence interval.]

Figure 9. Error in computed point of gaze derived when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the second experiment (with a human participant). Mean gaze point error was reduced to less than 2 in. (about 2º) in about 12 directed gazes.


Although there is some imprecision (we estimate about a quarter of an inch) in measuring the straight-line distance between eye and magnetic sensor, this imprecision alone cannot explain the difference. Since calibration performed with data from the human participant seemed less prone to this inconsistency, we suspect that the constraints imposed on the movement of the laser and sensor by the tripod head may be responsible. In positioning the laser to align it with calibration targets, the sensor remained within 47º of level, and "roll" of the sensor could not be performed. These constraints lead to less variation in the calibration data and, in turn, may have underconstrained the fitting procedure used in geometric calibration.

A limitation of the geometric calibration procedure described here is its inability to locate the eye along the LoPG. Although this limitation does not impact the ability of the system to predict points of gaze on the screen, it does impact the veracity of the model parameters. Although the geometric model used specifies the eye as a coordinate frame with an origin located some distance from the magnetic tracker sensor, the geometric calibration procedure constrains only the x-axis of this coordinate frame. This results in an infinite number of possible sets of model parameters that still predict the same point of gaze on the screen (i.e., the screen intersection remains the same even if the eye were to slide forward or back along the LoPG or to rotate about its x-axis). This means that the parameters making up t_S←E found by the fitting procedure correspond to one possible location of the eye along the LoPG. Control experiments showed that the fitted location of the eye along the LoPG depended on the initial guess used for the fitting procedure, even though these alternate initial guesses still yielded the same points of gaze (note 5). If the intermediate results of the model are to be used, for example, for computing the location of the eye relative to the screen for placement of a camera in a virtual environment, the inability of this system to precisely locate the eye might be overcome by performing an additional calibration. For example, a calibration similar to the one used in this article, but requiring the participant to place the eye in other, nonprimary positions of gaze, could be used to find the actual eye location. This technique is currently being investigated.

REFERENCES

Allison, R. S., Eizenman, M., & Cheung, B. S. K. (1996). Combined head and eye tracking system for dynamic testing of the vestibular system. IEEE Transactions on Biomedical Engineering, 43, 1073-1082.

Asai, K., Osawa, N., Takahashi, H., & Sugimoto, Y. Y. (2000, March). Eye mark pointer in immersive projection display. Paper presented at the IEEE Virtual Reality 2000 Conference, New Brunswick, NJ.

Atchison, D. A., & Smith, G. (2000). Optics of the human eye. Oxford: Butterworth-Heinemann.

Barabas, J., Giorgi, R. G., Goldstein, R. B., Apfelbaum, H., Woods, R. L., & Peli, E. (2003, May). Wide field 3D gaze tracking system. Paper presented at the Association for Research in Vision and Ophthalmology Meeting, Fort Lauderdale, FL.

Blood, E. (1990). U.S. Patent No. 4,945,305. Washington, DC: U.S. Patent & Trademark Office.

Bryson, S. (1992, February). Measurement and calibration of static distortion of position data from 3D trackers. Paper presented at SPIE Stereoscopic Displays & Applications III, San Jose.

Caldwell, R. T., Berbaum, K. S., & Borah, J. (2000). Correcting errors in eye-position data arising from the distortion of magnetic fields by display devices. Behavior Research Methods, Instruments, & Computers, 32, 572-578.

Day, J. S., Dumas, G. A., & Murdoch, D. J. (1998). Evaluation of a long-range transmitter for use with a magnetic tracking device in motion analysis. Journal of Biomechanics, 31, 957-961.

Duchowski, A., Medlin, E., Cournia, N., Murphy, H., Gramopadhye, A., Nair, S., Vorah, J., & Melloy, B. (2002). 3-D eye movement analysis. Behavior Research Methods, Instruments, & Computers, 34, 573-591.

Ikits, M., Brederson, J. D., Hansen, C. D., & Hollerbach, J. M. (2001, March). An improved calibration framework for electromagnetic tracking devices. Paper presented at IEEE Virtual Reality 2001, Yokohama.

Janin, A., Mizell, D., & Caudell, T. (1993, September). Calibration of head-mounted displays for augmented reality applications. Paper presented at the Virtual Reality Annual International Symposium, Seattle.

Kindratenko, V. (1999). Calibration of electromagnetic tracking devices. Virtual Reality: Research, Development, & Applications, 4, 139-150.

Kindratenko, V. (2000). A survey of electromagnetic position tracker calibration techniques. Virtual Reality: Research, Development, & Applications, 5, 169-182.

King, P. (1995). Integration of helmet-mounted displays into tactical aircraft. Proceedings of the Society for Information Display, 26, 663-668.

Kuipers, J. B. (1999). Quaternions and rotation sequences: A primer with applications to orbits, aerospace, and virtual reality. Princeton, NJ: Princeton University Press.

Leigh, R. J., & Zee, D. S. (1983). The neurology of eye movements. Philadelphia: Davis.

Livingston, M. A., & State, A. (1997). Magnetic tracker calibration for improved augmented reality registration. Presence: Teleoperators & Virtual Environments, 6, 532-546.

Nixon, M. A., McCallum, B. C., Fright, W. R., & Price, N. B. (1998). The effects of metals and interfering fields on electromagnetic trackers. Presence, 7, 204-218.

Southard, D. A. (1995). Viewing model for virtual environment displays. Journal of Electronic Imaging, 4, 413-420.

Young, L. R., & Sheena, D. (1975). Survey of eye movement recording methods. Behavior Research Methods & Instrumentation, 7, 397-429.

NOTES

1. This position is used in projected virtual reality displays to determine how to position the virtual camera when generating an image of a scene.

2. Even if the tracker is not aligned exactly with the case of the transmitter, the correction polynomials (Equations 1 and 2) we used include terms to account for such an offset. These offsets are contained in the constant (i = 0) and linear (i = 1) terms of the polynomials.

3. Two linear systems were constructed from 192 six-variable measurements. Each of the two systems contained 192 × 3 = 576 knowns and 105 unknowns.

4. It should be noted that the calibration procedure described here does not completely constrain the model parameters used. Since no measurement of the orientation on the screen of the point of gaze, or of the distance from eye to screen during a sighting, is made, neither of these can be predicted by the model. As a result, the best-fitting model parameters describing t_S←E may converge on any one of a line of solutions corresponding to possible locations of the eye along the LoPG. Each of these eye locations predicts the same LoPG location on the screen. It should be possible to compute correct eye position with a calibration procedure that includes eye-in-head rotation.

5. For this reason, we have chosen to report errors in inches along the screen surface, since angular measures of error are more sensitive to the location of the eye along the LoPG. Where angular error is reported, we estimated the eye-screen distance to be the distance between the magnetic sensor and the screen.


ARCHIVED MATERIALS

The following materials associated with this article are retrievable from the Psychonomic Society's Norms, Stimuli, and Data archive, http://www.psychonomic.org/archive.

To access the above files or links, search the archive for this article using the journal (Behavior Research Methods, Instruments, & Computers), the first author's name (Barabas), and the publication year (2004).

FILE: Barabas-BRMIC-2004.zip
DESCRIPTION: The compressed archive file contains seven files. The files are:

README.txt, containing a description of the scripts and data included in the archive (a 3K text file). This file can be renamed to readme.m and executed as a Matlab script to reproduce some of the results reported in this manuscript.

barabas2004LoPG.mat, a Matlab workspace file containing magnetic tracker data collected in the verification experiments described in this manuscript (a 32K binary file).

findcorrectioncoeffs.m, a Matlab function for finding the distortion compensation coefficients for Equations 1 and 2 (a 5K Matlab M-file).

correctdistortion.m, a Matlab function for performing distortion compensation on magnetic tracker data using Equations 1 and 2 (a 4K Matlab M-file).

findmodelparams.m, a Matlab function for finding the parameters of the LoPG model from data collected during a series of directed gazes (a 3K Matlab M-file).

intersecterror.m, a Matlab function for finding gaze point error for a set of directed gazes (a 5K Matlab M-file).

trackereulerstomtx4.m, a Matlab function that computes a transformation matrix representation of a relationship between coordinate frames via Equation 4 (a 3K Matlab M-file).

AUTHOR'S WEB SITE: http://www.eri.harvard.edu/faculty/peli/index.html

AUTHOR'S E-MAIL ADDRESS: barabas@mit.edu

(Manuscript received July 18, 2003; revision accepted for publication July 23, 2004.)


Since the LoPG lies along the x-axis of the eye frameE to find the LoPG relative to the screen we find the lo-cation and orientation of this frame with respect to thescreen frame O To find this relationship we composethe matrix representations of the spatial relationshipsmeasured by the magnetic tracker with matrices com-puted through calibration The position of the magnetictransmitter with respect to the screen tOumlB (transforma-tion from transmitter frame to screen frame) and that ofthe eye with respect to the sensor tSumlE remain fixed forthe duration of an experiment The transformation tBumlSis constructed using Equation 4 from the six variablesMBumlS (xc yc zc qc fc yc) resulting from the distortioncompensation step (Equations 1 and 2) The coordinatesxc yc and zc represent the corrected three-dimensionaldisplacement between the origins of B and S while qcfc and yc are the corrected Euler angles describing theorientation of S in the coordinate system of B

Since composite transformations can be representedas the product of transformation matrices we can findthe location and orientation of the eye relative to thescreen tOumlE as follows

(5)

Once tOumlE is computed we can find both the eye loca-tion with respect to the screen1 (simply the last columnof tOumlE) and the point of intersection between the LoPGand the screen

Point of gaze computation The point on the screenthat intersects the participantrsquos LoPG can be found fromtOumlE the result of the geometric transformation portionof the gaze-tracking algorithm In our model the LoPGfalls along the x-axis of the eye coordinate frame E Wesolve for a point that is on the x-axis of the eye frame and

also falls in the xndashy plane of the screen coordinate frameOmdashthe surface of the screen This intersection can beformulated as a linear system consisting of a point on thex-axis of frame E (xo 0 0) transformed by tOumlE and setequal to a point in the xndashy plane of frame O (Intx Inty 0)

(6)

Solving this system for Intx and Inty gives us the inter-section between the LoPG and the plane of the screenexpressed in the screen coordinate frame

CalibrationParameters used in the LoPG tracking algorithm are

computed via two calibration steps First the distortionin magnetic tracker readings is measured and coeffi-cients for the compensation polynomials (Equations 1and 2) are computed This distortion compensation cal-ibration process was carried out when our tracking sys-tem was installed Computed coefficients can be reusedassuming that there have been no changes to the relativeplacement of the tracker transmitter and the distortion-causing (large metal) objects The second geometric cal-ibration step computes the spatial relationship betweenthe magnetic tracker sensor and the eye along with therelationship between the tracker transmitter and the screenimage This second step is performed once for each par-ticipant before an experimental run since an adjustment ofthe headgear or of the image projected on the screen al-ters the spatial relationships measured in this calibration

Distortion compensation calibration To computecoefficients aijk bijk cijk dijk eijk and fijk of the distortioncompensation polynomials (Equations 1 and 2) we com-pare a set of tracker readings taken over a grid of knownpoints within the tracking volume to the hand-measuredlocations of those points The differences between themeasured and the reported locations form a set of vectorsmapping points in reported space to actual physical lo-cations (these difference vectors were small near thetransmitter and became largemdashsee Figure 5mdashnear theouter third of the nominal volume of our trackerrsquos range)We then solve for the set of polynomial coefficients thatwhen used in Equation 1 minimizes the sum of the squaredlengths of these error vectors An identical procedure isperformed on orientation data finding coefficients thatminimize the sum of the squared errors in Euler anglesreported by the tracker

tO E

o

uml

Egrave

Icirc

IacuteIacuteIacuteIacute

˘

˚

˙˙˙˙

=

Egrave

Icirc

IacuteIacuteIacuteIacute

˘

˚

˙˙˙˙

x Int

Intx

y0

0

1

0

1

O E O B B S S Euml uml uml uml=t t t t

(4)b auml

=

Egrave

Icirc

IacuteIacuteIacuteIacute

˘

˚

˙˙˙˙

-Egrave

Icirc

IacuteIacuteIacuteIacute

˘

˚

˙˙˙˙

-

Egrave

Icirc

IacuteIacuteIacuteIacute

˘

˚

˙t

q qq q

f f

f f

1 0 0

0 1 0

0 0 1

0 0 0 1

0 0

0 0

0 0 1 0

0 0 0 1

0 0

0 1 0 0

0 0

0 0 0 1

x

y

z

ba

ba

ba

ba ba

ba ba

ba ba

ba ba

cos( ) sin( )

sin( ) cos( )

cos( ) sin( )

sin( ) cos( )˙˙˙

-Egrave

Icirc

IacuteIacuteIacuteIacute

˘

˚

˙˙˙˙

1 0 0 0

0 0

0 0

0 0 0 1

cos( ) sin( )

sin( ) cos( )

y yy y

ba ba

ba ba

762 BARABAS ET AL

In our walking simulator environment we constructeda system of pegboards and stands to provide a set ofknown locations for placement of the magnetic sensorduring this procedure The pegboard system illustratedin Figure 4 was constructed from 24 48 in sheets ofpredrilled 025-in-thick pegboard mounted on a woodenframe Screws used in the construction of the frame werereplaced with glue since we found them to be causingsmall local distortions in magnetic tracker readings Thesensor stands were constructed from sections of 4-in-diameter PVC pipe with flat pegboard caps on each endEach stand held the magnetic tracker sensor rigidly onone end cap and had pegs for attachment to the pegboardon the other The sensor was positioned so that when thestands were engaged with the pegboard the sensorrsquosx-axis (see Figure 4) pointed toward the screen and itsz-axis pointed toward the floor In this orientation themagnetic tracker reported azimuth elevation and roll ofthe sensor (qo fo and yo) to all be near zero (these an-gles deviated from zero by a few degrees due to distor-tion) Several pipe lengths were used for measurementsat varying heights (0 6 12 and 18 in) above the peg-board plane Other devices for placing a sensor at knownlocations have also been developed including a fixturemade entirely of Plexiglas (Caldwell et al 2000) and anoptical tracking system (Ikits Brederson Hansen ampHollerbach 2001)

The pegboard frame was placed in our tracking envi-ronment and was mechanically squared with the case ofthe magnetic transmitter The location of the pegboard

grid with respect to the magnetic transmitter was thenmeasured2 Readings from the magnetic tracker weretaken over both a sparse three-dimensional grid of 64locations spanning 45 52 18 in (see Figure 5) anda denser smaller grid of 128 locations covering the cen-tral 32 40 18 in of the same space The magnetictracker readings from both grids were combined to formthe data set of reported locations and orientations UsingMatlab we performed linear least-squares fits on the dif-ferences between reported and pegboard-measured datasets to find the 210 coefficients for the distortion cor-rection polynomials3 An illustration of the magnitude oflocation distortion for our tracker is provided in Figure 5

Geometric calibration The second calibration stepcomputes the geometric model parameters describingspatial relationships tOumlB and tSumlE These parameterscorresponding to the alignment of the screen with thetracker transmitter (tOumlB) and position of the magnetictracker sensor relative to the eye (tSumlE) are difficult tomeasure directly and tSumlE changes from participant toparticipant or whenever the headgear is adjusted Forthese reasons we developed a calibration procedure toquickly ascertain these model parameters

Geometric calibration of the LoPG-tracking system isperformed by taking magnetic tracker recordings whileparticipants perform a series of constrained gazes Foreach gaze the participant using his or her left eye alignsthe head-mounted sight with a target dot projected on thescreen The participant repeats this procedure from a se-ries of standing locations within the desired usage space

Figure 4 Pegboard apparatus for measuring distortion in magnetic tracker measure-ments Two such pegboard panels were mounted on a wooden frame to calibrate a volumemeasuring 45 52 18 in Stands of various heights were placed at different locations acrossthe base pegboard to provide wide coverage of the volume

TRACKING THE LINE OF PRIMARY GAZE 763

After a number of such gazes a pair of tOumlB and tSumlEparameters is computed by the GaussndashNewton nonlinearleast-squares fitting method as will be described in thenext section

Geometric calibration algorithm The fitting proce-dure seeks the set of model parameters that when com-bined with the geometric model in our LoPG-trackingalgorithm best predict on-screen points of gaze fromdistortion-compensated magnetic tracker readings takenover a set of directed gazes Our model computes Int thetwo-dimensional on-screen point of gaze expressed as

Int F(P MBumlS) (7)

where F is the function that computes point of gaze froma given (corrected) magnetic tracker reading and a set ofmodel parameters (Equations 5 and 6) P is the set of 12model parameters characterizing via Equation 4 thespatial relationships tOumlB and tSumlE (the 12 parametersdescribe for each of the two transformations a three-dimensional displacement between coordinate framesand three Euler angles describing relative orientations ofthe coordinate frames) MBumlS is the magnetic trackermeasurement (three distances and three angles) cor-

rected for distortion by Equations 1 and 2 We assessquality of the fit found by the geometric calibration pro-cedure by the size of the average error across all gazes ina calibration This mean gaze point error is expressed asthe sum across all calibration gazes of the distancesalong the screen between the actual calibration targetsand the model-predicted points of gaze The fitting pro-cedure seeks model parameters P that minimize thissum given a set of magnetic tracker measurements MBumlSa set of corresponding screen intersections Int and aninitial ldquoguessrdquo for P Starting with the given guess for Pthe fitting procedure iteratively adjusts these parametersuntil a local minimum in mean gaze point error is found4Minimization is performed in Matlab using the nlinfitfunction (from the statistics toolbox an implementationof the GaussndashNewton fitting method) In our implemen-tation this technique finds a P corresponding to a givenset of 20 measurements of Int and MBumlS in a few sec-onds on a 700-MHz PC

Experimental ValidationTo test the calibration and LoPG-tracking techniques

presented in this article we first calibrated the distortion

y coordinate of magnetic transmitter frame (inches)x coordinate of magnetic transmitter frame (inches)

z coordinate of magnetic transm

itter frame

Actual shape ofmeasured space

Reported shape of measuredspace due to distortion

Projection

Screen

ndash90ndash80

ndash70ndash60

ndash50ndash40

ndash30ndash20

ndash100

1040

30

20

10

0

ndash10

ndash20

ndash30ndash40

ndash30

ndash20

ndash10

0

10

20

Magnetic TrackerTransmitter

Figure 5 Distortion measured in performing distortion compensation calibration of magnetic trackerAxis units are inches in the coordinate system of the magnetic tracker transmitter Dark bars on axes in-dicate the ranges of physical locations sampled in the calibration procedure Shade of surface indicatesmagnitude of distortion in location measurements from the magnetic tracker The darkest areas indicatedistortion in location measurement of over 13 in

764 BARABAS ET AL

compensation system as described above and then per-formed two target-sighting experiments In the first ex-periment we performed the geometric calibration pro-cedure using the beam of a laser pointer to represent theline of gaze of the human participant Doing so allowedthe simulated LoPG to be precisely aligned with calibra-tion targets projected on the screen and allowed testingwithout the variability of human participantsrsquo ability tomaintain head position To fix the relative placement ofthe laser and sensor both were attached to a small woodenblock The sensor was placed on the block with its x-axisapproximately parallel to and 2 in above the laser beamThis alignment enabled us to easily measure the actualspatial relationship between the sensor and the laser forvalidation of the parameters found by the geometric cali-bration procedure (We found it much more difficult tomeasure the equivalent relationship tSumlE on a humanparticipant) The wood block holding the sensor and laserwas fastened to an aluminum and plastic tripod raised tothe approximate height of a human participantrsquos head Togather data for testing the geometric calibration proce-dure the tripod was placed at 10 different locations Theselocations were chosen within the range of possible stand-ing positions of our study participants (an area about 36 36 in) For each of the 10 tripod locations readings fromthe magnetic tracker were taken panning and tilting thetripod head so that the laser beam fell on each calibrationtarget Four of the targets were just inside the corners ofthe screen one was placed in the center and five other lo-cations were arbitrarily chosen The targets remained inthe same locations throughout each experiment Due tothe distance between the sensor and the pivot point of thetripod head for a given tripod location the actual magneticsensor locations varied by up to 8 in as the tripod headwas tilted at different angles to align the laser

In a second experiment a similar set of 100 directedgazes was performed by a human participant The par-ticipant used the head-mounted sight (Figure 2) to alignhis LoPG with each of the 10 targets from each of 10standing locations The 5 calibration targets that were ar-bitrarily placed in the first experiment were in differentlocations for calibration of the human participant

Data from both experiments were independently usedto calibrate the LoPG-tracking system and the benefitof distortion compensation and geometric calibrationprocedures were examined For the distortion compen-sation step we used polynomial coefficients computedfrom the 192 pegboard locations as described above Inall cases we used as an initial guess for the geometriccalibration procedure a set of hand-measured parametersthat approximated the actual geometry of the system

RESULTS

For each of the two experiments collected data con-sisted of a set of 100 six-variable tracker readings (eachreading taken when the laserLoPG was aligned to fall atthe center of an on-screen calibration target) as well as

the 100 two-dimensional screen locations of the corre-sponding calibration targets (10 repetitions of 10 tar-gets) Location and orientation data from the magnetictracker were passed through the distortion compensationfunctions (Equations 1 and 2) yielding a second set of100 corrected tracker readings for each experiment Weused subsets of this body of distortion-corrected data toderive the transformation matrices tOumlB and tSumlE usingthe geometric calibration procedure described aboveand tested these fits against the remaining data We com-pared known screenndashtarget locations to point-of-gazecomputations made with and without distortion com-pensation or geometric calibration We also investigatedthe behavior of the error as a function of the number ofincluded calibration points by varying the size of the cal-ibration data sets used to compute tOumlB and tSumlE

Accuracy of Predicted Point of GazeUsing the distortion-corrected location and orientation

measurements from the entire set of 100 laserndashtargetalignments from the first experiment and the geometriccalibration procedure the tOumlB and tSumlE matrices werecomputed Equations 5 and 6 were then used to predictpoint of gaze for each of the 100 laserndashtarget alignmentsand mean gaze point error (mean distance along the screenbetween calibration target and point of gaze found withthe LoPG-tracking algorithm) was computed For thedata set from the first experiment mean error was 150 in(Figure 6D) Error can also be described as the angle be-tween the screen target and the computed point of gazewhen viewed from the location of the magnetic sensorMeasured in this way mean angular error was 155ordm Forreasons described in the Discussion section we will re-port the remainder of our mean error results in inches Atour screenndasheye distances 1ordm corresponds approximatelyto 1 in of error

To examine the effect of distortion compensation onmean gaze point error we also performed the geometriccalibration procedure using raw magnetic tracker mea-surements not processed by the distortion correctionpolynomials As seen in Figure 6B without distortioncompensation mean gaze point error was significantlylarger than with compensation (Wilcoxon signed rankstest Z 366 p 001) at 193 in when calibratedusing data from the same experiment (an increase of 29)Without distortion compensation clear outliers from theclusters of points of gaze can be seen for most of thescreen targets in Figure 6B These outlying points ofgaze were all from the tripod location most distant fromthe magnetic transmitter where distortion was largestDistortion compensation provided most benefit for thistripod location

To examine the effect of the geometric calibration pro-cedure on accuracy of predicted LoPG we also com-puted mean gaze point error across the collected data setusing only a hand-measured estimate of tOumlB and tSumlEDistortion correction was included but no geometriccalibration was performed As one might expect without

TRACKING THE LINE OF PRIMARY GAZE 765

calibration errors were larger and this difference wassignificant (Wilcoxon signed ranks test Z 809 p 001) Mean gaze point error using an uncalibrated modelwas 396 in as can be seen in Figure 6C

We also examined the ability of the geometric model tocompute points of gaze without both distortion compensa-tion and geometric calibration The hand-measured tOumlBand tSumlE and raw magnetic tracker readings were used inEquations 5 and 6 Mean gaze point error was 566 in (Fig-ure 6A) The addition of either geometric calibration ordistortion compensation yielded a significant reduction inerror (Wilcoxon signed ranks test Z 61 p 001)

Conducting the same analysis for the data from thesecond experiment we found mean gaze point error tobe lower than that in the first experiment when geomet-ric calibration was performed When both distortioncompensation and geometric calibration were used meanerror was 072 in When geometric calibration was per-formed on data not corrected for distortion mean error was106 in (Wilcoxon signed rank test Z 520 p 001)Reasons for the reduced errors in the second experiment

are proposed in the Discussion section Without geometriccalibration error was much larger (Wilcoxon signed ranktest Z 81 p 001) since we were unable to measuretSumlE with the same accuracy for the second experimentWith and without distortion correction mean errors were398 and 426 in respectively

Effect of sensor location on error Using the rawdata (not compensated for distortion) from the first ex-periment and computing point of gaze without perform-ing geometric calibration the farther the sensor wasfrom the transmitter the worse the error (Spearman cor-relation r 46 p 001) and that error was manifestas an increasing error in the y-dimension (Spearman cor-relation r 50 p 001 Figure 7A) When both thedistortion correction and the geometric calibration wereimplemented the errors were more evenly distributed(Figure 7B) resulting in no significant correlation withdistance from the transmitter (Spearman correlation r 04 p 71) There was however a significant tendencyfor the errors to increase toward the screen (ie in thex-dimension Spearman correlation r 22 p 003)

Figure 6 Effects of distortion compensation (panels C and D) and geometric calibration (panels B and D) oncomputation of points of gaze Data were collected with a tripod-mounted laser pointer used to simulate line ofprimary gaze (LoPG) Circular marks indicate screen location of calibration targets Crosses are laser screen in-tersections (points of gaze) computed by LoPG-tracking algorithm Within each cluster each cross represents ameasurement from a different tripod location Rectangular outline shows location of projection screen Screenoutline and targets appear at hand-measured location (panels A and C) and location found by geometric cali-bration (panels B and D)

766 BARABAS ET AL

This may be because for tripod locations closer to thescreen an angular error (due to distortion for example)of the same size results in a larger error distance alongthe screen

Short calibration. In order to use our LoPG-tracking system as a tool for gathering data from untrained study participants, it is advantageous to calibrate the system using as few data points as necessary (e.g., to reduce fatigue). To find the number of calibration data points needed to accurately predict point of gaze, we created calibration sets from subsets of the 100 measurements collected in each experiment. We used these small calibration sets to compute T_O←B and T_S←E and then used those computed parameters to calculate screen intersections across the full set of 100 measurements. We then compared these computed intersections with the actual locations of the on-screen targets. We performed this technique for calibration sets of varying sizes, across 100 random orderings of the 100 data points collected in each experiment, as sketched below. For the first experiment, we found that after about 20 points the geometric calibration system converges to a point of gaze with a mean gaze point error of less than 2 in., with only small improvements attained by including additional points in the calibration set (Figure 8).
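The subset analysis can be outlined as follows. This sketch assumes hypothetical calling conventions for the archived functions findmodelparams.m and intersecterror.m (their actual signatures may differ), and hypothetical arrays M (100 corrected tracker readings), targets (100 on-screen target locations), and P0 (the hand-measured initial guess).

% Evaluate calibration sets of increasing size over random orderings of the gazes.
sizes = 2:2:50; nOrderings = 100; nGazes = 100;
meanErr = zeros(numel(sizes), nOrderings);
for k = 1:nOrderings
    order = randperm(nGazes);                        % one random ordering
    for s = 1:numel(sizes)
        calib = order(1:sizes(s));                   % first n gazes form the calibration set
        P = findmodelparams(M(calib,:), targets(calib,:), P0);  % fit T_O<-B and T_S<-E
        meanErr(s,k) = intersecterror(P, M, targets);           % error over all 100 gazes
    end
end
plot(sizes, mean(meanErr, 2));                       % mean across orderings (cf. Figures 8 and 9)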

For the second experiment, overall gaze point error was lower when calibration was performed with the entire data set. For this data set, 2 in. of error could be reached with about 12 data points (Figure 9). Calibrations performed with more than 15 data points provided only a small additional benefit.

In the processing of calibration data from both experiments, geometric calibration occasionally failed completely, producing larger mean gaze point error than if geometric calibration had not been performed at all. These failures occurred with very small calibration sets, when distortion-related errors provided inconsistent constraints for the model. Failures also occurred when sets included targets from only one part of the screen, insufficiently constraining the model. Fitting failures never occurred with sets of 10 points or more, and never occurred when distortion compensation was performed.

Figure 7. The mean gaze point error at each of the 10 tripod locations: on the left, using the raw data from the magnetic tracker collected in the first experiment (see Figure 6A), and on the right, using the same data corrected using both the distortion compensation and the geometric calibration (Figure 6D). The tripod and laser pointer were directed toward the projection screen, which is shown as the tall white rectangle on the right of each panel. The tracker transmitter is shown as the black square. Each circle represents one of the tripod locations, with radius equal to the mean gaze point error from that location. Mean gaze point error for each location is also shown with the corresponding circle.


In all the cases in which calibration failed, the addition of one or two calibration points to the set was enough to reverse the failure. In general, better fits were obtained from calibration sets with greater variety in sampled tracker orientations and screen locations.

DISCUSSION

The LoPG-tracking model and calibration techniques presented here succeed at accurately predicting the on-screen point of gaze while participants look through a head-mounted sight. The validation experiments show that, in agreement with Caldwell et al. (2000), the use of polynomial distortion compensation can improve the accuracy of gaze point calculations (Figures 6C and 6D). In addition, benefit was also seen from geometric calibration. The techniques presented for tracking LoPG provide a foundation for future gaze-tracking systems.

An interesting result of performing calibration both with and without distortion compensation was that, although point of gaze was predicted more accurately with distortion compensation, the tracking system still performed relatively well in the presence of uncompensated distortion, provided that geometric calibration was performed. In the presence of measurement errors caused by distortion, the minimization procedure gives model parameters that may not correspond well to the actual spatial relationships being modeled, but interact with the measurement distortion to produce reasonably accurate screen intersection predictions. If somewhat larger LoPG-tracking errors can be tolerated, one might be able to achieve reasonable results by omitting distortion compensation altogether. Omission of distortion compensation would likely result in much greater error, however, for head positions or orientations outside the range of calibration.

Possible sources of the gaze point errors remaining when both geometric calibration and distortion compensation were performed include measurement noise, inaccuracies in magnetic tracker measurements not corrected by the distortion compensation system, inaccuracy of the model parameters found by the geometric calibration system, or some combination of these. Tests of stability in the tracker measurements showed that readings from the tracker taken milliseconds or hours apart agreed; both differed from reference measurements by only a few hundredths of an inch and by a few tenths of a degree. This small measurement noise is clearly not enough to explain the gaze point errors, which are orders of magnitude greater.

Our method makes the simplifying assumption that the distortions in magnetic tracker orientation readings depend on the location of the magnetic sensor only, and not on the interaction between sensor location and orientation. Therefore, it is likely that our method did not completely correct for the magnetic distortion.

Figure 8. Error in computed point of gaze when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the first experiment (using a laser pointer). The plot shows mean gaze point error across all 100 data points (inches) against the number of data points in the calibration set, with the mean across all orderings and its 95% confidence interval. Mean gaze point error was reduced to within 2 in. (about 2°) after about 23 directed gazes.


When the magnetic sensor is kept in a similar orientation in both distortion compensation calibration and actual experimental use, this approach is more likely to produce good results. In the first experiment, our method provided an improvement over omitting orientation distortion compensation entirely, since sensor alignment was similar in calibration and experiment. It is likely, however, that distortion still plays a role in gaze point errors. In the second experiment, we found even lower (than that reported in the Results section) mean gaze point error when magnetic tracker measurements were corrected for location distortion but not for orientation distortion. As the magnetic tracker was rotated on its side when attached to the headgear of the head-mounted sight (Figure 2), it was no longer in the relative orientation needed to see a benefit from our compensation for orientation distortion. Expanding the distortion compensation calibration procedure to measure the sensor at multiple orientations for each grid location sampled could further reduce error due to distortion. This change would increase both the number of data points required and the time needed to perform the initial data collection procedure, but it would not impact per-participant calibration demands.

The geometric calibration algorithm seeks a T_O←B and T_S←E yielding a local minimum in gaze point error, starting from the initial guess derived from hand measurements of these parameters. Control experiments were performed to verify that no other, potentially lower, minima could be found near the minima arrived at by calibration. Although the minimization procedure failed to converge for intentionally inconsistent initial guesses, tests conducted using parameters displaced from hand-measured values by several inches in all directions all still converged on the same LoPG. Thus, uncorrected distortion in magnetic tracker measurements was most likely the leading cause of gaze point error.

Comparison between points of gaze computed with (Figure 6D) and without (Figure 6C) the geometric calibration step shows that attempts to hand measure the spatial relationships T_O←B and T_S←E (i.e., without geometric calibration) lead to relatively large gaze point errors. Application of the geometric fitting procedure substantially improves the ability of the model to compute LoPG. Although it is not surprising that better results can be obtained when calibration is performed, it is interesting to note that the parameters resulting from calibration deviated considerably from hand-measured parameters. In fact, the optimized parameters comprising T_S←E indicated spatial relationships that were clearly inconsistent with the actual configuration of the laser and magnetic sensor. For example, parameters producing the least mean gaze point error contained values for the distance between the magnetic sensor and the LoPG that were several inches greater than physical measurements indicated.

Figure 9. Error in computed point of gaze when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the second experiment (with a human participant). The plot shows mean gaze point error across all 100 data points (inches) against the number of data points in the calibration set, with the mean across all orderings and its 95% confidence interval. Mean gaze point error was reduced to less than 2 in. (about 2°) in about 12 directed gazes.


Although there is some imprecision (we estimate about a quarter of an inch) in measuring the straight-line distance between eye and magnetic sensor, this imprecision alone cannot explain the difference. Since calibration performed with data from the human participant seemed less prone to this inconsistency, we suspect that the constraints imposed on the movement of the laser and sensor by the tripod head may be responsible. In positioning the laser to align it with calibration targets, the sensor remained within 47° of level, and "roll" of the sensor could not be performed. These constraints lead to less variation in the calibration data and, in turn, may have underconstrained the fitting procedure used in geometric calibration.

A limitation of the geometric calibration procedure described here is its inability to locate the eye along the LoPG. Although this limitation does not impact the ability of the system to predict points of gaze on the screen, it does impact the veracity of the model parameters. Although the geometric model used specifies the eye as a coordinate frame with an origin located some distance from the magnetic tracker sensor, the geometric calibration procedure constrains only the x-axis of this coordinate frame. This results in an infinite number of possible sets of model parameters that still predict the same point of gaze on the screen (i.e., the screen intersection remains the same even if the eye were to slide forward or back along the LoPG, or to rotate about its x-axis). This means that the parameters making up T_S←E found by the fitting procedure correspond to one possible location of the eye along the LoPG. Control experiments showed that the fitted location of the eye along the LoPG depended on the initial guess used for the fitting procedure, even though these alternate initial guesses still yielded the same points of gaze.5 If the intermediate results of the model are to be used, for example, for computing the location of the eye relative to the screen for placement of a camera in a virtual environment, the inability of this system to precisely locate the eye might be overcome by performing an additional calibration. For example, a calibration similar to the one used in this article, but requiring the participant to place the eye in other, nonprimary positions of gaze, could be used to find actual eye location. This technique is currently being investigated.

REFERENCES

Allison, R. S., Eizenman, M., & Cheung, B. S. K. (1996). Combined head and eye tracking system for dynamic testing of the vestibular system. IEEE Transactions on Biomedical Engineering, 43, 1073-1082.

Asai, K., Osawa, N., Takahashi, H., & Sugimoto, Y. Y. (2000, March). Eye mark pointer in immersive projection display. Paper presented at the IEEE Virtual Reality 2000 Conference, New Brunswick, NJ.

Atchison, D. A., & Smith, G. (2000). Optics of the human eye. Oxford: Butterworth-Heinemann.

Barabas, J., Giorgi, R. G., Goldstein, R. B., Apfelbaum, H., Woods, R. L., & Peli, E. (2003, May). Wide field 3D gaze tracking system. Paper presented at the Association for Research in Vision and Ophthalmology Meeting, Fort Lauderdale, FL.

Blood, E. (1990). U.S. Patent No. 4,945,305. Washington, DC: U.S. Patent & Trademark Office.

Bryson, S. (1992, February). Measurement and calibration of static distortion of position data from 3D trackers. Paper presented at SPIE Stereoscopic Displays & Applications III, San Jose.

Caldwell, R. T., Berbaum, K. S., & Borah, J. (2000). Correcting errors in eye-position data arising from the distortion of magnetic fields by display devices. Behavior Research Methods, Instruments, & Computers, 32, 572-578.

Day, J. S., Dumas, G. A., & Murdoch, D. J. (1998). Evaluation of a long-range transmitter for use with a magnetic tracking device in motion analysis. Journal of Biomechanics, 31, 957-961.

Duchowski, A., Medlin, E., Cournia, N., Murphy, H., Gramopadhye, A., Nair, S., Vorah, J., & Melloy, B. (2002). 3-D eye movement analysis. Behavior Research Methods, Instruments, & Computers, 34, 573-591.

Ikits, M., Brederson, J. D., Hansen, C. D., & Hollerbach, J. M. (2001, March). An improved calibration framework for electromagnetic tracking devices. Paper presented at Virtual Reality 2001, IEEE, Yokohama.

Janin, A., Mizell, D., & Caudell, T. (1993, September). Calibration of head-mounted displays for augmented reality applications. Paper presented at the Virtual Reality Annual International Symposium, Seattle.

Kindratenko, V. (1999). Calibration of electromagnetic tracking devices. Virtual Reality: Research, Development, & Applications, 4, 139-150.

Kindratenko, V. (2000). A survey of electromagnetic position tracker calibration techniques. Virtual Reality: Research, Development, & Applications, 5, 169-182.

King, P. (1995). Integration of helmet-mounted displays into tactical aircraft. Proceedings of the Society for Information Display, 26, 663-668.

Kuipers, J. B. (1999). Quaternions and rotation sequences: A primer with applications to orbits, aerospace, and virtual reality. Princeton, NJ: Princeton University Press.

Leigh, R. J., & Zee, D. S. (1983). The neurology of eye movements. Philadelphia: Davis.

Livingston, M. A., & State, A. (1997). Magnetic tracker calibration for improved augmented reality registration. Presence: Teleoperators & Virtual Environments, 6, 532-546.

Nixon, M. A., McCallum, B. C., Fright, W. R., & Price, N. B. (1998). The effects of metals and interfering fields on electromagnetic trackers. Presence, 7, 204-218.

Southard, D. A. (1995). Viewing model for virtual environment displays. Journal of Electronic Imaging, 4, 413-420.

Young, L. R., & Sheena, D. (1975). Survey of eye movement recording methods. Behavior Research Methods & Instrumentation, 7, 397-429.

NOTES

1. This position is used in projected virtual reality displays to determine how to position the virtual camera when generating an image of a scene.

2. Even if the tracker is not aligned exactly with the case of the transmitter, the correction polynomials (Equations 1 and 2) we used include terms to account for such an offset. These offsets are contained in the constant (i = 0) and linear (i = 1) terms of the polynomials.

3. Two linear systems were constructed from 192 six-variable measurements. Each of the two systems contained 192 × 3 = 576 knowns and 105 unknowns.

4. It should be noted that the calibration procedure described here does not completely constrain the model parameters used. Since no measurement is made of the on-screen orientation of the point of gaze, or of the distance from eye to screen during a sighting, neither of these can be predicted by the model. As a result, the best-fitting model parameters describing T_S←E may converge on any one of a line of solutions corresponding to possible locations of the eye along the LoPG. Each of these eye locations predicts the same LoPG location on the screen. It should be possible to compute correct eye position with a calibration procedure that includes eye-in-head rotation.

5. For this reason, we have chosen to report errors in inches along the screen surface, since angular measures of error are more sensitive to the location of the eye along the LoPG. Where angular error is reported, we estimated the eye–screen distance to be the distance between the magnetic sensor and the screen.


ARCHIVED MATERIALS

The following materials associated with this article are retrievable from the Psychonomic Society's Norms, Stimuli, and Data archive, http://www.psychonomic.org/archive/.

To access the above files or links, search the archive for this article using the journal (Behavior Research Methods, Instruments, & Computers), the first author's name (Barabas), and the publication year (2004).

FILE: Barabas-BRMIC-2004.zip
DESCRIPTION: The compressed archive file contains seven files. The files are:

README.txt, containing a description of scripts and data included in the archive (a 3K text file). This file can be renamed to readme.m and executed as a Matlab script to reproduce some of the results reported in this manuscript.

barabas2004LoPG.mat, a Matlab workspace file containing magnetic tracker data collected in the verification experiments described in this manuscript (a 32K binary file).

findcorrectioncoeffs.m, a Matlab function for finding the distortion compensation coefficients for Equations 1 and 2 (a 5K Matlab M-file).

correctdistortion.m, a Matlab function for performing distortion compensation on magnetic tracker data using Equations 1 and 2 (a 4K Matlab M-file).

findmodelparams.m, a Matlab function for finding the parameters of the LoPG model from data collected during a series of directed gazes (a 3K Matlab M-file).

intersecterror.m, a Matlab function for finding gaze point error for a set of directed gazes (a 5K Matlab M-file).

trackereulerstomtx4.m, a Matlab function that computes a transformation matrix representation of a relationship between coordinate frames via Equation 4 (a 3K Matlab M-file).

AUTHOR'S WEB SITE: http://www.eri.harvard.edu/faculty/peli/index.html

AUTHOR'S E-MAIL ADDRESS: barabas@mit.edu

(Manuscript received July 18, 2003; revision accepted for publication July 23, 2004.)


In our walking simulator environment, we constructed a system of pegboards and stands to provide a set of known locations for placement of the magnetic sensor during this procedure. The pegboard system, illustrated in Figure 4, was constructed from 24 × 48 in. sheets of predrilled 0.25-in.-thick pegboard mounted on a wooden frame. Screws used in the construction of the frame were replaced with glue, since we found them to be causing small local distortions in magnetic tracker readings. The sensor stands were constructed from sections of 4-in.-diameter PVC pipe with flat pegboard caps on each end. Each stand held the magnetic tracker sensor rigidly on one end cap and had pegs for attachment to the pegboard on the other. The sensor was positioned so that, when the stands were engaged with the pegboard, the sensor's x-axis (see Figure 4) pointed toward the screen and its z-axis pointed toward the floor. In this orientation, the magnetic tracker reported azimuth, elevation, and roll of the sensor (θo, φo, and ψo) to all be near zero (these angles deviated from zero by a few degrees due to distortion). Several pipe lengths were used for measurements at varying heights (0, 6, 12, and 18 in.) above the pegboard plane. Other devices for placing a sensor at known locations have also been developed, including a fixture made entirely of Plexiglas (Caldwell et al., 2000) and an optical tracking system (Ikits, Brederson, Hansen, & Hollerbach, 2001).

The pegboard frame was placed in our tracking environment and was mechanically squared with the case of the magnetic transmitter. The location of the pegboard grid with respect to the magnetic transmitter was then measured.2 Readings from the magnetic tracker were taken over both a sparse three-dimensional grid of 64 locations spanning 45 × 52 × 18 in. (see Figure 5) and a denser, smaller grid of 128 locations covering the central 32 × 40 × 18 in. of the same space. The magnetic tracker readings from both grids were combined to form the data set of reported locations and orientations. Using Matlab, we performed linear least-squares fits on the differences between reported and pegboard-measured data sets to find the 210 coefficients for the distortion correction polynomials.3 An illustration of the magnitude of location distortion for our tracker is provided in Figure 5.
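The least-squares step can be sketched as follows. The actual correction polynomials are those of Equations 1 and 2 (implemented in the archived findcorrectioncoeffs.m); the basis terms below are illustrative only, and the arrays reported and actual (192 × 3, inches) are hypothetical placeholders.

% Fit location-correction coefficients by linear least squares.
x = reported(:,1); y = reported(:,2); z = reported(:,3);
A = [ones(size(x)) x y z x.*y x.*z y.*z x.^2 y.^2 z.^2];  % example polynomial basis
coeffs = A \ (actual - reported);     % one column of coefficients per output dimension
corrected = reported + A * coeffs;    % apply the fitted correction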

Geometric calibration. The second calibration step computes the geometric model parameters describing the spatial relationships T_O←B and T_S←E. These parameters, corresponding to the alignment of the screen with the tracker transmitter (T_O←B) and the position of the magnetic tracker sensor relative to the eye (T_S←E), are difficult to measure directly, and T_S←E changes from participant to participant, or whenever the headgear is adjusted. For these reasons, we developed a calibration procedure to quickly ascertain these model parameters.

Geometric calibration of the LoPG-tracking system is performed by taking magnetic tracker recordings while participants perform a series of constrained gazes. For each gaze, the participant, using his or her left eye, aligns the head-mounted sight with a target dot projected on the screen. The participant repeats this procedure from a series of standing locations within the desired usage space.

Figure 4. Pegboard apparatus for measuring distortion in magnetic tracker measurements. Two such pegboard panels were mounted on a wooden frame to calibrate a volume measuring 45 × 52 × 18 in. Stands of various heights were placed at different locations across the base pegboard to provide wide coverage of the volume.


After a number of such gazes, a pair of T_O←B and T_S←E parameters is computed by the Gauss–Newton nonlinear least-squares fitting method, as will be described in the next section.

Geometric calibration algorithm. The fitting procedure seeks the set of model parameters that, when combined with the geometric model in our LoPG-tracking algorithm, best predict on-screen points of gaze from distortion-compensated magnetic tracker readings taken over a set of directed gazes. Our model computes Int, the two-dimensional on-screen point of gaze, expressed as

Int = F(P, M_B←S),     (7)

where F is the function that computes point of gaze from a given (corrected) magnetic tracker reading and a set of model parameters (Equations 5 and 6), P is the set of 12 model parameters characterizing, via Equation 4, the spatial relationships T_O←B and T_S←E (the 12 parameters describe, for each of the two transformations, a three-dimensional displacement between coordinate frames and three Euler angles describing the relative orientations of the coordinate frames), and M_B←S is the magnetic tracker measurement (three distances and three angles) corrected for distortion by Equations 1 and 2. We assess the quality of the fit found by the geometric calibration procedure by the size of the average error across all gazes in a calibration. This mean gaze point error is expressed as the sum across all calibration gazes of the distances along the screen between the actual calibration targets and the model-predicted points of gaze. The fitting procedure seeks model parameters P that minimize this sum, given a set of magnetic tracker measurements M_B←S, a set of corresponding screen intersections Int, and an initial "guess" for P. Starting with the given guess for P, the fitting procedure iteratively adjusts these parameters until a local minimum in mean gaze point error is found.4 Minimization is performed in Matlab using the nlinfit function (from the Statistics Toolbox, an implementation of the Gauss–Newton fitting method). In our implementation, this technique finds a P corresponding to a given set of 20 measurements of Int and M_B←S in a few seconds on a 700-MHz PC.
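In outline, the fit can be set up for nlinfit as below. The helper lopg_intersection, which would implement F of Equation 7 from Equations 4-6, is hypothetical (the archived findmodelparams.m is the authors' implementation); M is an N × 6 array of corrected tracker readings, Int an N × 2 array of on-screen target locations in inches, and P0 the 1 × 12 hand-measured initial guess.

% Gauss-Newton fit of the 12 model parameters with nlinfit (Statistics Toolbox).
model = @(P, M) reshape(lopg_intersection(P, M), [], 1);  % stack predicted x,y intersections
Pfit = nlinfit(M, reshape(Int, [], 1), model, P0);        % minimizes squared screen error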

Experimental Validation

Figure 5. Distortion measured in performing distortion compensation calibration of the magnetic tracker. The plot shows the actual shape of the measured space and the reported shape of the measured space due to distortion, plotted in the x, y, and z coordinates of the magnetic transmitter frame (inches), with the projection screen and the magnetic tracker transmitter indicated. Dark bars on the axes indicate the ranges of physical locations sampled in the calibration procedure. Shade of the surface indicates the magnitude of distortion in location measurements from the magnetic tracker; the darkest areas indicate distortion in location measurement of over 13 in.


To test the calibration and LoPG-tracking techniques presented in this article, we first calibrated the distortion compensation system as described above and then performed two target-sighting experiments. In the first experiment, we performed the geometric calibration procedure using the beam of a laser pointer to represent the line of gaze of the human participant. Doing so allowed the simulated LoPG to be precisely aligned with calibration targets projected on the screen and allowed testing without the variability of human participants' ability to maintain head position. To fix the relative placement of the laser and sensor, both were attached to a small wooden block. The sensor was placed on the block with its x-axis approximately parallel to, and 2 in. above, the laser beam. This alignment enabled us to easily measure the actual spatial relationship between the sensor and the laser for validation of the parameters found by the geometric calibration procedure. (We found it much more difficult to measure the equivalent relationship T_S←E on a human participant.) The wood block holding the sensor and laser was fastened to an aluminum and plastic tripod raised to the approximate height of a human participant's head. To gather data for testing the geometric calibration procedure, the tripod was placed at 10 different locations. These locations were chosen within the range of possible standing positions of our study participants (an area about 36 × 36 in.). For each of the 10 tripod locations, readings from the magnetic tracker were taken, panning and tilting the tripod head so that the laser beam fell on each calibration target. Four of the targets were just inside the corners of the screen, one was placed in the center, and five other locations were arbitrarily chosen. The targets remained in the same locations throughout each experiment. Due to the distance between the sensor and the pivot point of the tripod head, for a given tripod location the actual magnetic sensor locations varied by up to 8 in. as the tripod head was tilted at different angles to align the laser.

In a second experiment, a similar set of 100 directed gazes was performed by a human participant. The participant used the head-mounted sight (Figure 2) to align his LoPG with each of the 10 targets from each of 10 standing locations. The 5 calibration targets that were arbitrarily placed in the first experiment were in different locations for calibration of the human participant.

Data from both experiments were independently used to calibrate the LoPG-tracking system, and the benefits of the distortion compensation and geometric calibration procedures were examined. For the distortion compensation step, we used polynomial coefficients computed from the 192 pegboard locations, as described above. In all cases, we used as an initial guess for the geometric calibration procedure a set of hand-measured parameters that approximated the actual geometry of the system.

RESULTS

For each of the two experiments, collected data consisted of a set of 100 six-variable tracker readings (each reading taken when the laser/LoPG was aligned to fall at the center of an on-screen calibration target), as well as the 100 two-dimensional screen locations of the corresponding calibration targets (10 repetitions of 10 targets). Location and orientation data from the magnetic tracker were passed through the distortion compensation functions (Equations 1 and 2), yielding a second set of 100 corrected tracker readings for each experiment. We used subsets of this body of distortion-corrected data to derive the transformation matrices T_O←B and T_S←E using the geometric calibration procedure described above, and tested these fits against the remaining data. We compared known screen–target locations to point-of-gaze computations made with and without distortion compensation or geometric calibration. We also investigated the behavior of the error as a function of the number of included calibration points by varying the size of the calibration data sets used to compute T_O←B and T_S←E.

Accuracy of Predicted Point of Gaze

Using the distortion-corrected location and orientation measurements from the entire set of 100 laser–target alignments from the first experiment and the geometric calibration procedure, the T_O←B and T_S←E matrices were computed. Equations 5 and 6 were then used to predict point of gaze for each of the 100 laser–target alignments, and mean gaze point error (mean distance along the screen between calibration target and point of gaze found with the LoPG-tracking algorithm) was computed. For the data set from the first experiment, mean error was 1.50 in. (Figure 6D). Error can also be described as the angle between the screen target and the computed point of gaze when viewed from the location of the magnetic sensor. Measured in this way, mean angular error was 1.55°. For reasons described in the Discussion section, we will report the remainder of our mean error results in inches. At our screen–eye distances, 1° corresponds approximately to 1 in. of error.
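The error measure itself is simple to compute; as a sketch, with hypothetical N × 2 arrays of predicted points of gaze and target locations (inches along the screen):

% Mean gaze point error: mean on-screen distance between target and computed gaze point.
gazeErr = sqrt(sum((predicted - targets).^2, 2));
meanGazeErr = mean(gazeErr);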

To examine the effect of distortion compensation on mean gaze point error, we also performed the geometric calibration procedure using raw magnetic tracker measurements not processed by the distortion correction polynomials. As seen in Figure 6B, without distortion compensation, mean gaze point error was significantly larger than with compensation (Wilcoxon signed ranks test, Z = 3.66, p < .001), at 1.93 in. when calibrated using data from the same experiment (an increase of 29%). Without distortion compensation, clear outliers from the clusters of points of gaze can be seen for most of the screen targets in Figure 6B. These outlying points of gaze were all from the tripod location most distant from the magnetic transmitter, where distortion was largest. Distortion compensation provided most benefit for this tripod location.

To examine the effect of the geometric calibration procedure on the accuracy of predicted LoPG, we also computed mean gaze point error across the collected data set using only a hand-measured estimate of T_O←B and T_S←E. Distortion correction was included, but no geometric calibration was performed. As one might expect, without calibration, errors were larger, and this difference was significant (Wilcoxon signed ranks test, Z = 8.09, p < .001). Mean gaze point error using an uncalibrated model was 3.96 in., as can be seen in Figure 6C.

We also examined the ability of the geometric model tocompute points of gaze without both distortion compensa-tion and geometric calibration The hand-measured tOumlBand tSumlE and raw magnetic tracker readings were used inEquations 5 and 6 Mean gaze point error was 566 in (Fig-ure 6A) The addition of either geometric calibration ordistortion compensation yielded a significant reduction inerror (Wilcoxon signed ranks test Z 61 p 001)

Conducting the same analysis for the data from thesecond experiment we found mean gaze point error tobe lower than that in the first experiment when geomet-ric calibration was performed When both distortioncompensation and geometric calibration were used meanerror was 072 in When geometric calibration was per-formed on data not corrected for distortion mean error was106 in (Wilcoxon signed rank test Z 520 p 001)Reasons for the reduced errors in the second experiment

are proposed in the Discussion section Without geometriccalibration error was much larger (Wilcoxon signed ranktest Z 81 p 001) since we were unable to measuretSumlE with the same accuracy for the second experimentWith and without distortion correction mean errors were398 and 426 in respectively

Effect of sensor location on error Using the rawdata (not compensated for distortion) from the first ex-periment and computing point of gaze without perform-ing geometric calibration the farther the sensor wasfrom the transmitter the worse the error (Spearman cor-relation r 46 p 001) and that error was manifestas an increasing error in the y-dimension (Spearman cor-relation r 50 p 001 Figure 7A) When both thedistortion correction and the geometric calibration wereimplemented the errors were more evenly distributed(Figure 7B) resulting in no significant correlation withdistance from the transmitter (Spearman correlation r 04 p 71) There was however a significant tendencyfor the errors to increase toward the screen (ie in thex-dimension Spearman correlation r 22 p 003)

Figure 6 Effects of distortion compensation (panels C and D) and geometric calibration (panels B and D) oncomputation of points of gaze Data were collected with a tripod-mounted laser pointer used to simulate line ofprimary gaze (LoPG) Circular marks indicate screen location of calibration targets Crosses are laser screen in-tersections (points of gaze) computed by LoPG-tracking algorithm Within each cluster each cross represents ameasurement from a different tripod location Rectangular outline shows location of projection screen Screenoutline and targets appear at hand-measured location (panels A and C) and location found by geometric cali-bration (panels B and D)

766 BARABAS ET AL

This may be because for tripod locations closer to thescreen an angular error (due to distortion for example)of the same size results in a larger error distance alongthe screen

Short calibration In order to use our LoPG-trackingsystem as a tool for gathering data from untrained studyparticipants it is advantageous to calibrate the systemusing as few data points as necessary (eg to reduce fa-tigue) To find the number of calibration data pointsneeded to accurately predict point of gaze we createdcalibration sets from subsets of the 100 measurementscollected in each experiment We used these small cali-bration sets to compute tOumlB and tSumlE and then usedthose computed parameters to calculate screen intersec-tions across the full set of 100 measurements We thencompared these computed intersections with the loca-tions of the actual locations of the on-screen targets Weperformed this technique for calibration sets of varyingsizes across 100 random orderings of the 100 data pointscollected in each experiment For the first experimentwe found that after about 20 points the geometric cali-

bration system converges to a point of gaze with a meangaze point error of less than 2 in with only small im-provements attained by including additional points in thecalibration set (Figure 8)

For the second experiment overall gaze point errorwas lower when calibration was performed with the en-tire data set For this data set 2 in of error could bereached with about 12 data points (Figure 9) Calibra-tions performed with more than 15 data points providedonly a small additional benefit

In the processing of calibration data from both exper-iments geometric calibration occasionally failed com-pletely producing larger mean gaze point error than ifgeometric calibration had not been performed at allThese failures occurred with very small calibration setswhen distortion-related errors provided inconsistent con-straints for the model Failures also occurred when setsincluded targets from only one part of the screen insuf-ficiently constraining the model Fitting failures neveroccurred with sets of 10 points or more and never occurredwhen distortion compensation was performed In all the

Figure 7 The mean gaze point error at each of the 10 tripod locations on the left using the raw data from themagnetic tracker collected in the first experiment (see Figure 6A) and on the right using the same data correctedusing both the distortion compensation and the geometric calibration (Figure 6D) The tripod and laser pointerwere directed toward the projection screen which is shown as the tall white rectangle on the right of each panelThe tracker transmitter is shown as the black square Each circle represents one of the tripod locations with ra-dius equal to the mean gaze point error from that location Mean gaze point error for each location is also shownwith the corresponding circle

TRACKING THE LINE OF PRIMARY GAZE 767

cases in which calibration failed the addition of one ortwo calibration points to the set was enough to reversethe failure In general better fits were obtained from cal-ibration sets with greater variety in sampled tracker ori-entations and screen locations

DISCUSSION

The LoPG-tracking model and calibration techniquespresented here succeed at accurately predicting the on-screen point of gaze while participants look through ahead-mounted sight The validation experiments showthat in agreement with Caldwell et al (2000) the use ofpolynomial distortion compensation can improve the ac-curacy of gaze point calculations (Figures 6C and 6D)In addition benefit was also seen from geometric cali-bration The techniques presented for tracking LoPGprovide a foundation for future gaze-tracking systems

An interesting result of performing calibration bothwith and without distortion compensation was that al-though point of gaze was predicted more accurately withdistortion compensation the tracking system still per-formed relatively well in the presence of uncompensateddistortion provided that geometric calibration was per-formed In the presence of measurement errors causedby distortion the minimization procedure gives modelparameters that may not correspond well to the actualspatial relationships being modeled but interact with the

measurement distortion to produce reasonably accuratescreen intersection predictions If somewhat larger LoPG-tracking errors can be tolerated one might be able toachieve reasonable results by omitting distortion compen-sation altogether Omission of distortion compensationwould likely result in much greater error however for headpositions or orientations outside the range of calibration

Possible sources of gaze point errors remaining whenboth geometric calibration and distortion compensationwere performed include measurement noise inaccura-cies in magnetic tracker measurements not corrected bythe distortion compensation system inaccuracy of themodel parameters found by the geometric calibrationsystem or some combination of these Tests of stabilityin the tracker measurements showed that readings fromthe tracker taken milliseconds or hours apart agreedboth differ from reference measurements by only a fewhundredths of an inch and by a few tenths of a degreeThis small measurement noise is clearly not enough toexplain the gaze point errors which are orders of mag-nitude greater

Our method makes the simplifying assumption thatthe distortions in magnetic tracker orientation readingsdepend on location of the magnetic sensor only and noton the interaction between sensor location and orienta-tion Therefore it is likely that our method did not com-pletely correct for the magnetic distortion When themagnetic sensor is kept in a similar orientation in both

mean across all orderings95 confidence interval

14

12

10

8

6

4

2

02 10 20 30 40 50

Number of data points in calibration set

Mea

n ga

ze p

oint

err

or a

cros

s al

l 100

dat

a po

ints

(in

ches

)

Figure 8 Error in computed point of gaze derived when model was calibrated with in-creasing numbers of calibration points across 100 random orderings of the data set collectedin the first experiment (using a laser pointer) Mean gaze point error was reduced to within2 in (about 2ordm) after about 23 directed gazes

768 BARABAS ET AL

distortion compensation calibration and actual experi-mental use this approach is more likely to produce goodresults In the first experiment our method provided animprovement over omitting orientation distortion com-pensation entirely since sensor alignment was similar incalibration and experiment It is likely however that dis-tortion still plays a role in gaze point errors In the sec-ond experiment we found even lower (than that reportedin the Results section) mean gaze point error when mag-netic tracker measurements were corrected for locationdistortion but not for orientation distortion As the mag-netic tracker was rotated on its side when attached to theheadgear of the head-mounted sight (Figure 2) it was nolonger in the relative orientation needed to see a benefitfrom our compensation for orientation distortion Ex-panding the distortion compensation calibration proce-dure to measure the sensor at multiple orientations foreach grid location sampled could further reduce errordue to distortion This change would increase both thenumber of data points required and the time needed toperform the initial data collection procedure but it wouldnot impact per-participant calibration demands

The geometric calibration algorithm seeks a tOumlB andtSumlE yielding a local minimum in gaze point error start-ing from the initial guess derived from hand measure-ments of these parameters Control experiments wereperformed to verify that no other potentially lower min-ima could be found near the minima arrived at by cali-

bration Although the minimization procedure failed toconverge for intentionally inconsistent initial guessestests conducted using parameters displaced from hand-measured values by several inches in all directions allstill converged on the same LoPG Thus uncorrecteddistortion in magnetic tracker measurements was mostlikely the leading cause of gaze point error

Comparison between points of gaze computed with(Figure 6D) and without (Figure 6C) the geometric cali-bration step shows that attempts to hand measure the spa-tial relationships tOumlB and tSumlE (ie without geometriccalibration) lead to relatively large gaze point errors Ap-plication of the geometric fitting procedure substantiallyimproves the ability of the model to compute LoPG Al-though it is not surprising that better results can be ob-tained when calibration is performed it is interesting tonote that the parameters resulting from calibration deviatedconsiderably from hand-measured parameters In factthe optimized parameters comprising tSumlE indicatedspatial relationships that were clearly inconsistent withthe actual configuration of the laser and magnetic sensorFor example parameters producing the least mean gazepoint error contained values for the distance between themagnetic sensor and the LoPG that were several inchesgreater than physical measurements indicated Althoughthere is some imprecision (we estimate about a quarter ofan inch) in measuring straight-line distance between eyeand magnetic sensor this imprecision alone cannot ex-

mean across all orderings95 confidence interval

14

12

10

8

6

4

2

02 10 20 30 40 50

Number of data points in calibration set

Mea

n ga

ze p

oint

err

or a

cros

s al

l 100

dat

a po

ints

(in

ches

)

Figure 9 Error in computed point of gaze derived when model was calibrated with in-creasing numbers of calibration points across 100 random orderings of the data set collectedin the second experiment (with a human participant) Mean gaze point error was reduced toless than 2 in (about 2ordm) in about 12 directed gazes

TRACKING THE LINE OF PRIMARY GAZE 769

plain the difference Since calibration performed withdata from the human participant seemed less prone tothis inconsistency we suspect that the constraints im-posed on the movement of the laser and sensor by the tri-pod head may be responsible In positioning the laser toalign it with calibration targets the sensor remainedwithin 47ordm of level and ldquorollrdquo of the sensor could not beperformed These constraints lead to less variation in thecalibration data and in turn may have underconstrainedthe fitting procedure used in geometric calibration

A limitation of the geometric calibration procedure de-scribed here is the inability of the calibration procedureto locate the eye along the LoPG Although this limita-tion does not impact the ability of the system to predictpoints of gaze on the screen it does impact the veracityof the model parameters Although the geometric modelused specifies the eye as a coordinate frame with an ori-gin located some distance from the magnetic tracker sen-sor the geometric calibration procedure constrains onlythe x-axis of this coordinate frame This results in an in-finite number of possible sets of model parameters thatstill predict the same point of gaze on the screen (iescreen intersection remains the same even if the eye wereto slide forward or back along the LoPG or to rotate aboutits x-axis) This means that the parameters making uptSumlE found by the fitting procedure correspond to onepossible location of the eye along the LoPG Control ex-periments showed that the fitted location of the eye alongthe LoPG depended on the initial guess used for the fit-ting procedure even though these alternate initialguesses still yielded the same points of gaze5 If the in-termediate results of the model are to be used for exam-ple for computing the location of the eye relative to thescreen for placement of a camera in a virtual environ-ment the inability of this system to precisely locate theeye might be overcome by performing an additional cal-ibration For example a calibration similar to the oneused in this article but requiring the participant to placethe eye in other nonprimary positions of gaze could beused to find actual eye location This technique is cur-rently being investigated

REFERENCES

Allison R S Eizenman M amp Cheung B S K (1996) Combinedhead and eye tracking system for dynamic testing of the vestibularsystem IEEE Transactions on Biomedical Engineering 43 1073-1082

Asai K Osawa N Takahashi H amp Sugimoto Y Y (2000 March)Eye mark pointer in immersive projection display Paper presented atthe IEEE Virtual Reality 2000 Conference New Brunswick NJ

Atchison D A amp Smith G (2000) Optics of the human eye OxfordButterworth-Heinemann

Barabas J Giorgi R G Goldstein R B Apfelbaum H WoodsR L amp Peli E (2003 May) Wide field 3D gaze tracking systemPaper presented at the Association for Research in Vision and Oph-thalmology Meeting Fort Lauderdale FL

Blood E (1990) US Patent No 4945305 Washington DC USPatent amp Trademark Office

Bryson S (1992 February) Measurement and calibration of staticdistortion of position data from 3D trackers Paper presented at SPIEStereoscopic Displays amp Applications III San Jose

Caldwell R T Berbaum K S amp Borah J (2000) Correcting er-rors in eye-position data arising from the distortion of magnetic fieldsby display devices Behavior Research Methods Instruments ampComputers 32 572-578

Day J S Dumas G A amp Murdoch D J (1998) Evaluation of along-range transmitter for use with a magnetic tracking device in mo-tion analysis Journal of Biomechanics 31 957-961

Duchowski A Medlin E Cournia N Murphy H Gramopad-hye A Nair S Vorah J amp Melloy B (2002) 3-D eye move-ment analysis Behavior Research Methods Instruments amp Comput-ers 34 573-591

Ikits M Brederson J D Hansen C D amp Hollerbach J M(2001 March) An improved calibration framework for electromag-netic tracking devices Paper presented at the Virtual Reality 2001Proceedings IEEE Yokohama

Janin A Mizell D amp Caudell T (1993 September) Calibrationof head-mounted displays for augmented reality applications Paperpresented at the Virtual Reality Annual International SymposiumSeattle

Kindratenko V (1999) Calibration of electromagnetic tracking de-vices Virtual Reality Research Development amp Applications 4139-150

Kindratenko V (2000) A survey of electromagnetic position trackercalibration techniques Virtual Reality Research Development ampApplications 5 169-182

King P (1995) Integration of helmet-mounted displays into tacticalaircraft Proceedings of the Society for Information Display 26 663-668

Kuipers J B (1999) Quaternions and rotation sequences A primerwith applications to orbits aerospace and virtual reality PrincetonNJ Princeton University Press

Leigh R J amp Zee D S (1983) The neurology of eye movementsPhiladelphia Davis

Livingston M A amp State A (1997) Magnetic tracker calibrationfor improved augmented reality registration Presence Teleoperatorsamp Virtual Environments 6 532-546

Nixon M A McCallum B C Fright W R amp Price N B (1998)The effects of metals and interfering fields on electromagnetic track-ers Presence 7 204-218

Southard D A (1995) Viewing model for virtual environment dis-plays Journal of Electronic Imaging 4 413-420

Young L R amp Sheena D (1975) Survey of eye movement recordingmethods Behavior Research Methods amp Instrumentation 7 397-429

NOTES

1 This position is used in projected virtual reality displays to deter-mine how to position the virtual camera when generating an image of ascene

2 Even if the tracker is not aligned exactly with the case of the trans-mitter the correction polynomials (Equations 1 and 2) we used includeterms to account for such an offset These offsets are contained in theconstant (i 0) and linear (i 1) terms of polynomials

3 Two linear systems were constructed from 192 six-variable mea-surements Each of the two systems contained 192 3 576 knownsand 105 unknowns

4 It should be noted that the calibration procedure described heredoes not completely constrain the model parameters used Since no mea-surement of orientation on the screen of the point of gaze or distancefrom eye to screen during a sighting is made neither of these can be pre-dicted by the model As a result the best-fitting model parameters de-scribing tSumlE may converge on any one of a line of solutions corre-sponding to possible locations of the eye along the LoPG Each of theseeye locations predicts the same LoPG location on the screen It shouldbe possible to compute correct eye position with a calibration proce-dure that includes eye-in-head rotation

5 For this reason we have chosen to report errors in inches along thescreen surface since angular measures of error are more sensitive to lo-cation of the eye along the LoPG Where angular error is reported weestimated the eyendashscreen distance to be the distance between the mag-netic sensor and the screen

770 BARABAS ET AL

ARCHIVED MATERIALS

The following materials associated with this article are retrievablefrom the Psychonomic Societyrsquos Norms Stimuli and Data archivehttpwwwpsychonomicorgarchive

To access the above files or links search the archive for this articleusing the journal (Behavior Research Methods Instruments amp Com-puters) the first authorrsquos name (Barabas) and the publication year(2004)

FILE Barabas-BRMIC-2004zipDESCRIPTION The compressed archive file contains seven files The

files areREADMEtxt containing a description of scripts and data included

in the archive (a 3K text file) This file can renamed to readmem andexecuted as a Matlab script to reproduce some of the results reported inthis manuscript

barabas2004LoPGmat a Matlab workspace file containing mag-netic tracker data collected in the verification experiments described inthis manuscript (a 32K binary file)

findcorrectioncoeffsm a Matlab function for finding the distortioncompensation coefficients for Equations 1 and 2 (a 5K Matlab M-file)

correctdistortionm a Matlab function for performing distortioncompensation on magnetic tracker data using Equations 1 and 2 (a 4KMatlab M-file)

findmodelparamsm a Matlab function for finding the parameters ofthe LoPG model from data collected during a series of directed gazes (a3K Matlab M-file)

intersecterrorm a Matlab function for finding gaze point error for aset of directed gazes (a 5K Matlab M-file)

trackereulerstomtx4m a Matlab function that computes a transfor-mation matrix representation of a relationship between coordinateframes via Equation 4 (a 3K Matlab M-file)

AUTHORrsquoS WEB SITE httpwwweriharvardedufacultypeliindexhtml

AUTHORrsquoS E-MAIL ADDRESS barabasmitedu

(Manuscript received July 18 2003revision accepted for publication July 23 2004)

TRACKING THE LINE OF PRIMARY GAZE 763

After a number of such gazes a pair of tOumlB and tSumlEparameters is computed by the GaussndashNewton nonlinearleast-squares fitting method as will be described in thenext section

Geometric calibration algorithm The fitting proce-dure seeks the set of model parameters that when com-bined with the geometric model in our LoPG-trackingalgorithm best predict on-screen points of gaze fromdistortion-compensated magnetic tracker readings takenover a set of directed gazes Our model computes Int thetwo-dimensional on-screen point of gaze expressed as

Int F(P MBumlS) (7)

where F is the function that computes point of gaze froma given (corrected) magnetic tracker reading and a set ofmodel parameters (Equations 5 and 6) P is the set of 12model parameters characterizing via Equation 4 thespatial relationships tOumlB and tSumlE (the 12 parametersdescribe for each of the two transformations a three-dimensional displacement between coordinate framesand three Euler angles describing relative orientations ofthe coordinate frames) MBumlS is the magnetic trackermeasurement (three distances and three angles) cor-

rected for distortion by Equations 1 and 2 We assessquality of the fit found by the geometric calibration pro-cedure by the size of the average error across all gazes ina calibration This mean gaze point error is expressed asthe sum across all calibration gazes of the distancesalong the screen between the actual calibration targetsand the model-predicted points of gaze The fitting pro-cedure seeks model parameters P that minimize thissum given a set of magnetic tracker measurements MBumlSa set of corresponding screen intersections Int and aninitial ldquoguessrdquo for P Starting with the given guess for Pthe fitting procedure iteratively adjusts these parametersuntil a local minimum in mean gaze point error is found4Minimization is performed in Matlab using the nlinfitfunction (from the statistics toolbox an implementationof the GaussndashNewton fitting method) In our implemen-tation this technique finds a P corresponding to a givenset of 20 measurements of Int and MBumlS in a few sec-onds on a 700-MHz PC

Experimental ValidationTo test the calibration and LoPG-tracking techniques

presented in this article we first calibrated the distortion

y coordinate of magnetic transmitter frame (inches)x coordinate of magnetic transmitter frame (inches)

z coordinate of magnetic transm

itter frame

Actual shape ofmeasured space

Reported shape of measuredspace due to distortion

Projection

Screen

ndash90ndash80

ndash70ndash60

ndash50ndash40

ndash30ndash20

ndash100

1040

30

20

10

0

ndash10

ndash20

ndash30ndash40

ndash30

ndash20

ndash10

0

10

20

Magnetic TrackerTransmitter

Figure 5 Distortion measured in performing distortion compensation calibration of magnetic trackerAxis units are inches in the coordinate system of the magnetic tracker transmitter Dark bars on axes in-dicate the ranges of physical locations sampled in the calibration procedure Shade of surface indicatesmagnitude of distortion in location measurements from the magnetic tracker The darkest areas indicatedistortion in location measurement of over 13 in

764 BARABAS ET AL

compensation system as described above and then per-formed two target-sighting experiments In the first ex-periment we performed the geometric calibration pro-cedure using the beam of a laser pointer to represent theline of gaze of the human participant Doing so allowedthe simulated LoPG to be precisely aligned with calibra-tion targets projected on the screen and allowed testingwithout the variability of human participantsrsquo ability tomaintain head position To fix the relative placement ofthe laser and sensor both were attached to a small woodenblock The sensor was placed on the block with its x-axisapproximately parallel to and 2 in above the laser beamThis alignment enabled us to easily measure the actualspatial relationship between the sensor and the laser forvalidation of the parameters found by the geometric cali-bration procedure (We found it much more difficult tomeasure the equivalent relationship tSumlE on a humanparticipant) The wood block holding the sensor and laserwas fastened to an aluminum and plastic tripod raised tothe approximate height of a human participantrsquos head Togather data for testing the geometric calibration proce-dure the tripod was placed at 10 different locations Theselocations were chosen within the range of possible stand-ing positions of our study participants (an area about 36 36 in) For each of the 10 tripod locations readings fromthe magnetic tracker were taken panning and tilting thetripod head so that the laser beam fell on each calibrationtarget Four of the targets were just inside the corners ofthe screen one was placed in the center and five other lo-cations were arbitrarily chosen The targets remained inthe same locations throughout each experiment Due tothe distance between the sensor and the pivot point of thetripod head for a given tripod location the actual magneticsensor locations varied by up to 8 in as the tripod headwas tilted at different angles to align the laser

In a second experiment, a similar set of 100 directed gazes was performed by a human participant. The participant used the head-mounted sight (Figure 2) to align his LoPG with each of the 10 targets from each of 10 standing locations. The 5 calibration targets that were arbitrarily placed in the first experiment were in different locations for calibration of the human participant.

Data from both experiments were independently used to calibrate the LoPG-tracking system, and the benefits of the distortion compensation and geometric calibration procedures were examined. For the distortion compensation step, we used polynomial coefficients computed from the 192 pegboard locations, as described above. In all cases, we used as an initial guess for the geometric calibration procedure a set of hand-measured parameters that approximated the actual geometry of the system.

RESULTS

For each of the two experiments, collected data consisted of a set of 100 six-variable tracker readings (each reading taken when the laser/LoPG was aligned to fall at the center of an on-screen calibration target) as well as the 100 two-dimensional screen locations of the corresponding calibration targets (10 repetitions of 10 targets). Location and orientation data from the magnetic tracker were passed through the distortion compensation functions (Equations 1 and 2), yielding a second set of 100 corrected tracker readings for each experiment. We used subsets of this body of distortion-corrected data to derive the transformation matrices T_{O←B} and T_{S←E} using the geometric calibration procedure described above, and tested these fits against the remaining data. We compared known screen–target locations to point-of-gaze computations made with and without distortion compensation or geometric calibration. We also investigated the behavior of the error as a function of the number of included calibration points by varying the size of the calibration data sets used to compute T_{O←B} and T_{S←E}.

Accuracy of Predicted Point of Gaze

Using the distortion-corrected location and orientation measurements from the entire set of 100 laser–target alignments from the first experiment and the geometric calibration procedure, the T_{O←B} and T_{S←E} matrices were computed. Equations 5 and 6 were then used to predict point of gaze for each of the 100 laser–target alignments, and mean gaze point error (mean distance along the screen between calibration target and the point of gaze found with the LoPG-tracking algorithm) was computed. For the data set from the first experiment, mean error was 1.50 in. (Figure 6D). Error can also be described as the angle between the screen target and the computed point of gaze when viewed from the location of the magnetic sensor. Measured in this way, mean angular error was 1.55°. For reasons described in the Discussion section, we will report the remainder of our mean error results in inches. At our screen–eye distances, 1° corresponds approximately to 1 in. of error.
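As a concrete illustration, the two error measures can be computed as in the following sketch; gaze and targets (both n-by-2 matrices of screen coordinates, in inches) and sensorToScreen (the sensor–screen distance in inches, per note 5) are assumed variables, not part of the original code.

% Per-gaze distance along the screen between predicted and actual targets.
distErr = sqrt(sum((gaze - targets).^2, 2));
meanGazePointError = mean(distErr);                         % reported in inches

% Approximate angular error as seen from the sensor location (see note 5).
meanAngularError = mean(atand(distErr ./ sensorToScreen));  % degrees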

To examine the effect of distortion compensation on mean gaze point error, we also performed the geometric calibration procedure using raw magnetic tracker measurements not processed by the distortion correction polynomials. As seen in Figure 6B, without distortion compensation mean gaze point error was significantly larger than with compensation (Wilcoxon signed ranks test, Z = 3.66, p < .001), at 1.93 in. when calibrated using data from the same experiment (an increase of 29%). Without distortion compensation, clear outliers from the clusters of points of gaze can be seen for most of the screen targets in Figure 6B. These outlying points of gaze were all from the tripod location most distant from the magnetic transmitter, where distortion was largest. Distortion compensation provided the most benefit for this tripod location.
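The paired comparisons reported throughout this section could be reproduced along the lines of the following sketch, in which errWith and errWithout are assumed to hold the 100 per-gaze errors computed with and without distortion compensation for the same directed gazes.

% Wilcoxon signed ranks test on paired per-gaze errors (Statistics Toolbox).
[p, h, stats] = signrank(errWithout, errWith, 'method', 'approximate');
z = stats.zval;   % z statistic of the test; p < .001 indicates significance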

To examine the effect of the geometric calibration procedure on accuracy of predicted LoPG, we also computed mean gaze point error across the collected data set using only a hand-measured estimate of T_{O←B} and T_{S←E}. Distortion correction was included, but no geometric calibration was performed. As one might expect, without calibration errors were larger, and this difference was significant (Wilcoxon signed ranks test, Z = 8.09, p < .001). Mean gaze point error using an uncalibrated model was 3.96 in., as can be seen in Figure 6C.

We also examined the ability of the geometric model to compute points of gaze without either distortion compensation or geometric calibration. The hand-measured T_{O←B} and T_{S←E} and raw magnetic tracker readings were used in Equations 5 and 6. Mean gaze point error was 5.66 in. (Figure 6A). The addition of either geometric calibration or distortion compensation yielded a significant reduction in error (Wilcoxon signed ranks test, Z = 6.1, p < .001).

Conducting the same analysis for the data from the second experiment, we found mean gaze point error to be lower than that in the first experiment when geometric calibration was performed. When both distortion compensation and geometric calibration were used, mean error was 0.72 in. When geometric calibration was performed on data not corrected for distortion, mean error was 1.06 in. (Wilcoxon signed rank test, Z = 5.20, p < .001). Reasons for the reduced errors in the second experiment are proposed in the Discussion section. Without geometric calibration, error was much larger (Wilcoxon signed rank test, Z = 8.1, p < .001), since we were unable to measure T_{S←E} with the same accuracy for the second experiment. With and without distortion correction, mean errors were 3.98 and 4.26 in., respectively.

Effect of sensor location on error. Using the raw data (not compensated for distortion) from the first experiment and computing point of gaze without performing geometric calibration, the farther the sensor was from the transmitter, the worse the error (Spearman correlation, r = .46, p < .001), and that error was manifest as an increasing error in the y-dimension (Spearman correlation, r = .50, p < .001; Figure 7A). When both the distortion correction and the geometric calibration were implemented, the errors were more evenly distributed (Figure 7B), resulting in no significant correlation with distance from the transmitter (Spearman correlation, r = .04, p = .71). There was, however, a significant tendency for the errors to increase toward the screen (i.e., in the x-dimension; Spearman correlation, r = .22, p = .003).

Figure 6. Effects of distortion compensation (panels C and D) and geometric calibration (panels B and D) on computation of points of gaze. Data were collected with a tripod-mounted laser pointer used to simulate line of primary gaze (LoPG). Circular marks indicate screen location of calibration targets. Crosses are laser screen intersections (points of gaze) computed by the LoPG-tracking algorithm. Within each cluster, each cross represents a measurement from a different tripod location. Rectangular outline shows location of projection screen. Screen outline and targets appear at hand-measured location (panels A and C) and location found by geometric calibration (panels B and D).


This may be because, for tripod locations closer to the screen, an angular error (due to distortion, for example) of the same size results in a larger error distance along the screen.
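The correlation analysis above could be reproduced with a call such as the one below; sensorDist (the distance from each sensor reading to the transmitter) and gazeErr (the corresponding gaze point errors) are assumed column vectors.

% Spearman rank correlation between sensor-transmitter distance and error.
[rho, pval] = corr(sensorDist, gazeErr, 'Type', 'Spearman');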

Short calibration. In order to use our LoPG-tracking system as a tool for gathering data from untrained study participants, it is advantageous to calibrate the system using as few data points as possible (e.g., to reduce fatigue). To find the number of calibration data points needed to accurately predict point of gaze, we created calibration sets from subsets of the 100 measurements collected in each experiment. We used these small calibration sets to compute T_{O←B} and T_{S←E} and then used those computed parameters to calculate screen intersections across the full set of 100 measurements. We then compared these computed intersections with the actual locations of the on-screen targets. We performed this technique for calibration sets of varying sizes, across 100 random orderings of the 100 data points collected in each experiment. For the first experiment, we found that after about 20 points the geometric calibration system converges to a point of gaze with a mean gaze point error of less than 2 in., with only small improvements attained by including additional points in the calibration set (Figure 8).

For the second experiment, overall gaze point error was lower when calibration was performed with the entire data set. For this data set, 2 in. of error could be reached with about 12 data points (Figure 9). Calibrations performed with more than 15 data points provided only a small additional benefit.
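A sketch of this short-calibration analysis follows. It reuses the hypothetical fitGeometry helper sketched earlier and assumes a second hypothetical helper, gazePointError(P, M, Int), returning the mean gaze point error of parameters P over all readings M and targets Int. The spread of err across orderings gives the confidence intervals plotted in Figures 8 and 9.

% Sweep calibration-set size over random orderings of the directed gazes.
nReps    = 100;                      % random orderings, as in Figures 8 and 9
setSizes = 2:50;
err = zeros(nReps, numel(setSizes));
for r = 1:nReps
    order = randperm(size(M, 1));                    % shuffle the 100 gazes
    for s = 1:numel(setSizes)
        idx = order(1:setSizes(s));                  % small calibration set
        P = fitGeometry(M(idx, :), Int(idx, :), P0); % fit on the subset
        err(r, s) = gazePointError(P, M, Int);       % test on all 100 gazes
    end
end
meanErr = mean(err, 1);              % curve plotted in Figures 8 and 9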

In the processing of calibration data from both experiments, geometric calibration occasionally failed completely, producing larger mean gaze point error than if geometric calibration had not been performed at all. These failures occurred with very small calibration sets, when distortion-related errors provided inconsistent constraints for the model. Failures also occurred when sets included targets from only one part of the screen, insufficiently constraining the model. Fitting failures never occurred with sets of 10 points or more, and never occurred when distortion compensation was performed. In all the cases in which calibration failed, the addition of one or two calibration points to the set was enough to reverse the failure. In general, better fits were obtained from calibration sets with greater variety in sampled tracker orientations and screen locations.

Figure 7. The mean gaze point error at each of the 10 tripod locations: on the left, using the raw data from the magnetic tracker collected in the first experiment (see Figure 6A), and on the right, using the same data corrected using both the distortion compensation and the geometric calibration (Figure 6D). The tripod and laser pointer were directed toward the projection screen, which is shown as the tall white rectangle on the right of each panel. The tracker transmitter is shown as the black square. Each circle represents one of the tripod locations, with radius equal to the mean gaze point error from that location. Mean gaze point error for each location is also shown with the corresponding circle.

DISCUSSION

The LoPG-tracking model and calibration techniques presented here succeed at accurately predicting the on-screen point of gaze while participants look through a head-mounted sight. The validation experiments show that, in agreement with Caldwell et al. (2000), the use of polynomial distortion compensation can improve the accuracy of gaze point calculations (Figures 6C and 6D). In addition, benefit was also seen from geometric calibration. The techniques presented for tracking LoPG provide a foundation for future gaze-tracking systems.

An interesting result of performing calibration both with and without distortion compensation was that, although point of gaze was predicted more accurately with distortion compensation, the tracking system still performed relatively well in the presence of uncompensated distortion, provided that geometric calibration was performed. In the presence of measurement errors caused by distortion, the minimization procedure gives model parameters that may not correspond well to the actual spatial relationships being modeled but interact with the measurement distortion to produce reasonably accurate screen intersection predictions. If somewhat larger LoPG-tracking errors can be tolerated, one might be able to achieve reasonable results by omitting distortion compensation altogether. Omission of distortion compensation would likely result in much greater error, however, for head positions or orientations outside the range of calibration.

Possible sources of gaze point errors remaining when both geometric calibration and distortion compensation were performed include measurement noise, inaccuracies in magnetic tracker measurements not corrected by the distortion compensation system, inaccuracy of the model parameters found by the geometric calibration system, or some combination of these. Tests of stability in the tracker measurements showed that readings from the tracker taken milliseconds or hours apart agreed; both differed from reference measurements by only a few hundredths of an inch and by a few tenths of a degree. This small measurement noise is clearly not enough to explain the gaze point errors, which are orders of magnitude greater.

Our method makes the simplifying assumption that the distortions in magnetic tracker orientation readings depend on the location of the magnetic sensor only, and not on the interaction between sensor location and orientation. Therefore, it is likely that our method did not completely correct for the magnetic distortion. When the magnetic sensor is kept in a similar orientation in both distortion compensation calibration and actual experimental use, this approach is more likely to produce good results. In the first experiment, our method provided an improvement over omitting orientation distortion compensation entirely, since sensor alignment was similar in calibration and experiment. It is likely, however, that distortion still plays a role in gaze point errors. In the second experiment, we found even lower (than that reported in the Results section) mean gaze point error when magnetic tracker measurements were corrected for location distortion but not for orientation distortion. As the magnetic tracker was rotated on its side when attached to the headgear of the head-mounted sight (Figure 2), it was no longer in the relative orientation needed to see a benefit from our compensation for orientation distortion. Expanding the distortion compensation calibration procedure to measure the sensor at multiple orientations for each grid location sampled could further reduce error due to distortion. This change would increase both the number of data points required and the time needed to perform the initial data collection procedure, but it would not impact per-participant calibration demands.

[Figure 8: line plot of mean gaze point error across all 100 data points (in inches) versus number of data points in the calibration set, showing the mean across all orderings with a 95% confidence interval.]

Figure 8. Error in computed point of gaze derived when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the first experiment (using a laser pointer). Mean gaze point error was reduced to within 2 in. (about 2°) after about 23 directed gazes.

The geometric calibration algorithm seeks a T_{O←B} and T_{S←E} yielding a local minimum in gaze point error, starting from the initial guess derived from hand measurements of these parameters. Control experiments were performed to verify that no other, potentially lower, minima could be found near the minima arrived at by calibration. Although the minimization procedure failed to converge for intentionally inconsistent initial guesses, tests conducted using parameters displaced from hand-measured values by several inches in all directions still converged on the same LoPG. Thus, uncorrected distortion in magnetic tracker measurements was most likely the leading cause of gaze point error.

Comparison between points of gaze computed with (Figure 6D) and without (Figure 6C) the geometric calibration step shows that attempts to hand measure the spatial relationships T_{O←B} and T_{S←E} (i.e., without geometric calibration) lead to relatively large gaze point errors. Application of the geometric fitting procedure substantially improves the ability of the model to compute LoPG. Although it is not surprising that better results can be obtained when calibration is performed, it is interesting to note that the parameters resulting from calibration deviated considerably from hand-measured parameters. In fact, the optimized parameters comprising T_{S←E} indicated spatial relationships that were clearly inconsistent with the actual configuration of the laser and magnetic sensor. For example, parameters producing the least mean gaze point error contained values for the distance between the magnetic sensor and the LoPG that were several inches greater than physical measurements indicated. Although there is some imprecision (we estimate about a quarter of an inch) in measuring straight-line distance between eye and magnetic sensor, this imprecision alone cannot explain the difference. Since calibration performed with data from the human participant seemed less prone to this inconsistency, we suspect that the constraints imposed on the movement of the laser and sensor by the tripod head may be responsible. In positioning the laser to align it with calibration targets, the sensor remained within 47° of level, and "roll" of the sensor could not be performed. These constraints led to less variation in the calibration data and, in turn, may have underconstrained the fitting procedure used in geometric calibration.

[Figure 9: line plot of mean gaze point error across all 100 data points (in inches) versus number of data points in the calibration set, showing the mean across all orderings with a 95% confidence interval.]

Figure 9. Error in computed point of gaze derived when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the second experiment (with a human participant). Mean gaze point error was reduced to less than 2 in. (about 2°) in about 12 directed gazes.

A limitation of the geometric calibration procedure described here is the inability of the calibration procedure to locate the eye along the LoPG. Although this limitation does not impact the ability of the system to predict points of gaze on the screen, it does impact the veracity of the model parameters. Although the geometric model used specifies the eye as a coordinate frame with an origin located some distance from the magnetic tracker sensor, the geometric calibration procedure constrains only the x-axis of this coordinate frame. This results in an infinite number of possible sets of model parameters that still predict the same point of gaze on the screen (i.e., the screen intersection remains the same even if the eye were to slide forward or back along the LoPG or to rotate about its x-axis). This means that the parameters making up T_{S←E} found by the fitting procedure correspond to one possible location of the eye along the LoPG. Control experiments showed that the fitted location of the eye along the LoPG depended on the initial guess used for the fitting procedure, even though these alternate initial guesses still yielded the same points of gaze.5 If the intermediate results of the model are to be used, for example, for computing the location of the eye relative to the screen for placement of a camera in a virtual environment, the inability of this system to precisely locate the eye might be overcome by performing an additional calibration. For example, a calibration similar to the one used in this article, but requiring the participant to place the eye in other, nonprimary positions of gaze, could be used to find the actual eye location. This technique is currently being investigated.

REFERENCES

Allison, R. S., Eizenman, M., & Cheung, B. S. K. (1996). Combined head and eye tracking system for dynamic testing of the vestibular system. IEEE Transactions on Biomedical Engineering, 43, 1073-1082.

Asai, K., Osawa, N., Takahashi, H., & Sugimoto, Y. Y. (2000, March). Eye mark pointer in immersive projection display. Paper presented at the IEEE Virtual Reality 2000 Conference, New Brunswick, NJ.

Atchison, D. A., & Smith, G. (2000). Optics of the human eye. Oxford: Butterworth-Heinemann.

Barabas, J., Giorgi, R. G., Goldstein, R. B., Apfelbaum, H., Woods, R. L., & Peli, E. (2003, May). Wide field 3D gaze tracking system. Paper presented at the Association for Research in Vision and Ophthalmology Meeting, Fort Lauderdale, FL.

Blood, E. (1990). U.S. Patent No. 4,945,305. Washington, DC: U.S. Patent & Trademark Office.

Bryson, S. (1992, February). Measurement and calibration of static distortion of position data from 3D trackers. Paper presented at SPIE Stereoscopic Displays & Applications III, San Jose.

Caldwell, R. T., Berbaum, K. S., & Borah, J. (2000). Correcting errors in eye-position data arising from the distortion of magnetic fields by display devices. Behavior Research Methods, Instruments, & Computers, 32, 572-578.

Day, J. S., Dumas, G. A., & Murdoch, D. J. (1998). Evaluation of a long-range transmitter for use with a magnetic tracking device in motion analysis. Journal of Biomechanics, 31, 957-961.

Duchowski, A., Medlin, E., Cournia, N., Murphy, H., Gramopadhye, A., Nair, S., Vorah, J., & Melloy, B. (2002). 3-D eye movement analysis. Behavior Research Methods, Instruments, & Computers, 34, 573-591.

Ikits, M., Brederson, J. D., Hansen, C. D., & Hollerbach, J. M. (2001, March). An improved calibration framework for electromagnetic tracking devices. Paper presented at the IEEE Virtual Reality 2001 Conference, Yokohama.

Janin, A., Mizell, D., & Caudell, T. (1993, September). Calibration of head-mounted displays for augmented reality applications. Paper presented at the Virtual Reality Annual International Symposium, Seattle.

Kindratenko, V. (1999). Calibration of electromagnetic tracking devices. Virtual Reality: Research, Development & Applications, 4, 139-150.

Kindratenko, V. (2000). A survey of electromagnetic position tracker calibration techniques. Virtual Reality: Research, Development & Applications, 5, 169-182.

King, P. (1995). Integration of helmet-mounted displays into tactical aircraft. Proceedings of the Society for Information Display, 26, 663-668.

Kuipers, J. B. (1999). Quaternions and rotation sequences: A primer with applications to orbits, aerospace, and virtual reality. Princeton, NJ: Princeton University Press.

Leigh, R. J., & Zee, D. S. (1983). The neurology of eye movements. Philadelphia: Davis.

Livingston, M. A., & State, A. (1997). Magnetic tracker calibration for improved augmented reality registration. Presence: Teleoperators & Virtual Environments, 6, 532-546.

Nixon, M. A., McCallum, B. C., Fright, W. R., & Price, N. B. (1998). The effects of metals and interfering fields on electromagnetic trackers. Presence, 7, 204-218.

Southard, D. A. (1995). Viewing model for virtual environment displays. Journal of Electronic Imaging, 4, 413-420.

Young, L. R., & Sheena, D. (1975). Survey of eye movement recording methods. Behavior Research Methods & Instrumentation, 7, 397-429.

NOTES

1. This position is used in projected virtual reality displays to determine how to position the virtual camera when generating an image of a scene.

2. Even if the tracker is not aligned exactly with the case of the transmitter, the correction polynomials (Equations 1 and 2) we used include terms to account for such an offset. These offsets are contained in the constant (i = 0) and linear (i = 1) terms of the polynomials.

3. Two linear systems were constructed from 192 six-variable measurements. Each of the two systems contained 192 × 3 = 576 knowns and 105 unknowns.

4. It should be noted that the calibration procedure described here does not completely constrain the model parameters used. Since no measurement of the orientation on the screen of the point of gaze, or of the distance from eye to screen during a sighting, is made, neither of these can be predicted by the model. As a result, the best-fitting model parameters describing T_{S←E} may converge on any one of a line of solutions corresponding to possible locations of the eye along the LoPG. Each of these eye locations predicts the same LoPG location on the screen. It should be possible to compute correct eye position with a calibration procedure that includes eye-in-head rotation.

5. For this reason, we have chosen to report errors in inches along the screen surface, since angular measures of error are more sensitive to location of the eye along the LoPG. Where angular error is reported, we estimated the eye–screen distance to be the distance between the magnetic sensor and the screen.


ARCHIVED MATERIALS

The following materials associated with this article are retrievable from the Psychonomic Society's Norms, Stimuli, and Data archive, http://www.psychonomic.org/archive.

To access the above files or links, search the archive for this article using the journal (Behavior Research Methods, Instruments, & Computers), the first author's name (Barabas), and the publication year (2004).

FILE: Barabas-BRMIC-2004.zip

DESCRIPTION: The compressed archive file contains seven files. The files are:

README.txt, containing a description of scripts and data included in the archive (a 3K text file). This file can be renamed to readme.m and executed as a Matlab script to reproduce some of the results reported in this manuscript.

barabas2004LoPG.mat, a Matlab workspace file containing magnetic tracker data collected in the verification experiments described in this manuscript (a 32K binary file).

findcorrectioncoeffs.m, a Matlab function for finding the distortion compensation coefficients for Equations 1 and 2 (a 5K Matlab M-file).

correctdistortion.m, a Matlab function for performing distortion compensation on magnetic tracker data using Equations 1 and 2 (a 4K Matlab M-file).

findmodelparams.m, a Matlab function for finding the parameters of the LoPG model from data collected during a series of directed gazes (a 3K Matlab M-file).

intersecterror.m, a Matlab function for finding gaze point error for a set of directed gazes (a 5K Matlab M-file).

trackereulerstomtx4.m, a Matlab function that computes a transformation matrix representation of a relationship between coordinate frames via Equation 4 (a 3K Matlab M-file).

AUTHOR'S WEB SITE: http://www.eri.harvard.edu/faculty/peli/index.html

AUTHOR'S E-MAIL ADDRESS: barabas@mit.edu

(Manuscript received July 18, 2003; revision accepted for publication July 23, 2004.)

764 BARABAS ET AL

compensation system as described above and then per-formed two target-sighting experiments In the first ex-periment we performed the geometric calibration pro-cedure using the beam of a laser pointer to represent theline of gaze of the human participant Doing so allowedthe simulated LoPG to be precisely aligned with calibra-tion targets projected on the screen and allowed testingwithout the variability of human participantsrsquo ability tomaintain head position To fix the relative placement ofthe laser and sensor both were attached to a small woodenblock The sensor was placed on the block with its x-axisapproximately parallel to and 2 in above the laser beamThis alignment enabled us to easily measure the actualspatial relationship between the sensor and the laser forvalidation of the parameters found by the geometric cali-bration procedure (We found it much more difficult tomeasure the equivalent relationship tSumlE on a humanparticipant) The wood block holding the sensor and laserwas fastened to an aluminum and plastic tripod raised tothe approximate height of a human participantrsquos head Togather data for testing the geometric calibration proce-dure the tripod was placed at 10 different locations Theselocations were chosen within the range of possible stand-ing positions of our study participants (an area about 36 36 in) For each of the 10 tripod locations readings fromthe magnetic tracker were taken panning and tilting thetripod head so that the laser beam fell on each calibrationtarget Four of the targets were just inside the corners ofthe screen one was placed in the center and five other lo-cations were arbitrarily chosen The targets remained inthe same locations throughout each experiment Due tothe distance between the sensor and the pivot point of thetripod head for a given tripod location the actual magneticsensor locations varied by up to 8 in as the tripod headwas tilted at different angles to align the laser

In a second experiment a similar set of 100 directedgazes was performed by a human participant The par-ticipant used the head-mounted sight (Figure 2) to alignhis LoPG with each of the 10 targets from each of 10standing locations The 5 calibration targets that were ar-bitrarily placed in the first experiment were in differentlocations for calibration of the human participant

Data from both experiments were independently usedto calibrate the LoPG-tracking system and the benefitof distortion compensation and geometric calibrationprocedures were examined For the distortion compen-sation step we used polynomial coefficients computedfrom the 192 pegboard locations as described above Inall cases we used as an initial guess for the geometriccalibration procedure a set of hand-measured parametersthat approximated the actual geometry of the system

RESULTS

For each of the two experiments collected data con-sisted of a set of 100 six-variable tracker readings (eachreading taken when the laserLoPG was aligned to fall atthe center of an on-screen calibration target) as well as

the 100 two-dimensional screen locations of the corre-sponding calibration targets (10 repetitions of 10 tar-gets) Location and orientation data from the magnetictracker were passed through the distortion compensationfunctions (Equations 1 and 2) yielding a second set of100 corrected tracker readings for each experiment Weused subsets of this body of distortion-corrected data toderive the transformation matrices tOumlB and tSumlE usingthe geometric calibration procedure described aboveand tested these fits against the remaining data We com-pared known screenndashtarget locations to point-of-gazecomputations made with and without distortion com-pensation or geometric calibration We also investigatedthe behavior of the error as a function of the number ofincluded calibration points by varying the size of the cal-ibration data sets used to compute tOumlB and tSumlE

Accuracy of Predicted Point of GazeUsing the distortion-corrected location and orientation

measurements from the entire set of 100 laserndashtargetalignments from the first experiment and the geometriccalibration procedure the tOumlB and tSumlE matrices werecomputed Equations 5 and 6 were then used to predictpoint of gaze for each of the 100 laserndashtarget alignmentsand mean gaze point error (mean distance along the screenbetween calibration target and point of gaze found withthe LoPG-tracking algorithm) was computed For thedata set from the first experiment mean error was 150 in(Figure 6D) Error can also be described as the angle be-tween the screen target and the computed point of gazewhen viewed from the location of the magnetic sensorMeasured in this way mean angular error was 155ordm Forreasons described in the Discussion section we will re-port the remainder of our mean error results in inches Atour screenndasheye distances 1ordm corresponds approximatelyto 1 in of error

To examine the effect of distortion compensation onmean gaze point error we also performed the geometriccalibration procedure using raw magnetic tracker mea-surements not processed by the distortion correctionpolynomials As seen in Figure 6B without distortioncompensation mean gaze point error was significantlylarger than with compensation (Wilcoxon signed rankstest Z 366 p 001) at 193 in when calibratedusing data from the same experiment (an increase of 29)Without distortion compensation clear outliers from theclusters of points of gaze can be seen for most of thescreen targets in Figure 6B These outlying points ofgaze were all from the tripod location most distant fromthe magnetic transmitter where distortion was largestDistortion compensation provided most benefit for thistripod location

To examine the effect of the geometric calibration pro-cedure on accuracy of predicted LoPG we also com-puted mean gaze point error across the collected data setusing only a hand-measured estimate of tOumlB and tSumlEDistortion correction was included but no geometriccalibration was performed As one might expect without

TRACKING THE LINE OF PRIMARY GAZE 765

calibration errors were larger and this difference wassignificant (Wilcoxon signed ranks test Z 809 p 001) Mean gaze point error using an uncalibrated modelwas 396 in as can be seen in Figure 6C

We also examined the ability of the geometric model tocompute points of gaze without both distortion compensa-tion and geometric calibration The hand-measured tOumlBand tSumlE and raw magnetic tracker readings were used inEquations 5 and 6 Mean gaze point error was 566 in (Fig-ure 6A) The addition of either geometric calibration ordistortion compensation yielded a significant reduction inerror (Wilcoxon signed ranks test Z 61 p 001)

Conducting the same analysis for the data from thesecond experiment we found mean gaze point error tobe lower than that in the first experiment when geomet-ric calibration was performed When both distortioncompensation and geometric calibration were used meanerror was 072 in When geometric calibration was per-formed on data not corrected for distortion mean error was106 in (Wilcoxon signed rank test Z 520 p 001)Reasons for the reduced errors in the second experiment

are proposed in the Discussion section Without geometriccalibration error was much larger (Wilcoxon signed ranktest Z 81 p 001) since we were unable to measuretSumlE with the same accuracy for the second experimentWith and without distortion correction mean errors were398 and 426 in respectively

Effect of sensor location on error Using the rawdata (not compensated for distortion) from the first ex-periment and computing point of gaze without perform-ing geometric calibration the farther the sensor wasfrom the transmitter the worse the error (Spearman cor-relation r 46 p 001) and that error was manifestas an increasing error in the y-dimension (Spearman cor-relation r 50 p 001 Figure 7A) When both thedistortion correction and the geometric calibration wereimplemented the errors were more evenly distributed(Figure 7B) resulting in no significant correlation withdistance from the transmitter (Spearman correlation r 04 p 71) There was however a significant tendencyfor the errors to increase toward the screen (ie in thex-dimension Spearman correlation r 22 p 003)

Figure 6 Effects of distortion compensation (panels C and D) and geometric calibration (panels B and D) oncomputation of points of gaze Data were collected with a tripod-mounted laser pointer used to simulate line ofprimary gaze (LoPG) Circular marks indicate screen location of calibration targets Crosses are laser screen in-tersections (points of gaze) computed by LoPG-tracking algorithm Within each cluster each cross represents ameasurement from a different tripod location Rectangular outline shows location of projection screen Screenoutline and targets appear at hand-measured location (panels A and C) and location found by geometric cali-bration (panels B and D)

766 BARABAS ET AL

This may be because for tripod locations closer to thescreen an angular error (due to distortion for example)of the same size results in a larger error distance alongthe screen

Short calibration In order to use our LoPG-trackingsystem as a tool for gathering data from untrained studyparticipants it is advantageous to calibrate the systemusing as few data points as necessary (eg to reduce fa-tigue) To find the number of calibration data pointsneeded to accurately predict point of gaze we createdcalibration sets from subsets of the 100 measurementscollected in each experiment We used these small cali-bration sets to compute tOumlB and tSumlE and then usedthose computed parameters to calculate screen intersec-tions across the full set of 100 measurements We thencompared these computed intersections with the loca-tions of the actual locations of the on-screen targets Weperformed this technique for calibration sets of varyingsizes across 100 random orderings of the 100 data pointscollected in each experiment For the first experimentwe found that after about 20 points the geometric cali-

bration system converges to a point of gaze with a meangaze point error of less than 2 in with only small im-provements attained by including additional points in thecalibration set (Figure 8)

For the second experiment overall gaze point errorwas lower when calibration was performed with the en-tire data set For this data set 2 in of error could bereached with about 12 data points (Figure 9) Calibra-tions performed with more than 15 data points providedonly a small additional benefit

In the processing of calibration data from both exper-iments geometric calibration occasionally failed com-pletely producing larger mean gaze point error than ifgeometric calibration had not been performed at allThese failures occurred with very small calibration setswhen distortion-related errors provided inconsistent con-straints for the model Failures also occurred when setsincluded targets from only one part of the screen insuf-ficiently constraining the model Fitting failures neveroccurred with sets of 10 points or more and never occurredwhen distortion compensation was performed In all the

Figure 7 The mean gaze point error at each of the 10 tripod locations on the left using the raw data from themagnetic tracker collected in the first experiment (see Figure 6A) and on the right using the same data correctedusing both the distortion compensation and the geometric calibration (Figure 6D) The tripod and laser pointerwere directed toward the projection screen which is shown as the tall white rectangle on the right of each panelThe tracker transmitter is shown as the black square Each circle represents one of the tripod locations with ra-dius equal to the mean gaze point error from that location Mean gaze point error for each location is also shownwith the corresponding circle

TRACKING THE LINE OF PRIMARY GAZE 767

cases in which calibration failed the addition of one ortwo calibration points to the set was enough to reversethe failure In general better fits were obtained from cal-ibration sets with greater variety in sampled tracker ori-entations and screen locations

DISCUSSION

The LoPG-tracking model and calibration techniquespresented here succeed at accurately predicting the on-screen point of gaze while participants look through ahead-mounted sight The validation experiments showthat in agreement with Caldwell et al (2000) the use ofpolynomial distortion compensation can improve the ac-curacy of gaze point calculations (Figures 6C and 6D)In addition benefit was also seen from geometric cali-bration The techniques presented for tracking LoPGprovide a foundation for future gaze-tracking systems

An interesting result of performing calibration bothwith and without distortion compensation was that al-though point of gaze was predicted more accurately withdistortion compensation the tracking system still per-formed relatively well in the presence of uncompensateddistortion provided that geometric calibration was per-formed In the presence of measurement errors causedby distortion the minimization procedure gives modelparameters that may not correspond well to the actualspatial relationships being modeled but interact with the

measurement distortion to produce reasonably accuratescreen intersection predictions If somewhat larger LoPG-tracking errors can be tolerated one might be able toachieve reasonable results by omitting distortion compen-sation altogether Omission of distortion compensationwould likely result in much greater error however for headpositions or orientations outside the range of calibration

Possible sources of gaze point errors remaining whenboth geometric calibration and distortion compensationwere performed include measurement noise inaccura-cies in magnetic tracker measurements not corrected bythe distortion compensation system inaccuracy of themodel parameters found by the geometric calibrationsystem or some combination of these Tests of stabilityin the tracker measurements showed that readings fromthe tracker taken milliseconds or hours apart agreedboth differ from reference measurements by only a fewhundredths of an inch and by a few tenths of a degreeThis small measurement noise is clearly not enough toexplain the gaze point errors which are orders of mag-nitude greater

Our method makes the simplifying assumption thatthe distortions in magnetic tracker orientation readingsdepend on location of the magnetic sensor only and noton the interaction between sensor location and orienta-tion Therefore it is likely that our method did not com-pletely correct for the magnetic distortion When themagnetic sensor is kept in a similar orientation in both

mean across all orderings95 confidence interval

14

12

10

8

6

4

2

02 10 20 30 40 50

Number of data points in calibration set

Mea

n ga

ze p

oint

err

or a

cros

s al

l 100

dat

a po

ints

(in

ches

)

Figure 8 Error in computed point of gaze derived when model was calibrated with in-creasing numbers of calibration points across 100 random orderings of the data set collectedin the first experiment (using a laser pointer) Mean gaze point error was reduced to within2 in (about 2ordm) after about 23 directed gazes

768 BARABAS ET AL

distortion compensation calibration and actual experi-mental use this approach is more likely to produce goodresults In the first experiment our method provided animprovement over omitting orientation distortion com-pensation entirely since sensor alignment was similar incalibration and experiment It is likely however that dis-tortion still plays a role in gaze point errors In the sec-ond experiment we found even lower (than that reportedin the Results section) mean gaze point error when mag-netic tracker measurements were corrected for locationdistortion but not for orientation distortion As the mag-netic tracker was rotated on its side when attached to theheadgear of the head-mounted sight (Figure 2) it was nolonger in the relative orientation needed to see a benefitfrom our compensation for orientation distortion Ex-panding the distortion compensation calibration proce-dure to measure the sensor at multiple orientations foreach grid location sampled could further reduce errordue to distortion This change would increase both thenumber of data points required and the time needed toperform the initial data collection procedure but it wouldnot impact per-participant calibration demands

The geometric calibration algorithm seeks a tOumlB andtSumlE yielding a local minimum in gaze point error start-ing from the initial guess derived from hand measure-ments of these parameters Control experiments wereperformed to verify that no other potentially lower min-ima could be found near the minima arrived at by cali-

bration Although the minimization procedure failed toconverge for intentionally inconsistent initial guessestests conducted using parameters displaced from hand-measured values by several inches in all directions allstill converged on the same LoPG Thus uncorrecteddistortion in magnetic tracker measurements was mostlikely the leading cause of gaze point error

Comparison between points of gaze computed with(Figure 6D) and without (Figure 6C) the geometric cali-bration step shows that attempts to hand measure the spa-tial relationships tOumlB and tSumlE (ie without geometriccalibration) lead to relatively large gaze point errors Ap-plication of the geometric fitting procedure substantiallyimproves the ability of the model to compute LoPG Al-though it is not surprising that better results can be ob-tained when calibration is performed it is interesting tonote that the parameters resulting from calibration deviatedconsiderably from hand-measured parameters In factthe optimized parameters comprising tSumlE indicatedspatial relationships that were clearly inconsistent withthe actual configuration of the laser and magnetic sensorFor example parameters producing the least mean gazepoint error contained values for the distance between themagnetic sensor and the LoPG that were several inchesgreater than physical measurements indicated Althoughthere is some imprecision (we estimate about a quarter ofan inch) in measuring straight-line distance between eyeand magnetic sensor this imprecision alone cannot ex-

mean across all orderings95 confidence interval

14

12

10

8

6

4

2

02 10 20 30 40 50

Number of data points in calibration set

Mea

n ga

ze p

oint

err

or a

cros

s al

l 100

dat

a po

ints

(in

ches

)

Figure 9 Error in computed point of gaze derived when model was calibrated with in-creasing numbers of calibration points across 100 random orderings of the data set collectedin the second experiment (with a human participant) Mean gaze point error was reduced toless than 2 in (about 2ordm) in about 12 directed gazes

TRACKING THE LINE OF PRIMARY GAZE 769

plain the difference Since calibration performed withdata from the human participant seemed less prone tothis inconsistency we suspect that the constraints im-posed on the movement of the laser and sensor by the tri-pod head may be responsible In positioning the laser toalign it with calibration targets the sensor remainedwithin 47ordm of level and ldquorollrdquo of the sensor could not beperformed These constraints lead to less variation in thecalibration data and in turn may have underconstrainedthe fitting procedure used in geometric calibration

A limitation of the geometric calibration procedure de-scribed here is the inability of the calibration procedureto locate the eye along the LoPG Although this limita-tion does not impact the ability of the system to predictpoints of gaze on the screen it does impact the veracityof the model parameters Although the geometric modelused specifies the eye as a coordinate frame with an ori-gin located some distance from the magnetic tracker sen-sor the geometric calibration procedure constrains onlythe x-axis of this coordinate frame This results in an in-finite number of possible sets of model parameters thatstill predict the same point of gaze on the screen (iescreen intersection remains the same even if the eye wereto slide forward or back along the LoPG or to rotate aboutits x-axis) This means that the parameters making uptSumlE found by the fitting procedure correspond to onepossible location of the eye along the LoPG Control ex-periments showed that the fitted location of the eye alongthe LoPG depended on the initial guess used for the fit-ting procedure even though these alternate initialguesses still yielded the same points of gaze5 If the in-termediate results of the model are to be used for exam-ple for computing the location of the eye relative to thescreen for placement of a camera in a virtual environ-ment the inability of this system to precisely locate theeye might be overcome by performing an additional cal-ibration For example a calibration similar to the oneused in this article but requiring the participant to placethe eye in other nonprimary positions of gaze could beused to find actual eye location This technique is cur-rently being investigated

REFERENCES

Allison R S Eizenman M amp Cheung B S K (1996) Combinedhead and eye tracking system for dynamic testing of the vestibularsystem IEEE Transactions on Biomedical Engineering 43 1073-1082

Asai K Osawa N Takahashi H amp Sugimoto Y Y (2000 March)Eye mark pointer in immersive projection display Paper presented atthe IEEE Virtual Reality 2000 Conference New Brunswick NJ

Atchison D A amp Smith G (2000) Optics of the human eye OxfordButterworth-Heinemann

Barabas J Giorgi R G Goldstein R B Apfelbaum H WoodsR L amp Peli E (2003 May) Wide field 3D gaze tracking systemPaper presented at the Association for Research in Vision and Oph-thalmology Meeting Fort Lauderdale FL

Blood E (1990) US Patent No 4945305 Washington DC USPatent amp Trademark Office

Bryson S (1992 February) Measurement and calibration of staticdistortion of position data from 3D trackers Paper presented at SPIEStereoscopic Displays amp Applications III San Jose

Caldwell R T Berbaum K S amp Borah J (2000) Correcting er-rors in eye-position data arising from the distortion of magnetic fieldsby display devices Behavior Research Methods Instruments ampComputers 32 572-578

Day J S Dumas G A amp Murdoch D J (1998) Evaluation of along-range transmitter for use with a magnetic tracking device in mo-tion analysis Journal of Biomechanics 31 957-961

Duchowski A Medlin E Cournia N Murphy H Gramopad-hye A Nair S Vorah J amp Melloy B (2002) 3-D eye move-ment analysis Behavior Research Methods Instruments amp Comput-ers 34 573-591

Ikits M Brederson J D Hansen C D amp Hollerbach J M(2001 March) An improved calibration framework for electromag-netic tracking devices Paper presented at the Virtual Reality 2001Proceedings IEEE Yokohama

Janin A Mizell D amp Caudell T (1993 September) Calibrationof head-mounted displays for augmented reality applications Paperpresented at the Virtual Reality Annual International SymposiumSeattle

Kindratenko V (1999) Calibration of electromagnetic tracking de-vices Virtual Reality Research Development amp Applications 4139-150

Kindratenko V (2000) A survey of electromagnetic position trackercalibration techniques Virtual Reality Research Development ampApplications 5 169-182

King P (1995) Integration of helmet-mounted displays into tacticalaircraft Proceedings of the Society for Information Display 26 663-668

Kuipers J B (1999) Quaternions and rotation sequences A primerwith applications to orbits aerospace and virtual reality PrincetonNJ Princeton University Press

Leigh R J amp Zee D S (1983) The neurology of eye movementsPhiladelphia Davis

Livingston M A amp State A (1997) Magnetic tracker calibrationfor improved augmented reality registration Presence Teleoperatorsamp Virtual Environments 6 532-546

Nixon M A McCallum B C Fright W R amp Price N B (1998)The effects of metals and interfering fields on electromagnetic track-ers Presence 7 204-218

Southard D A (1995) Viewing model for virtual environment dis-plays Journal of Electronic Imaging 4 413-420

Young L R amp Sheena D (1975) Survey of eye movement recordingmethods Behavior Research Methods amp Instrumentation 7 397-429

NOTES

1 This position is used in projected virtual reality displays to deter-mine how to position the virtual camera when generating an image of ascene

2 Even if the tracker is not aligned exactly with the case of the trans-mitter the correction polynomials (Equations 1 and 2) we used includeterms to account for such an offset These offsets are contained in theconstant (i 0) and linear (i 1) terms of polynomials

3 Two linear systems were constructed from 192 six-variable mea-surements Each of the two systems contained 192 3 576 knownsand 105 unknowns

4 It should be noted that the calibration procedure described heredoes not completely constrain the model parameters used Since no mea-surement of orientation on the screen of the point of gaze or distancefrom eye to screen during a sighting is made neither of these can be pre-dicted by the model As a result the best-fitting model parameters de-scribing tSumlE may converge on any one of a line of solutions corre-sponding to possible locations of the eye along the LoPG Each of theseeye locations predicts the same LoPG location on the screen It shouldbe possible to compute correct eye position with a calibration proce-dure that includes eye-in-head rotation

5 For this reason we have chosen to report errors in inches along thescreen surface since angular measures of error are more sensitive to lo-cation of the eye along the LoPG Where angular error is reported weestimated the eyendashscreen distance to be the distance between the mag-netic sensor and the screen

770 BARABAS ET AL

ARCHIVED MATERIALS

The following materials associated with this article are retrievablefrom the Psychonomic Societyrsquos Norms Stimuli and Data archivehttpwwwpsychonomicorgarchive

To access the above files or links search the archive for this articleusing the journal (Behavior Research Methods Instruments amp Com-puters) the first authorrsquos name (Barabas) and the publication year(2004)

FILE Barabas-BRMIC-2004zipDESCRIPTION The compressed archive file contains seven files The

files areREADMEtxt containing a description of scripts and data included

in the archive (a 3K text file) This file can renamed to readmem andexecuted as a Matlab script to reproduce some of the results reported inthis manuscript

barabas2004LoPGmat a Matlab workspace file containing mag-netic tracker data collected in the verification experiments described inthis manuscript (a 32K binary file)

findcorrectioncoeffsm a Matlab function for finding the distortioncompensation coefficients for Equations 1 and 2 (a 5K Matlab M-file)

correctdistortionm a Matlab function for performing distortioncompensation on magnetic tracker data using Equations 1 and 2 (a 4KMatlab M-file)

findmodelparamsm a Matlab function for finding the parameters ofthe LoPG model from data collected during a series of directed gazes (a3K Matlab M-file)

intersecterrorm a Matlab function for finding gaze point error for aset of directed gazes (a 5K Matlab M-file)

trackereulerstomtx4m a Matlab function that computes a transfor-mation matrix representation of a relationship between coordinateframes via Equation 4 (a 3K Matlab M-file)

AUTHORrsquoS WEB SITE httpwwweriharvardedufacultypeliindexhtml

AUTHORrsquoS E-MAIL ADDRESS barabasmitedu

(Manuscript received July 18 2003revision accepted for publication July 23 2004)

TRACKING THE LINE OF PRIMARY GAZE 765

calibration errors were larger and this difference wassignificant (Wilcoxon signed ranks test Z 809 p 001) Mean gaze point error using an uncalibrated modelwas 396 in as can be seen in Figure 6C

We also examined the ability of the geometric model tocompute points of gaze without both distortion compensa-tion and geometric calibration The hand-measured tOumlBand tSumlE and raw magnetic tracker readings were used inEquations 5 and 6 Mean gaze point error was 566 in (Fig-ure 6A) The addition of either geometric calibration ordistortion compensation yielded a significant reduction inerror (Wilcoxon signed ranks test Z 61 p 001)

Conducting the same analysis for the data from thesecond experiment we found mean gaze point error tobe lower than that in the first experiment when geomet-ric calibration was performed When both distortioncompensation and geometric calibration were used meanerror was 072 in When geometric calibration was per-formed on data not corrected for distortion mean error was106 in (Wilcoxon signed rank test Z 520 p 001)Reasons for the reduced errors in the second experiment

are proposed in the Discussion section Without geometriccalibration error was much larger (Wilcoxon signed ranktest Z 81 p 001) since we were unable to measuretSumlE with the same accuracy for the second experimentWith and without distortion correction mean errors were398 and 426 in respectively

Effect of sensor location on error Using the rawdata (not compensated for distortion) from the first ex-periment and computing point of gaze without perform-ing geometric calibration the farther the sensor wasfrom the transmitter the worse the error (Spearman cor-relation r 46 p 001) and that error was manifestas an increasing error in the y-dimension (Spearman cor-relation r 50 p 001 Figure 7A) When both thedistortion correction and the geometric calibration wereimplemented the errors were more evenly distributed(Figure 7B) resulting in no significant correlation withdistance from the transmitter (Spearman correlation r 04 p 71) There was however a significant tendencyfor the errors to increase toward the screen (ie in thex-dimension Spearman correlation r 22 p 003)

Figure 6 Effects of distortion compensation (panels C and D) and geometric calibration (panels B and D) oncomputation of points of gaze Data were collected with a tripod-mounted laser pointer used to simulate line ofprimary gaze (LoPG) Circular marks indicate screen location of calibration targets Crosses are laser screen in-tersections (points of gaze) computed by LoPG-tracking algorithm Within each cluster each cross represents ameasurement from a different tripod location Rectangular outline shows location of projection screen Screenoutline and targets appear at hand-measured location (panels A and C) and location found by geometric cali-bration (panels B and D)

766 BARABAS ET AL

This may be because for tripod locations closer to thescreen an angular error (due to distortion for example)of the same size results in a larger error distance alongthe screen

Short calibration. In order to use our LoPG-tracking system as a tool for gathering data from untrained study participants, it is advantageous to calibrate the system using as few data points as necessary (e.g., to reduce fatigue). To find the number of calibration data points needed to accurately predict point of gaze, we created calibration sets from subsets of the 100 measurements collected in each experiment. We used these small calibration sets to compute tOumlB and tSumlE and then used those computed parameters to calculate screen intersections across the full set of 100 measurements. We then compared these computed intersections with the actual locations of the on-screen targets. We performed this technique for calibration sets of varying sizes across 100 random orderings of the 100 data points collected in each experiment. For the first experiment, we found that after about 20 points the geometric calibration system converges to a point of gaze with a mean gaze point error of less than 2 in., with only small improvements attained by including additional points in the calibration set (Figure 8).
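The subset analysis just described could be organized as in the following MATLAB sketch. The helper names fit_model and gaze_error are hypothetical stand-ins for the archived findmodelparams.m and intersecterror.m (whose actual signatures are not reproduced here), and gazes is assumed to hold one directed gaze per row.

% Convergence of calibration with increasing subset size (cf. Figures 8 and 9).
nGazes = 100; nOrderings = 100; setSizes = 3:nGazes;
meanErr = zeros(numel(setSizes), nOrderings);
for k = 1:nOrderings
    order = randperm(nGazes);                       % one random ordering
    for s = 1:numel(setSizes)
        subset = gazes(order(1:setSizes(s)), :);    % calibration subset
        params = fit_model(subset);                 % geometric calibration (stand-in)
        meanErr(s, k) = mean(gaze_error(params, gazes)); % error over all 100 gazes
    end
end
plot(setSizes, mean(meanErr, 2));                   % mean across orderings
xlabel('Number of data points in calibration set');
ylabel('Mean gaze point error (in.)');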

For the second experiment, overall gaze point error was lower when calibration was performed with the entire data set. For this data set, 2 in. of error could be reached with about 12 data points (Figure 9). Calibrations performed with more than 15 data points provided only a small additional benefit.

Figure 7. The mean gaze point error at each of the 10 tripod locations: on the left, using the raw data from the magnetic tracker collected in the first experiment (see Figure 6A), and on the right, using the same data corrected using both the distortion compensation and the geometric calibration (Figure 6D). The tripod and laser pointer were directed toward the projection screen, which is shown as the tall white rectangle on the right of each panel. The tracker transmitter is shown as the black square. Each circle represents one of the tripod locations, with radius equal to the mean gaze point error from that location. Mean gaze point error for each location is also shown with the corresponding circle.

In the processing of calibration data from both experiments, geometric calibration occasionally failed completely, producing larger mean gaze point error than if geometric calibration had not been performed at all. These failures occurred with very small calibration sets, when distortion-related errors provided inconsistent constraints for the model. Failures also occurred when sets included targets from only one part of the screen, insufficiently constraining the model. Fitting failures never occurred with sets of 10 points or more, and never occurred when distortion compensation was performed. In all the cases in which calibration failed, the addition of one or two calibration points to the set was enough to reverse the failure. In general, better fits were obtained from calibration sets with greater variety in sampled tracker orientations and screen locations.
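One hedged way to guard against such failures in practice is to compare the calibrated error against the uncalibrated baseline and grow the calibration set until the fit improves, as in this sketch. Here fit_model, gaze_error, gazes, and handParams are the same hypothetical stand-ins used above, with handParams holding hand-measured model parameters.

% Grow the calibration set until geometric calibration beats hand measurement.
baseline = mean(gaze_error(handParams, gazes)); % error with hand-measured parameters
order = randperm(size(gazes, 1));
setSize = 5;
while setSize <= size(gazes, 1)
    params = fit_model(gazes(order(1:setSize), :));
    if mean(gaze_error(params, gazes)) < baseline
        break;                                  % calibration improved on the baseline
    end
    setSize = setSize + 1;                      % add a point and refit
end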

DISCUSSION

The LoPG-tracking model and calibration techniques presented here succeed at accurately predicting the on-screen point of gaze while participants look through a head-mounted sight. The validation experiments show that, in agreement with Caldwell et al. (2000), the use of polynomial distortion compensation can improve the accuracy of gaze point calculations (Figures 6C and 6D). In addition, benefit was also seen from geometric calibration. The techniques presented for tracking LoPG provide a foundation for future gaze-tracking systems.

An interesting result of performing calibration both with and without distortion compensation was that, although point of gaze was predicted more accurately with distortion compensation, the tracking system still performed relatively well in the presence of uncompensated distortion, provided that geometric calibration was performed. In the presence of measurement errors caused by distortion, the minimization procedure gives model parameters that may not correspond well to the actual spatial relationships being modeled but that interact with the measurement distortion to produce reasonably accurate screen intersection predictions. If somewhat larger LoPG-tracking errors can be tolerated, one might be able to achieve reasonable results by omitting distortion compensation altogether. Omission of distortion compensation would likely result in much greater error, however, for head positions or orientations outside the range of calibration.

Possible sources of the gaze point errors remaining when both geometric calibration and distortion compensation were performed include measurement noise, inaccuracies in magnetic tracker measurements not corrected by the distortion compensation system, inaccuracy of the model parameters found by the geometric calibration system, or some combination of these. Tests of stability in the tracker measurements showed that readings taken milliseconds or hours apart agreed, both differing from reference measurements by only a few hundredths of an inch and a few tenths of a degree. This small measurement noise is clearly not enough to explain the gaze point errors, which are orders of magnitude greater.

Our method makes the simplifying assumption that the distortions in magnetic tracker orientation readings depend on the location of the magnetic sensor only, and not on the interaction between sensor location and orientation. Therefore, it is likely that our method did not completely correct for the magnetic distortion. When the magnetic sensor is kept in a similar orientation in both distortion compensation calibration and actual experimental use, this approach is more likely to produce good results. In the first experiment, our method provided an improvement over omitting orientation distortion compensation entirely, since sensor alignment was similar in calibration and experiment. It is likely, however, that distortion still plays a role in gaze point errors. In the second experiment, we found even lower (than that reported in the Results section) mean gaze point error when magnetic tracker measurements were corrected for location distortion but not for orientation distortion. As the magnetic tracker was rotated on its side when attached to the headgear of the head-mounted sight (Figure 2), it was no longer in the relative orientation needed to see a benefit from our compensation for orientation distortion. Expanding the distortion compensation calibration procedure to measure the sensor at multiple orientations for each grid location sampled could further reduce error due to distortion. This change would increase both the number of data points required and the time needed to perform the initial data collection procedure, but it would not impact per-participant calibration demands.

[Figure 8 plot: mean gaze point error across all 100 data points (inches) vs. number of data points in calibration set; mean across all orderings, with 95% confidence interval.]

Figure 8. Error in computed point of gaze derived when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the first experiment (using a laser pointer). Mean gaze point error was reduced to within 2 in. (about 2°) after about 23 directed gazes.

The geometric calibration algorithm seeks a tOumlB and tSumlE yielding a local minimum in gaze point error, starting from the initial guess derived from hand measurements of these parameters. Control experiments were performed to verify that no other, potentially lower, minima could be found near the minima arrived at by calibration. Although the minimization procedure failed to converge for intentionally inconsistent initial guesses, tests conducted using parameters displaced from hand-measured values by several inches in all directions all still converged on the same LoPG. Thus, uncorrected distortion in magnetic tracker measurements was most likely the leading cause of gaze point error.
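A control of this kind can be scripted by restarting the minimization from displaced initial guesses. The sketch below uses MATLAB's fminsearch with a hypothetical objective meanGazeErr(p) that returns mean gaze point error for a parameter vector p, and handParams for the hand-measured starting values; the paper's actual objective function and parameterization are not reproduced here.

% Restart the fit from initial guesses displaced by a few inches.
p0 = handParams(:)';                     % hand-measured starting parameters
for shift = [-3 0 3]                     % displacement (inches) along one axis
    pStart = p0;
    pStart(1) = pStart(1) + shift;       % perturb one translation component
    pFit = fminsearch(@meanGazeErr, pStart);
    fprintf('shift %+d in.: fitted error %.2f in.\n', shift, meanGazeErr(pFit));
end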

Comparison between points of gaze computed with (Figure 6D) and without (Figure 6C) the geometric calibration step shows that attempts to hand measure the spatial relationships tOumlB and tSumlE (i.e., without geometric calibration) lead to relatively large gaze point errors. Application of the geometric fitting procedure substantially improves the ability of the model to compute LoPG. Although it is not surprising that better results can be obtained when calibration is performed, it is interesting to note that the parameters resulting from calibration deviated considerably from hand-measured parameters. In fact, the optimized parameters comprising tSumlE indicated spatial relationships that were clearly inconsistent with the actual configuration of the laser and magnetic sensor. For example, parameters producing the least mean gaze point error contained values for the distance between the magnetic sensor and the LoPG that were several inches greater than physical measurements indicated. Although there is some imprecision (we estimate about a quarter of an inch) in measuring the straight-line distance between eye and magnetic sensor, this imprecision alone cannot explain the difference. Since calibration performed with data from the human participant seemed less prone to this inconsistency, we suspect that the constraints imposed on the movement of the laser and sensor by the tripod head may be responsible. In positioning the laser to align it with calibration targets, the sensor remained within 47° of level, and "roll" of the sensor could not be performed. These constraints led to less variation in the calibration data and, in turn, may have underconstrained the fitting procedure used in geometric calibration.

[Figure 9 plot: mean gaze point error across all 100 data points (inches) vs. number of data points in calibration set; mean across all orderings, with 95% confidence interval.]

Figure 9. Error in computed point of gaze derived when the model was calibrated with increasing numbers of calibration points, across 100 random orderings of the data set collected in the second experiment (with a human participant). Mean gaze point error was reduced to less than 2 in. (about 2°) in about 12 directed gazes.

A limitation of the geometric calibration procedure described here is its inability to locate the eye along the LoPG. Although this limitation does not impact the ability of the system to predict points of gaze on the screen, it does impact the veracity of the model parameters. Although the geometric model used specifies the eye as a coordinate frame with an origin located some distance from the magnetic tracker sensor, the geometric calibration procedure constrains only the x-axis of this coordinate frame. This results in an infinite number of possible sets of model parameters that still predict the same point of gaze on the screen (i.e., the screen intersection remains the same even if the eye were to slide forward or back along the LoPG or to rotate about its x-axis). This means that the parameters making up tSumlE found by the fitting procedure correspond to one possible location of the eye along the LoPG. Control experiments showed that the fitted location of the eye along the LoPG depended on the initial guess used for the fitting procedure, even though these alternate initial guesses still yielded the same points of gaze (see note 5). If the intermediate results of the model are to be used, for example, for computing the location of the eye relative to the screen for placement of a camera in a virtual environment, the inability of this system to precisely locate the eye might be overcome by performing an additional calibration. For example, a calibration similar to the one used in this article, but requiring the participant to place the eye in other, nonprimary positions of gaze, could be used to find the actual eye location. This technique is currently being investigated.

REFERENCES

Allison, R. S., Eizenman, M., & Cheung, B. S. K. (1996). Combined head and eye tracking system for dynamic testing of the vestibular system. IEEE Transactions on Biomedical Engineering, 43, 1073-1082.

Asai, K., Osawa, N., Takahashi, H., & Sugimoto, Y. Y. (2000, March). Eye mark pointer in immersive projection display. Paper presented at the IEEE Virtual Reality 2000 Conference, New Brunswick, NJ.

Atchison, D. A., & Smith, G. (2000). Optics of the human eye. Oxford: Butterworth-Heinemann.

Barabas, J., Giorgi, R. G., Goldstein, R. B., Apfelbaum, H., Woods, R. L., & Peli, E. (2003, May). Wide field 3D gaze tracking system. Paper presented at the Association for Research in Vision and Ophthalmology Meeting, Fort Lauderdale, FL.

Blood, E. (1990). U.S. Patent No. 4,945,305. Washington, DC: U.S. Patent & Trademark Office.

Bryson, S. (1992, February). Measurement and calibration of static distortion of position data from 3D trackers. Paper presented at SPIE Stereoscopic Displays & Applications III, San Jose.

Caldwell, R. T., Berbaum, K. S., & Borah, J. (2000). Correcting errors in eye-position data arising from the distortion of magnetic fields by display devices. Behavior Research Methods, Instruments, & Computers, 32, 572-578.

Day, J. S., Dumas, G. A., & Murdoch, D. J. (1998). Evaluation of a long-range transmitter for use with a magnetic tracking device in motion analysis. Journal of Biomechanics, 31, 957-961.

Duchowski, A., Medlin, E., Cournia, N., Murphy, H., Gramopadhye, A., Nair, S., Vorah, J., & Melloy, B. (2002). 3-D eye movement analysis. Behavior Research Methods, Instruments, & Computers, 34, 573-591.

Ikits, M., Brederson, J. D., Hansen, C. D., & Hollerbach, J. M. (2001, March). An improved calibration framework for electromagnetic tracking devices. Paper presented at Virtual Reality 2001 Proceedings, IEEE, Yokohama.

Janin, A., Mizell, D., & Caudell, T. (1993, September). Calibration of head-mounted displays for augmented reality applications. Paper presented at the Virtual Reality Annual International Symposium, Seattle.

Kindratenko, V. (1999). Calibration of electromagnetic tracking devices. Virtual Reality: Research, Development & Applications, 4, 139-150.

Kindratenko, V. (2000). A survey of electromagnetic position tracker calibration techniques. Virtual Reality: Research, Development & Applications, 5, 169-182.

King, P. (1995). Integration of helmet-mounted displays into tactical aircraft. Proceedings of the Society for Information Display, 26, 663-668.

Kuipers, J. B. (1999). Quaternions and rotation sequences: A primer with applications to orbits, aerospace, and virtual reality. Princeton, NJ: Princeton University Press.

Leigh, R. J., & Zee, D. S. (1983). The neurology of eye movements. Philadelphia: Davis.

Livingston, M. A., & State, A. (1997). Magnetic tracker calibration for improved augmented reality registration. Presence: Teleoperators & Virtual Environments, 6, 532-546.

Nixon, M. A., McCallum, B. C., Fright, W. R., & Price, N. B. (1998). The effects of metals and interfering fields on electromagnetic trackers. Presence, 7, 204-218.

Southard, D. A. (1995). Viewing model for virtual environment displays. Journal of Electronic Imaging, 4, 413-420.

Young, L. R., & Sheena, D. (1975). Survey of eye movement recording methods. Behavior Research Methods & Instrumentation, 7, 397-429.

NOTES

1. This position is used in projected virtual reality displays to determine how to position the virtual camera when generating an image of a scene.

2. Even if the tracker is not aligned exactly with the case of the transmitter, the correction polynomials (Equations 1 and 2) we used include terms to account for such an offset. These offsets are contained in the constant (i = 0) and linear (i = 1) terms of the polynomials.

3. Two linear systems were constructed from 192 six-variable measurements. Each of the two systems contained 192 × 3 = 576 knowns and 105 unknowns.

4. It should be noted that the calibration procedure described here does not completely constrain the model parameters used. Since neither the orientation of the point of gaze on the screen nor the distance from eye to screen is measured during a sighting, neither of these can be predicted by the model. As a result, the best-fitting model parameters describing tSumlE may converge on any one of a line of solutions corresponding to possible locations of the eye along the LoPG. Each of these eye locations predicts the same LoPG location on the screen. It should be possible to compute correct eye position with a calibration procedure that includes eye-in-head rotation.

5. For this reason, we have chosen to report errors in inches along the screen surface, since angular measures of error are more sensitive to the location of the eye along the LoPG. Where angular error is reported, we estimated the eye-screen distance to be the distance between the magnetic sensor and the screen.


ARCHIVED MATERIALS

The following materials associated with this article are retrievable from the Psychonomic Society's Norms, Stimuli, and Data archive, http://www.psychonomic.org/archive/.

To access the above files or links, search the archive for this article using the journal (Behavior Research Methods, Instruments, & Computers), the first author's name (Barabas), and the publication year (2004).

FILE: Barabas-BRMIC-2004.zip
DESCRIPTION: The compressed archive file contains seven files. The files are:

README.txt, containing a description of scripts and data included in the archive (a 3K text file). This file can be renamed to readme.m and executed as a Matlab script to reproduce some of the results reported in this manuscript.

barabas2004LoPG.mat, a Matlab workspace file containing magnetic tracker data collected in the verification experiments described in this manuscript (a 32K binary file).

findcorrectioncoeffs.m, a Matlab function for finding the distortion compensation coefficients for Equations 1 and 2 (a 5K Matlab M-file).

correctdistortion.m, a Matlab function for performing distortion compensation on magnetic tracker data using Equations 1 and 2 (a 4K Matlab M-file).

findmodelparams.m, a Matlab function for finding the parameters of the LoPG model from data collected during a series of directed gazes (a 3K Matlab M-file).

intersecterror.m, a Matlab function for finding gaze point error for a set of directed gazes (a 5K Matlab M-file).

trackereulerstomtx4.m, a Matlab function that computes a transformation matrix representation of a relationship between coordinate frames via Equation 4 (a 3K Matlab M-file).

AUTHOR'S WEB SITE: http://www.eri.harvard.edu/faculty/peli/index.html

AUTHOR'S E-MAIL ADDRESS: barabas@mit.edu

(Manuscript received July 18, 2003; revision accepted for publication July 23, 2004.)

766 BARABAS ET AL

This may be because for tripod locations closer to thescreen an angular error (due to distortion for example)of the same size results in a larger error distance alongthe screen

Short calibration In order to use our LoPG-trackingsystem as a tool for gathering data from untrained studyparticipants it is advantageous to calibrate the systemusing as few data points as necessary (eg to reduce fa-tigue) To find the number of calibration data pointsneeded to accurately predict point of gaze we createdcalibration sets from subsets of the 100 measurementscollected in each experiment We used these small cali-bration sets to compute tOumlB and tSumlE and then usedthose computed parameters to calculate screen intersec-tions across the full set of 100 measurements We thencompared these computed intersections with the loca-tions of the actual locations of the on-screen targets Weperformed this technique for calibration sets of varyingsizes across 100 random orderings of the 100 data pointscollected in each experiment For the first experimentwe found that after about 20 points the geometric cali-

bration system converges to a point of gaze with a meangaze point error of less than 2 in with only small im-provements attained by including additional points in thecalibration set (Figure 8)

For the second experiment overall gaze point errorwas lower when calibration was performed with the en-tire data set For this data set 2 in of error could bereached with about 12 data points (Figure 9) Calibra-tions performed with more than 15 data points providedonly a small additional benefit

In the processing of calibration data from both exper-iments geometric calibration occasionally failed com-pletely producing larger mean gaze point error than ifgeometric calibration had not been performed at allThese failures occurred with very small calibration setswhen distortion-related errors provided inconsistent con-straints for the model Failures also occurred when setsincluded targets from only one part of the screen insuf-ficiently constraining the model Fitting failures neveroccurred with sets of 10 points or more and never occurredwhen distortion compensation was performed In all the

Figure 7 The mean gaze point error at each of the 10 tripod locations on the left using the raw data from themagnetic tracker collected in the first experiment (see Figure 6A) and on the right using the same data correctedusing both the distortion compensation and the geometric calibration (Figure 6D) The tripod and laser pointerwere directed toward the projection screen which is shown as the tall white rectangle on the right of each panelThe tracker transmitter is shown as the black square Each circle represents one of the tripod locations with ra-dius equal to the mean gaze point error from that location Mean gaze point error for each location is also shownwith the corresponding circle

TRACKING THE LINE OF PRIMARY GAZE 767

cases in which calibration failed the addition of one ortwo calibration points to the set was enough to reversethe failure In general better fits were obtained from cal-ibration sets with greater variety in sampled tracker ori-entations and screen locations

DISCUSSION

The LoPG-tracking model and calibration techniquespresented here succeed at accurately predicting the on-screen point of gaze while participants look through ahead-mounted sight The validation experiments showthat in agreement with Caldwell et al (2000) the use ofpolynomial distortion compensation can improve the ac-curacy of gaze point calculations (Figures 6C and 6D)In addition benefit was also seen from geometric cali-bration The techniques presented for tracking LoPGprovide a foundation for future gaze-tracking systems

An interesting result of performing calibration bothwith and without distortion compensation was that al-though point of gaze was predicted more accurately withdistortion compensation the tracking system still per-formed relatively well in the presence of uncompensateddistortion provided that geometric calibration was per-formed In the presence of measurement errors causedby distortion the minimization procedure gives modelparameters that may not correspond well to the actualspatial relationships being modeled but interact with the

measurement distortion to produce reasonably accuratescreen intersection predictions If somewhat larger LoPG-tracking errors can be tolerated one might be able toachieve reasonable results by omitting distortion compen-sation altogether Omission of distortion compensationwould likely result in much greater error however for headpositions or orientations outside the range of calibration

Possible sources of gaze point errors remaining whenboth geometric calibration and distortion compensationwere performed include measurement noise inaccura-cies in magnetic tracker measurements not corrected bythe distortion compensation system inaccuracy of themodel parameters found by the geometric calibrationsystem or some combination of these Tests of stabilityin the tracker measurements showed that readings fromthe tracker taken milliseconds or hours apart agreedboth differ from reference measurements by only a fewhundredths of an inch and by a few tenths of a degreeThis small measurement noise is clearly not enough toexplain the gaze point errors which are orders of mag-nitude greater

Our method makes the simplifying assumption thatthe distortions in magnetic tracker orientation readingsdepend on location of the magnetic sensor only and noton the interaction between sensor location and orienta-tion Therefore it is likely that our method did not com-pletely correct for the magnetic distortion When themagnetic sensor is kept in a similar orientation in both

mean across all orderings95 confidence interval

14

12

10

8

6

4

2

02 10 20 30 40 50

Number of data points in calibration set

Mea

n ga

ze p

oint

err

or a

cros

s al

l 100

dat

a po

ints

(in

ches

)

Figure 8 Error in computed point of gaze derived when model was calibrated with in-creasing numbers of calibration points across 100 random orderings of the data set collectedin the first experiment (using a laser pointer) Mean gaze point error was reduced to within2 in (about 2ordm) after about 23 directed gazes

768 BARABAS ET AL

distortion compensation calibration and actual experi-mental use this approach is more likely to produce goodresults In the first experiment our method provided animprovement over omitting orientation distortion com-pensation entirely since sensor alignment was similar incalibration and experiment It is likely however that dis-tortion still plays a role in gaze point errors In the sec-ond experiment we found even lower (than that reportedin the Results section) mean gaze point error when mag-netic tracker measurements were corrected for locationdistortion but not for orientation distortion As the mag-netic tracker was rotated on its side when attached to theheadgear of the head-mounted sight (Figure 2) it was nolonger in the relative orientation needed to see a benefitfrom our compensation for orientation distortion Ex-panding the distortion compensation calibration proce-dure to measure the sensor at multiple orientations foreach grid location sampled could further reduce errordue to distortion This change would increase both thenumber of data points required and the time needed toperform the initial data collection procedure but it wouldnot impact per-participant calibration demands

The geometric calibration algorithm seeks a tOumlB andtSumlE yielding a local minimum in gaze point error start-ing from the initial guess derived from hand measure-ments of these parameters Control experiments wereperformed to verify that no other potentially lower min-ima could be found near the minima arrived at by cali-

bration Although the minimization procedure failed toconverge for intentionally inconsistent initial guessestests conducted using parameters displaced from hand-measured values by several inches in all directions allstill converged on the same LoPG Thus uncorrecteddistortion in magnetic tracker measurements was mostlikely the leading cause of gaze point error

Comparison between points of gaze computed with(Figure 6D) and without (Figure 6C) the geometric cali-bration step shows that attempts to hand measure the spa-tial relationships tOumlB and tSumlE (ie without geometriccalibration) lead to relatively large gaze point errors Ap-plication of the geometric fitting procedure substantiallyimproves the ability of the model to compute LoPG Al-though it is not surprising that better results can be ob-tained when calibration is performed it is interesting tonote that the parameters resulting from calibration deviatedconsiderably from hand-measured parameters In factthe optimized parameters comprising tSumlE indicatedspatial relationships that were clearly inconsistent withthe actual configuration of the laser and magnetic sensorFor example parameters producing the least mean gazepoint error contained values for the distance between themagnetic sensor and the LoPG that were several inchesgreater than physical measurements indicated Althoughthere is some imprecision (we estimate about a quarter ofan inch) in measuring straight-line distance between eyeand magnetic sensor this imprecision alone cannot ex-

mean across all orderings95 confidence interval

14

12

10

8

6

4

2

02 10 20 30 40 50

Number of data points in calibration set

Mea

n ga

ze p

oint

err

or a

cros

s al

l 100

dat

a po

ints

(in

ches

)

Figure 9 Error in computed point of gaze derived when model was calibrated with in-creasing numbers of calibration points across 100 random orderings of the data set collectedin the second experiment (with a human participant) Mean gaze point error was reduced toless than 2 in (about 2ordm) in about 12 directed gazes

TRACKING THE LINE OF PRIMARY GAZE 769

plain the difference Since calibration performed withdata from the human participant seemed less prone tothis inconsistency we suspect that the constraints im-posed on the movement of the laser and sensor by the tri-pod head may be responsible In positioning the laser toalign it with calibration targets the sensor remainedwithin 47ordm of level and ldquorollrdquo of the sensor could not beperformed These constraints lead to less variation in thecalibration data and in turn may have underconstrainedthe fitting procedure used in geometric calibration

A limitation of the geometric calibration procedure de-scribed here is the inability of the calibration procedureto locate the eye along the LoPG Although this limita-tion does not impact the ability of the system to predictpoints of gaze on the screen it does impact the veracityof the model parameters Although the geometric modelused specifies the eye as a coordinate frame with an ori-gin located some distance from the magnetic tracker sen-sor the geometric calibration procedure constrains onlythe x-axis of this coordinate frame This results in an in-finite number of possible sets of model parameters thatstill predict the same point of gaze on the screen (iescreen intersection remains the same even if the eye wereto slide forward or back along the LoPG or to rotate aboutits x-axis) This means that the parameters making uptSumlE found by the fitting procedure correspond to onepossible location of the eye along the LoPG Control ex-periments showed that the fitted location of the eye alongthe LoPG depended on the initial guess used for the fit-ting procedure even though these alternate initialguesses still yielded the same points of gaze5 If the in-termediate results of the model are to be used for exam-ple for computing the location of the eye relative to thescreen for placement of a camera in a virtual environ-ment the inability of this system to precisely locate theeye might be overcome by performing an additional cal-ibration For example a calibration similar to the oneused in this article but requiring the participant to placethe eye in other nonprimary positions of gaze could beused to find actual eye location This technique is cur-rently being investigated

REFERENCES

Allison R S Eizenman M amp Cheung B S K (1996) Combinedhead and eye tracking system for dynamic testing of the vestibularsystem IEEE Transactions on Biomedical Engineering 43 1073-1082

Asai K Osawa N Takahashi H amp Sugimoto Y Y (2000 March)Eye mark pointer in immersive projection display Paper presented atthe IEEE Virtual Reality 2000 Conference New Brunswick NJ

Atchison D A amp Smith G (2000) Optics of the human eye OxfordButterworth-Heinemann

Barabas J Giorgi R G Goldstein R B Apfelbaum H WoodsR L amp Peli E (2003 May) Wide field 3D gaze tracking systemPaper presented at the Association for Research in Vision and Oph-thalmology Meeting Fort Lauderdale FL

Blood E (1990) US Patent No 4945305 Washington DC USPatent amp Trademark Office

Bryson S (1992 February) Measurement and calibration of staticdistortion of position data from 3D trackers Paper presented at SPIEStereoscopic Displays amp Applications III San Jose

Caldwell R T Berbaum K S amp Borah J (2000) Correcting er-rors in eye-position data arising from the distortion of magnetic fieldsby display devices Behavior Research Methods Instruments ampComputers 32 572-578

Day J S Dumas G A amp Murdoch D J (1998) Evaluation of along-range transmitter for use with a magnetic tracking device in mo-tion analysis Journal of Biomechanics 31 957-961

Duchowski A Medlin E Cournia N Murphy H Gramopad-hye A Nair S Vorah J amp Melloy B (2002) 3-D eye move-ment analysis Behavior Research Methods Instruments amp Comput-ers 34 573-591

Ikits M Brederson J D Hansen C D amp Hollerbach J M(2001 March) An improved calibration framework for electromag-netic tracking devices Paper presented at the Virtual Reality 2001Proceedings IEEE Yokohama

Janin A Mizell D amp Caudell T (1993 September) Calibrationof head-mounted displays for augmented reality applications Paperpresented at the Virtual Reality Annual International SymposiumSeattle

Kindratenko V (1999) Calibration of electromagnetic tracking de-vices Virtual Reality Research Development amp Applications 4139-150

Kindratenko V (2000) A survey of electromagnetic position trackercalibration techniques Virtual Reality Research Development ampApplications 5 169-182

King P (1995) Integration of helmet-mounted displays into tacticalaircraft Proceedings of the Society for Information Display 26 663-668

Kuipers J B (1999) Quaternions and rotation sequences A primerwith applications to orbits aerospace and virtual reality PrincetonNJ Princeton University Press

Leigh R J amp Zee D S (1983) The neurology of eye movementsPhiladelphia Davis

Livingston M A amp State A (1997) Magnetic tracker calibrationfor improved augmented reality registration Presence Teleoperatorsamp Virtual Environments 6 532-546

Nixon M A McCallum B C Fright W R amp Price N B (1998)The effects of metals and interfering fields on electromagnetic track-ers Presence 7 204-218

Southard D A (1995) Viewing model for virtual environment dis-plays Journal of Electronic Imaging 4 413-420

Young L R amp Sheena D (1975) Survey of eye movement recordingmethods Behavior Research Methods amp Instrumentation 7 397-429

NOTES

1 This position is used in projected virtual reality displays to deter-mine how to position the virtual camera when generating an image of ascene

2 Even if the tracker is not aligned exactly with the case of the trans-mitter the correction polynomials (Equations 1 and 2) we used includeterms to account for such an offset These offsets are contained in theconstant (i 0) and linear (i 1) terms of polynomials

3 Two linear systems were constructed from 192 six-variable mea-surements Each of the two systems contained 192 3 576 knownsand 105 unknowns

4 It should be noted that the calibration procedure described heredoes not completely constrain the model parameters used Since no mea-surement of orientation on the screen of the point of gaze or distancefrom eye to screen during a sighting is made neither of these can be pre-dicted by the model As a result the best-fitting model parameters de-scribing tSumlE may converge on any one of a line of solutions corre-sponding to possible locations of the eye along the LoPG Each of theseeye locations predicts the same LoPG location on the screen It shouldbe possible to compute correct eye position with a calibration proce-dure that includes eye-in-head rotation

5 For this reason we have chosen to report errors in inches along thescreen surface since angular measures of error are more sensitive to lo-cation of the eye along the LoPG Where angular error is reported weestimated the eyendashscreen distance to be the distance between the mag-netic sensor and the screen

770 BARABAS ET AL

ARCHIVED MATERIALS

The following materials associated with this article are retrievablefrom the Psychonomic Societyrsquos Norms Stimuli and Data archivehttpwwwpsychonomicorgarchive

To access the above files or links search the archive for this articleusing the journal (Behavior Research Methods Instruments amp Com-puters) the first authorrsquos name (Barabas) and the publication year(2004)

FILE Barabas-BRMIC-2004zipDESCRIPTION The compressed archive file contains seven files The

files areREADMEtxt containing a description of scripts and data included

in the archive (a 3K text file) This file can renamed to readmem andexecuted as a Matlab script to reproduce some of the results reported inthis manuscript

barabas2004LoPGmat a Matlab workspace file containing mag-netic tracker data collected in the verification experiments described inthis manuscript (a 32K binary file)

findcorrectioncoeffsm a Matlab function for finding the distortioncompensation coefficients for Equations 1 and 2 (a 5K Matlab M-file)

correctdistortionm a Matlab function for performing distortioncompensation on magnetic tracker data using Equations 1 and 2 (a 4KMatlab M-file)

findmodelparamsm a Matlab function for finding the parameters ofthe LoPG model from data collected during a series of directed gazes (a3K Matlab M-file)

intersecterrorm a Matlab function for finding gaze point error for aset of directed gazes (a 5K Matlab M-file)

trackereulerstomtx4m a Matlab function that computes a transfor-mation matrix representation of a relationship between coordinateframes via Equation 4 (a 3K Matlab M-file)

AUTHORrsquoS WEB SITE httpwwweriharvardedufacultypeliindexhtml

AUTHORrsquoS E-MAIL ADDRESS barabasmitedu

(Manuscript received July 18 2003revision accepted for publication July 23 2004)

TRACKING THE LINE OF PRIMARY GAZE 767

cases in which calibration failed the addition of one ortwo calibration points to the set was enough to reversethe failure In general better fits were obtained from cal-ibration sets with greater variety in sampled tracker ori-entations and screen locations

DISCUSSION

The LoPG-tracking model and calibration techniquespresented here succeed at accurately predicting the on-screen point of gaze while participants look through ahead-mounted sight The validation experiments showthat in agreement with Caldwell et al (2000) the use ofpolynomial distortion compensation can improve the ac-curacy of gaze point calculations (Figures 6C and 6D)In addition benefit was also seen from geometric cali-bration The techniques presented for tracking LoPGprovide a foundation for future gaze-tracking systems

An interesting result of performing calibration bothwith and without distortion compensation was that al-though point of gaze was predicted more accurately withdistortion compensation the tracking system still per-formed relatively well in the presence of uncompensateddistortion provided that geometric calibration was per-formed In the presence of measurement errors causedby distortion the minimization procedure gives modelparameters that may not correspond well to the actualspatial relationships being modeled but interact with the

measurement distortion to produce reasonably accuratescreen intersection predictions If somewhat larger LoPG-tracking errors can be tolerated one might be able toachieve reasonable results by omitting distortion compen-sation altogether Omission of distortion compensationwould likely result in much greater error however for headpositions or orientations outside the range of calibration

Possible sources of gaze point errors remaining whenboth geometric calibration and distortion compensationwere performed include measurement noise inaccura-cies in magnetic tracker measurements not corrected bythe distortion compensation system inaccuracy of themodel parameters found by the geometric calibrationsystem or some combination of these Tests of stabilityin the tracker measurements showed that readings fromthe tracker taken milliseconds or hours apart agreedboth differ from reference measurements by only a fewhundredths of an inch and by a few tenths of a degreeThis small measurement noise is clearly not enough toexplain the gaze point errors which are orders of mag-nitude greater

Our method makes the simplifying assumption thatthe distortions in magnetic tracker orientation readingsdepend on location of the magnetic sensor only and noton the interaction between sensor location and orienta-tion Therefore it is likely that our method did not com-pletely correct for the magnetic distortion When themagnetic sensor is kept in a similar orientation in both

mean across all orderings95 confidence interval

14

12

10

8

6

4

2

02 10 20 30 40 50

Number of data points in calibration set

Mea

n ga

ze p

oint

err

or a

cros

s al

l 100

dat

a po

ints

(in

ches

)

Figure 8 Error in computed point of gaze derived when model was calibrated with in-creasing numbers of calibration points across 100 random orderings of the data set collectedin the first experiment (using a laser pointer) Mean gaze point error was reduced to within2 in (about 2ordm) after about 23 directed gazes

768 BARABAS ET AL

distortion compensation calibration and actual experi-mental use this approach is more likely to produce goodresults In the first experiment our method provided animprovement over omitting orientation distortion com-pensation entirely since sensor alignment was similar incalibration and experiment It is likely however that dis-tortion still plays a role in gaze point errors In the sec-ond experiment we found even lower (than that reportedin the Results section) mean gaze point error when mag-netic tracker measurements were corrected for locationdistortion but not for orientation distortion As the mag-netic tracker was rotated on its side when attached to theheadgear of the head-mounted sight (Figure 2) it was nolonger in the relative orientation needed to see a benefitfrom our compensation for orientation distortion Ex-panding the distortion compensation calibration proce-dure to measure the sensor at multiple orientations foreach grid location sampled could further reduce errordue to distortion This change would increase both thenumber of data points required and the time needed toperform the initial data collection procedure but it wouldnot impact per-participant calibration demands

The geometric calibration algorithm seeks a tOumlB andtSumlE yielding a local minimum in gaze point error start-ing from the initial guess derived from hand measure-ments of these parameters Control experiments wereperformed to verify that no other potentially lower min-ima could be found near the minima arrived at by cali-

bration Although the minimization procedure failed toconverge for intentionally inconsistent initial guessestests conducted using parameters displaced from hand-measured values by several inches in all directions allstill converged on the same LoPG Thus uncorrecteddistortion in magnetic tracker measurements was mostlikely the leading cause of gaze point error

Comparison between points of gaze computed with(Figure 6D) and without (Figure 6C) the geometric cali-bration step shows that attempts to hand measure the spa-tial relationships tOumlB and tSumlE (ie without geometriccalibration) lead to relatively large gaze point errors Ap-plication of the geometric fitting procedure substantiallyimproves the ability of the model to compute LoPG Al-though it is not surprising that better results can be ob-tained when calibration is performed it is interesting tonote that the parameters resulting from calibration deviatedconsiderably from hand-measured parameters In factthe optimized parameters comprising tSumlE indicatedspatial relationships that were clearly inconsistent withthe actual configuration of the laser and magnetic sensorFor example parameters producing the least mean gazepoint error contained values for the distance between themagnetic sensor and the LoPG that were several inchesgreater than physical measurements indicated Althoughthere is some imprecision (we estimate about a quarter ofan inch) in measuring straight-line distance between eyeand magnetic sensor this imprecision alone cannot ex-

mean across all orderings95 confidence interval

14

12

10

8

6

4

2

02 10 20 30 40 50

Number of data points in calibration set

Mea

n ga

ze p

oint

err

or a

cros

s al

l 100

dat

a po

ints

(in

ches

)

Figure 9 Error in computed point of gaze derived when model was calibrated with in-creasing numbers of calibration points across 100 random orderings of the data set collectedin the second experiment (with a human participant) Mean gaze point error was reduced toless than 2 in (about 2ordm) in about 12 directed gazes

TRACKING THE LINE OF PRIMARY GAZE 769

plain the difference Since calibration performed withdata from the human participant seemed less prone tothis inconsistency we suspect that the constraints im-posed on the movement of the laser and sensor by the tri-pod head may be responsible In positioning the laser toalign it with calibration targets the sensor remainedwithin 47ordm of level and ldquorollrdquo of the sensor could not beperformed These constraints lead to less variation in thecalibration data and in turn may have underconstrainedthe fitting procedure used in geometric calibration

A limitation of the geometric calibration procedure de-scribed here is the inability of the calibration procedureto locate the eye along the LoPG Although this limita-tion does not impact the ability of the system to predictpoints of gaze on the screen it does impact the veracityof the model parameters Although the geometric modelused specifies the eye as a coordinate frame with an ori-gin located some distance from the magnetic tracker sen-sor the geometric calibration procedure constrains onlythe x-axis of this coordinate frame This results in an in-finite number of possible sets of model parameters thatstill predict the same point of gaze on the screen (iescreen intersection remains the same even if the eye wereto slide forward or back along the LoPG or to rotate aboutits x-axis) This means that the parameters making uptSumlE found by the fitting procedure correspond to onepossible location of the eye along the LoPG Control ex-periments showed that the fitted location of the eye alongthe LoPG depended on the initial guess used for the fit-ting procedure even though these alternate initialguesses still yielded the same points of gaze5 If the in-termediate results of the model are to be used for exam-ple for computing the location of the eye relative to thescreen for placement of a camera in a virtual environ-ment the inability of this system to precisely locate theeye might be overcome by performing an additional cal-ibration For example a calibration similar to the oneused in this article but requiring the participant to placethe eye in other nonprimary positions of gaze could beused to find actual eye location This technique is cur-rently being investigated

REFERENCES

Allison R S Eizenman M amp Cheung B S K (1996) Combinedhead and eye tracking system for dynamic testing of the vestibularsystem IEEE Transactions on Biomedical Engineering 43 1073-1082

Asai K Osawa N Takahashi H amp Sugimoto Y Y (2000 March)Eye mark pointer in immersive projection display Paper presented atthe IEEE Virtual Reality 2000 Conference New Brunswick NJ

Atchison D A amp Smith G (2000) Optics of the human eye OxfordButterworth-Heinemann

Barabas J Giorgi R G Goldstein R B Apfelbaum H WoodsR L amp Peli E (2003 May) Wide field 3D gaze tracking systemPaper presented at the Association for Research in Vision and Oph-thalmology Meeting Fort Lauderdale FL

Blood E (1990) US Patent No 4945305 Washington DC USPatent amp Trademark Office

Bryson S (1992 February) Measurement and calibration of staticdistortion of position data from 3D trackers Paper presented at SPIEStereoscopic Displays amp Applications III San Jose

Caldwell R T Berbaum K S amp Borah J (2000) Correcting er-rors in eye-position data arising from the distortion of magnetic fieldsby display devices Behavior Research Methods Instruments ampComputers 32 572-578

Day J S Dumas G A amp Murdoch D J (1998) Evaluation of along-range transmitter for use with a magnetic tracking device in mo-tion analysis Journal of Biomechanics 31 957-961

Duchowski A Medlin E Cournia N Murphy H Gramopad-hye A Nair S Vorah J amp Melloy B (2002) 3-D eye move-ment analysis Behavior Research Methods Instruments amp Comput-ers 34 573-591

Ikits M Brederson J D Hansen C D amp Hollerbach J M(2001 March) An improved calibration framework for electromag-netic tracking devices Paper presented at the Virtual Reality 2001Proceedings IEEE Yokohama

Janin A Mizell D amp Caudell T (1993 September) Calibrationof head-mounted displays for augmented reality applications Paperpresented at the Virtual Reality Annual International SymposiumSeattle

Kindratenko V (1999) Calibration of electromagnetic tracking de-vices Virtual Reality Research Development amp Applications 4139-150

Kindratenko V (2000) A survey of electromagnetic position trackercalibration techniques Virtual Reality Research Development ampApplications 5 169-182

King P (1995) Integration of helmet-mounted displays into tacticalaircraft Proceedings of the Society for Information Display 26 663-668

Kuipers J B (1999) Quaternions and rotation sequences A primerwith applications to orbits aerospace and virtual reality PrincetonNJ Princeton University Press

Leigh R J amp Zee D S (1983) The neurology of eye movementsPhiladelphia Davis

Livingston M A amp State A (1997) Magnetic tracker calibrationfor improved augmented reality registration Presence Teleoperatorsamp Virtual Environments 6 532-546

Nixon M A McCallum B C Fright W R amp Price N B (1998)The effects of metals and interfering fields on electromagnetic track-ers Presence 7 204-218

Southard D A (1995) Viewing model for virtual environment dis-plays Journal of Electronic Imaging 4 413-420

Young L R amp Sheena D (1975) Survey of eye movement recordingmethods Behavior Research Methods amp Instrumentation 7 397-429

NOTES

1 This position is used in projected virtual reality displays to deter-mine how to position the virtual camera when generating an image of ascene

2 Even if the tracker is not aligned exactly with the case of the trans-mitter the correction polynomials (Equations 1 and 2) we used includeterms to account for such an offset These offsets are contained in theconstant (i 0) and linear (i 1) terms of polynomials

3 Two linear systems were constructed from 192 six-variable mea-surements Each of the two systems contained 192 3 576 knownsand 105 unknowns

4 It should be noted that the calibration procedure described heredoes not completely constrain the model parameters used Since no mea-surement of orientation on the screen of the point of gaze or distancefrom eye to screen during a sighting is made neither of these can be pre-dicted by the model As a result the best-fitting model parameters de-scribing tSumlE may converge on any one of a line of solutions corre-sponding to possible locations of the eye along the LoPG Each of theseeye locations predicts the same LoPG location on the screen It shouldbe possible to compute correct eye position with a calibration proce-dure that includes eye-in-head rotation

5 For this reason we have chosen to report errors in inches along thescreen surface since angular measures of error are more sensitive to lo-cation of the eye along the LoPG Where angular error is reported weestimated the eyendashscreen distance to be the distance between the mag-netic sensor and the screen

770 BARABAS ET AL

ARCHIVED MATERIALS

The following materials associated with this article are retrievablefrom the Psychonomic Societyrsquos Norms Stimuli and Data archivehttpwwwpsychonomicorgarchive

To access the above files or links search the archive for this articleusing the journal (Behavior Research Methods Instruments amp Com-puters) the first authorrsquos name (Barabas) and the publication year(2004)

FILE Barabas-BRMIC-2004zipDESCRIPTION The compressed archive file contains seven files The

files areREADMEtxt containing a description of scripts and data included

in the archive (a 3K text file) This file can renamed to readmem andexecuted as a Matlab script to reproduce some of the results reported inthis manuscript

barabas2004LoPGmat a Matlab workspace file containing mag-netic tracker data collected in the verification experiments described inthis manuscript (a 32K binary file)

findcorrectioncoeffsm a Matlab function for finding the distortioncompensation coefficients for Equations 1 and 2 (a 5K Matlab M-file)

correctdistortionm a Matlab function for performing distortioncompensation on magnetic tracker data using Equations 1 and 2 (a 4KMatlab M-file)

findmodelparamsm a Matlab function for finding the parameters ofthe LoPG model from data collected during a series of directed gazes (a3K Matlab M-file)

intersecterrorm a Matlab function for finding gaze point error for aset of directed gazes (a 5K Matlab M-file)

trackereulerstomtx4m a Matlab function that computes a transfor-mation matrix representation of a relationship between coordinateframes via Equation 4 (a 3K Matlab M-file)

AUTHORrsquoS WEB SITE httpwwweriharvardedufacultypeliindexhtml

AUTHORrsquoS E-MAIL ADDRESS barabasmitedu

(Manuscript received July 18 2003revision accepted for publication July 23 2004)

768 BARABAS ET AL

distortion compensation calibration and actual experi-mental use this approach is more likely to produce goodresults In the first experiment our method provided animprovement over omitting orientation distortion com-pensation entirely since sensor alignment was similar incalibration and experiment It is likely however that dis-tortion still plays a role in gaze point errors In the sec-ond experiment we found even lower (than that reportedin the Results section) mean gaze point error when mag-netic tracker measurements were corrected for locationdistortion but not for orientation distortion As the mag-netic tracker was rotated on its side when attached to theheadgear of the head-mounted sight (Figure 2) it was nolonger in the relative orientation needed to see a benefitfrom our compensation for orientation distortion Ex-panding the distortion compensation calibration proce-dure to measure the sensor at multiple orientations foreach grid location sampled could further reduce errordue to distortion This change would increase both thenumber of data points required and the time needed toperform the initial data collection procedure but it wouldnot impact per-participant calibration demands

The geometric calibration algorithm seeks a tOumlB andtSumlE yielding a local minimum in gaze point error start-ing from the initial guess derived from hand measure-ments of these parameters Control experiments wereperformed to verify that no other potentially lower min-ima could be found near the minima arrived at by cali-

bration Although the minimization procedure failed toconverge for intentionally inconsistent initial guessestests conducted using parameters displaced from hand-measured values by several inches in all directions allstill converged on the same LoPG Thus uncorrecteddistortion in magnetic tracker measurements was mostlikely the leading cause of gaze point error

Comparison between points of gaze computed with(Figure 6D) and without (Figure 6C) the geometric cali-bration step shows that attempts to hand measure the spa-tial relationships tOumlB and tSumlE (ie without geometriccalibration) lead to relatively large gaze point errors Ap-plication of the geometric fitting procedure substantiallyimproves the ability of the model to compute LoPG Al-though it is not surprising that better results can be ob-tained when calibration is performed it is interesting tonote that the parameters resulting from calibration deviatedconsiderably from hand-measured parameters In factthe optimized parameters comprising tSumlE indicatedspatial relationships that were clearly inconsistent withthe actual configuration of the laser and magnetic sensorFor example parameters producing the least mean gazepoint error contained values for the distance between themagnetic sensor and the LoPG that were several inchesgreater than physical measurements indicated Althoughthere is some imprecision (we estimate about a quarter ofan inch) in measuring straight-line distance between eyeand magnetic sensor this imprecision alone cannot ex-

mean across all orderings95 confidence interval

14

12

10

8

6

4

2

02 10 20 30 40 50

Number of data points in calibration set

Mea

n ga

ze p

oint

err

or a

cros

s al

l 100

dat

a po

ints

(in

ches

)

Figure 9 Error in computed point of gaze derived when model was calibrated with in-creasing numbers of calibration points across 100 random orderings of the data set collectedin the second experiment (with a human participant) Mean gaze point error was reduced toless than 2 in (about 2ordm) in about 12 directed gazes

TRACKING THE LINE OF PRIMARY GAZE 769

plain the difference Since calibration performed withdata from the human participant seemed less prone tothis inconsistency we suspect that the constraints im-posed on the movement of the laser and sensor by the tri-pod head may be responsible In positioning the laser toalign it with calibration targets the sensor remainedwithin 47ordm of level and ldquorollrdquo of the sensor could not beperformed These constraints lead to less variation in thecalibration data and in turn may have underconstrainedthe fitting procedure used in geometric calibration

A limitation of the geometric calibration procedure de-scribed here is the inability of the calibration procedureto locate the eye along the LoPG Although this limita-tion does not impact the ability of the system to predictpoints of gaze on the screen it does impact the veracityof the model parameters Although the geometric modelused specifies the eye as a coordinate frame with an ori-gin located some distance from the magnetic tracker sen-sor the geometric calibration procedure constrains onlythe x-axis of this coordinate frame This results in an in-finite number of possible sets of model parameters thatstill predict the same point of gaze on the screen (iescreen intersection remains the same even if the eye wereto slide forward or back along the LoPG or to rotate aboutits x-axis) This means that the parameters making uptSumlE found by the fitting procedure correspond to onepossible location of the eye along the LoPG Control ex-periments showed that the fitted location of the eye alongthe LoPG depended on the initial guess used for the fit-ting procedure even though these alternate initialguesses still yielded the same points of gaze5 If the in-termediate results of the model are to be used for exam-ple for computing the location of the eye relative to thescreen for placement of a camera in a virtual environ-ment the inability of this system to precisely locate theeye might be overcome by performing an additional cal-ibration For example a calibration similar to the oneused in this article but requiring the participant to placethe eye in other nonprimary positions of gaze could beused to find actual eye location This technique is cur-rently being investigated

REFERENCES

Allison, R. S., Eizenman, M., & Cheung, B. S. K. (1996). Combined head and eye tracking system for dynamic testing of the vestibular system. IEEE Transactions on Biomedical Engineering, 43, 1073-1082.

Asai, K., Osawa, N., Takahashi, H., & Sugimoto, Y. Y. (2000, March). Eye mark pointer in immersive projection display. Paper presented at the IEEE Virtual Reality 2000 Conference, New Brunswick, NJ.

Atchison, D. A., & Smith, G. (2000). Optics of the human eye. Oxford: Butterworth-Heinemann.

Barabas, J., Giorgi, R. G., Goldstein, R. B., Apfelbaum, H., Woods, R. L., & Peli, E. (2003, May). Wide field 3D gaze tracking system. Paper presented at the Association for Research in Vision and Ophthalmology Meeting, Fort Lauderdale, FL.

Blood, E. (1990). U.S. Patent No. 4,945,305. Washington, DC: U.S. Patent & Trademark Office.

Bryson, S. (1992, February). Measurement and calibration of static distortion of position data from 3D trackers. Paper presented at SPIE Stereoscopic Displays & Applications III, San Jose.

Caldwell, R. T., Berbaum, K. S., & Borah, J. (2000). Correcting errors in eye-position data arising from the distortion of magnetic fields by display devices. Behavior Research Methods, Instruments, & Computers, 32, 572-578.

Day, J. S., Dumas, G. A., & Murdoch, D. J. (1998). Evaluation of a long-range transmitter for use with a magnetic tracking device in motion analysis. Journal of Biomechanics, 31, 957-961.

Duchowski, A., Medlin, E., Cournia, N., Murphy, H., Gramopadhye, A., Nair, S., Vorah, J., & Melloy, B. (2002). 3-D eye movement analysis. Behavior Research Methods, Instruments, & Computers, 34, 573-591.

Ikits, M., Brederson, J. D., Hansen, C. D., & Hollerbach, J. M. (2001, March). An improved calibration framework for electromagnetic tracking devices. Paper presented at the Virtual Reality 2001 Conference, IEEE, Yokohama.

Janin, A., Mizell, D., & Caudell, T. (1993, September). Calibration of head-mounted displays for augmented reality applications. Paper presented at the Virtual Reality Annual International Symposium, Seattle.

Kindratenko, V. (1999). Calibration of electromagnetic tracking devices. Virtual Reality: Research, Development & Applications, 4, 139-150.

Kindratenko, V. (2000). A survey of electromagnetic position tracker calibration techniques. Virtual Reality: Research, Development & Applications, 5, 169-182.

King, P. (1995). Integration of helmet-mounted displays into tactical aircraft. Proceedings of the Society for Information Display, 26, 663-668.

Kuipers, J. B. (1999). Quaternions and rotation sequences: A primer with applications to orbits, aerospace, and virtual reality. Princeton, NJ: Princeton University Press.

Leigh, R. J., & Zee, D. S. (1983). The neurology of eye movements. Philadelphia: Davis.

Livingston, M. A., & State, A. (1997). Magnetic tracker calibration for improved augmented reality registration. Presence: Teleoperators & Virtual Environments, 6, 532-546.

Nixon, M. A., McCallum, B. C., Fright, W. R., & Price, N. B. (1998). The effects of metals and interfering fields on electromagnetic trackers. Presence, 7, 204-218.

Southard, D. A. (1995). Viewing model for virtual environment displays. Journal of Electronic Imaging, 4, 413-420.

Young, L. R., & Sheena, D. (1975). Survey of eye movement recording methods. Behavior Research Methods & Instrumentation, 7, 397-429.

NOTES

1. This position is used in projected virtual reality displays to determine how to position the virtual camera when generating an image of a scene.

2. Even if the tracker is not aligned exactly with the case of the transmitter, the correction polynomials (Equations 1 and 2) we used include terms to account for such an offset. These offsets are contained in the constant (i = 0) and linear (i = 1) terms of the polynomials.

3. Two linear systems were constructed from 192 six-variable measurements. Each of the two systems contained 192 × 3 = 576 knowns and 105 unknowns.

4. It should be noted that the calibration procedure described here does not completely constrain the model parameters used. Since no measurement is made of the orientation on the screen of the point of gaze, or of the distance from eye to screen during a sighting, neither of these can be predicted by the model. As a result, the best-fitting model parameters describing tS←E may converge on any one of a line of solutions corresponding to possible locations of the eye along the LoPG. Each of these eye locations predicts the same LoPG location on the screen. It should be possible to compute the correct eye position with a calibration procedure that includes eye-in-head rotation.

5. For this reason, we have chosen to report errors in inches along the screen surface, since angular measures of error are more sensitive to the location of the eye along the LoPG. Where angular error is reported, we estimated the eye–screen distance to be the distance between the magnetic sensor and the screen.
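To make the sensitivity noted in Note 5 concrete, the short sketch below converts a screen error in inches to an angular error in degrees. The eye-to-screen distance of 57 in. is a placeholder chosen only so that a 2-in. error corresponds to roughly 2°, the scale quoted in the Figure 9 caption; it is not a measured value from the experiments.

% Hypothetical conversion of screen error (inches) to angular error (degrees).
screenErr = 2;                           % gaze point error along the screen, in inches
eyeDist   = 57;                          % assumed eye-to-screen distance, in inches
angErr    = atand(screenErr / eyeDist);  % about 2.0 degrees

% The same 2-in. screen error viewed from 10 in. closer or farther:
atand(screenErr / (eyeDist - 10))        % about 2.4 degrees
atand(screenErr / (eyeDist + 10))        % about 1.7 degrees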


ARCHIVED MATERIALS

The following materials associated with this article are retrievable from the Psychonomic Society's Norms, Stimuli, and Data archive, http://www.psychonomic.org/archive.

To access the above files or links, search the archive for this article using the journal (Behavior Research Methods, Instruments, & Computers), the first author's name (Barabas), and the publication year (2004).

FILE: Barabas-BRMIC-2004.zip
DESCRIPTION: The compressed archive file contains seven files. The files are:

README.txt, containing a description of scripts and data included in the archive (a 3K text file). This file can be renamed to readme.m and executed as a Matlab script to reproduce some of the results reported in this manuscript.

barabas2004LoPG.mat, a Matlab workspace file containing magnetic tracker data collected in the verification experiments described in this manuscript (a 32K binary file).

findcorrectioncoeffs.m, a Matlab function for finding the distortion compensation coefficients for Equations 1 and 2 (a 5K Matlab M-file).

correctdistortion.m, a Matlab function for performing distortion compensation on magnetic tracker data using Equations 1 and 2 (a 4K Matlab M-file).

findmodelparams.m, a Matlab function for finding the parameters of the LoPG model from data collected during a series of directed gazes (a 3K Matlab M-file).

intersecterror.m, a Matlab function for finding gaze point error for a set of directed gazes (a 5K Matlab M-file).

trackereulerstomtx4.m, a Matlab function that computes a transformation matrix representation of a relationship between coordinate frames via Equation 4 (a 3K Matlab M-file).
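A rough outline of how the archived functions fit together is sketched below; readme.m is the authoritative driver. The argument lists shown here are assumptions made for illustration (the actual signatures are documented in README.txt), and rawReadings, surveyedPositions, and targetLocations are hypothetical variable names.

% Hypothetical outline of the archived workflow (argument lists are assumed).
load('barabas2004LoPG.mat');                     % magnetic tracker data from the experiments

% 1. Fit and apply the distortion-compensation polynomials (Equations 1 and 2).
coeffs    = findcorrectioncoeffs(rawReadings, surveyedPositions);
corrected = correctdistortion(rawReadings, coeffs);

% 2. Fit the LoPG model parameters from a series of directed gazes.
params = findmodelparams(corrected, targetLocations);

% 3. Report the gaze point error for the directed gazes.
err = intersecterror(params, corrected, targetLocations);
fprintf('Mean gaze point error: %.2f in.\n', mean(err));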

AUTHOR'S WEB SITE: http://www.eri.harvard.edu/faculty/peli/index.html

AUTHOR'S E-MAIL ADDRESS: barabas@mit.edu

(Manuscript received July 18, 2003; revision accepted for publication July 23, 2004.)
