
The RIT Visual Perception Laboratory

Members of RIT’s Visual Perception Laboratory (VPL), housed in the Chester F. Carlson Center for Imaging Science, make use of state-of-the-art instrumentation to monitor and analyze observers’ gaze patterns as they perform a wide range of tasks. Head-mounted, video-based eyetrackers manufactured by Applied Science Laboratories and ISCAN are used to track participants’ gaze patterns in the laboratory. Coupled with a magnetic-field head-tracking system, it is possible to monitor observers’ point of regard in image coordinates without the need to restrain the head. The Visual Perception Lab is also equipped with a ‘remote’ eye-imaging camera that allows gaze patterns to be monitored without head-mounted instrumentation. The VPL is equipped with a range of display devices including a Sony high-resolution CRT, an Apple 22” Cinema Display, a Sharp 37” Aquos LCTV, and a Pioneer 503CMX 50” plasma display. Computing resources include Silicon Graphics, PPC- and Intel-based Macintosh, and PC workstations.

In addition to the commercially available eyetracking systems, RIT’s Visual Perception Laboratory continues to innovate in the design, fabrication, and application of novel eyetracking systems. The 3rd-Generation RIT Wearable Eyetracker is a unique device that is capable of monitoring observers’ gaze while they are engaged in active behaviors ranging from watching a classroom lecture to playing racquetball.

Current research in RIT’s Visual Perception Laboratory is funded by the National Science Foundation; the Naval Research Laboratory; the New York State Office of Science, Technology, and Academic Research’s Center for Electronic Imaging Systems; and industrial sponsors. Research is conducted by graduate and undergraduate students under the direction of faculty from the Center for Imaging Science in the College of Science and the Department of Psychology in the College of Liberal Arts.

Key personnel:

Jeff Pelz, Associate Professor, Imaging Science (www.cis.rit.edu/pelz)
Andrew Herbert, Assistant Professor, Psychology (www.rit.edu/~amhgss)
Mitch Rosen, Research Professor, Imaging Science (www.cis.rit.edu/rosen)

Graduate Students: Feng Li & Susan Kolakowski
Undergraduate Students: Chris Louten & Tony Mok

Imaging Science Summer High-School Interns (2 interns, TBD)


Research Projects: Interns may be involved in several projects during Summer ’06. In each case, we will be applying eyetracking systems to study issues in high-level (i.e., cognitive vs. physiological) visual perception.

1. Change blindness

Despite the seeming ease with which we perceive the world, visual perception is a complex process that unfolds over time. Changes between image pairs viewed in sequence are difficult to detect if a transient (such as a briefly presented blank screen or an eye movement) occurs between the images. This phenomenon has been termed change blindness (see Simons & Rensink, 2005 for a recent review), and it has proven a powerful tool for exploring the nature of the memory trace available to conscious perception. One reason the phenomenon is so compelling is that it runs counter to our conscious (but illusory) experience of a rich, detailed representation of the visual world. It also conflicts with the demonstrated ability of human beings to recognize thousands of images days or weeks after only brief presentations (e.g., Standing et al., 1970). Change blindness experiments, however, have revealed how minimal the memory trace retained between views actually is.

A simple example, adapted for print, is shown below. Images 1a and 2a are shown on this page; images 1b and 2b are shown on the following page. Find the difference between each a/b pair by flipping between the two pages.

We will be doing research this summer to test the hypothesis that ‘implicit change detection’ occurs before an observer is able to consciously report the difference between two images. By monitoring the gaze of observers as they look at the image pairs on a 50” plasma display, we will look for evidence that attention is drawn to the area near the change (e.g., the shadow of the golfer on the left, and the center post next to the car).
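The kind of measure we will extract from the gaze record can be illustrated with a short sketch. Everything below is hypothetical (coordinates, pixels-per-degree conversion, distance threshold) and is not the lab’s actual analysis pipeline; it simply shows the idea of counting fixations that land near the changed region before the observer reports the change.

```python
import numpy as np

def fixations_near_change(fixations, change_box, report_time,
                          margin_deg=2.0, px_per_deg=30.0):
    """Count fixations landing near the changed region before the observer
    reports the change.  `fixations` is a time-ordered list of (x_px, y_px, t_sec)
    tuples; `change_box` is (x_min, y_min, x_max, y_max) in display pixels."""
    margin_px = margin_deg * px_per_deg
    x0, y0, x1, y1 = change_box
    hits = []
    for x, y, t in fixations:
        if t >= report_time:              # only fixations made before the report
            break
        # Distance from the fixation to the nearest point of the change region
        dx = max(x0 - x, 0.0, x - x1)
        dy = max(y0 - y, 0.0, y - y1)
        if np.hypot(dx, dy) <= margin_px:
            hits.append((x, y, t))
    return hits

# Example: did gaze visit the neighborhood of the change before the report at t = 8.2 s?
fix = [(400, 300, 0.5), (910, 520, 2.1), (880, 505, 2.4), (450, 610, 4.0)]
print(fixations_near_change(fix, change_box=(900, 490, 960, 540), report_time=8.2))
```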

Image 1 a)        Image 2 a)

We will also be piloting a new experiment in which observers attempt to identify changes between a real environment and photographs of the environment. For example, they might be asked to examine a room and compare it to a previously viewed image of the room.


2. LCD TV Motion Perception

Liquid Crystal Displays (LCDs) are rapidly gaining market share as computer displays and are poised to do the same in the consumer television market. The temporal response of LCDs, however, is slow compared to that of traditional cathode-ray tube (CRT) displays. Unlike CRTs, where image display is a function of guided electron beams and phosphors with very rapid rise and decay rates, changing the color and/or brightness of any single pixel in a Liquid Crystal Television (LCTV) requires physical movement of the liquid crystal material in the red, green, and blue sub-pixel cavities. While the technology has advanced rapidly, image changes are still relatively slow compared to traditional CRT TVs. LCTV manufacturers are integrating real-time image processing into displays to try to mitigate the slow response (a toy numerical sketch of this behavior appears at the end of this section). Optimal design of the algorithms requires more than an understanding of the dynamics of the display hardware; we also need to understand the behavior of observers as they watch moving images.

The second project involves monitoring the eye movements of observers as they watch a range of motion sequences on CRT and LCTV displays. Sequences will include everything from small dots against a uniform background to complex motion sequences. Analysis of the eye movement records is complex and time consuming.

Typical Day: Interns will be involved in a wide range of activities, including serving as observers in a range of visual tasks and helping in the design, execution, and analysis of experiments.

Useful Skills/Talents/Interests: Work in the Visual Perception Laboratory is interdisciplinary. Interns will work with faculty and students (graduate and undergraduate) from the Center for Imaging Science and the Department of Psychology, and with interns in the Imaging Science Summer High-School Intern program. Successful applicants might have interests ranging from systems design to visual perception.
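As a rough illustration of why the response matters and how real-time compensation (often called overdrive) works, here is a toy first-order model of a liquid-crystal transition. The time constant, frame time, and drive values are illustrative assumptions, not measurements of any display described here.

```python
import numpy as np

def lc_settling_time(tau_ms, threshold=0.9):
    """Time for a first-order (exponential) luminance transition to cover a given
    fraction of its step.  In this toy model the fractional settling time is
    independent of step size; real LC cells have level-dependent, asymmetric
    rise and fall times."""
    return -tau_ms * np.log(1.0 - threshold)

def overdrive_level(start, target, tau_ms, frame_ms=16.7):
    """Drive value that makes a first-order cell land on `target` within one frame,
    illustrating the idea behind overdrive-style real-time processing."""
    alpha = 1.0 - np.exp(-frame_ms / tau_ms)   # fraction of the step covered in one frame
    return start + (target - start) / alpha    # request past the target to arrive on time

print(lc_settling_time(tau_ms=12.0))                        # ~27.6 ms to reach 90% of a step
print(overdrive_level(start=64, target=128, tau_ms=12.0))   # ~149: overshoot the requested level
```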

Image 1 b)        Image 2 b)

Instrumentation: The following describes some of the instrumentation used in the VPL, from off-the-shelf eyetrackers and CRT, LCD, and plasma displays to systems designed and built at RIT.


Laboratory-based eyetrackers:

Figure 1 ASL Series 500 eyetracker system (left) and Model 501 headband-mounted optics (right)

The Applied Science Laboratories (ASL) Series 500 infrared, video-based eyetracker shown in Figure 1 is used with the Model 501 headband-mounted optics, also shown in the figure. The ASL provides a 60–240 Hz data stream indicating the observer’s point of regard in eye-in-head coordinates, and a 60 Hz video record of the viewer’s point of regard superimposed over the field of view (see Figure 2).

Figure 2

Eye/Head Gaze Integration: The data stream from the eyetracker provides eye position in a head-based reference frame. To determine the point of gaze in world coordinates, the position and orientation of the head are also tracked. Using the Polhemus Fastrak magnetic-field head tracking (MHT) system shown in Figure 3, it is possible to monitor an observer’s effective gaze point in world coordinates. The eye-in-head orientation and the head position and orientation are integrated to provide the final gaze vector. The point of regard in image coordinates can be determined when the position and orientation of the display are known with respect to the MHT transmitter.


Figure 3 Polhemus Fastrak magnetic-field head tracker
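In outline, the integration rotates the eye-in-head gaze vector into world coordinates using the head orientation from the MHT, then intersects the resulting ray with the display plane. The sketch below is a minimal geometric illustration; the variable names, coordinate conventions, and example numbers are assumptions, not those of the actual integration software.

```python
import numpy as np

def gaze_on_display(head_pos, head_R, eye_in_head_dir, plane_point, plane_normal):
    """Intersect the world-coordinate gaze ray with a planar display.
    head_pos:        3-vector, eye/head position in MHT transmitter coordinates
    head_R:          3x3 head-to-world rotation matrix from the head tracker
    eye_in_head_dir: unit gaze direction in head coordinates from the eyetracker
    plane_point, plane_normal: a point on the display plane and its normal,
                     measured in the same world frame."""
    gaze_dir = head_R @ np.asarray(eye_in_head_dir, dtype=float)  # gaze vector in world coordinates
    n = np.asarray(plane_normal, dtype=float)
    denom = gaze_dir @ n
    if abs(denom) < 1e-9:
        return None                          # gaze is parallel to the display plane
    t = ((np.asarray(plane_point, dtype=float) - head_pos) @ n) / denom
    if t < 0:
        return None                          # display is behind the observer
    return head_pos + t * gaze_dir           # world-coordinate point of regard

# Example: observer 0.8 m in front of a display centered at the origin, facing +z
d = np.array([0.05, 0.02, -1.0]); d /= np.linalg.norm(d)
print(gaze_on_display(np.array([0.0, 0.0, 0.8]), np.eye(3), d,
                      np.zeros(3), np.array([0.0, 0.0, 1.0])))
```

Mapping the resulting world-coordinate point into display pixel coordinates is then a fixed two-dimensional transform determined by the display’s measured position and size.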

ASL Series 504 Remote Eyetracker

The Visual Perception Lab is also equipped with a ‘remote’ eye-imaging camera that allows eye movements to be monitored without the need for head-mounted hardware. The ASL Model 504 60 Hz remote optics system shown in Figure 4 illuminates and images the eye from below the image display. The observer’s head must remain relatively still, but the remote optics system eliminates the need to integrate eye and head position and can be calibrated to provide gaze position in image coordinates. Both systems provide an accuracy of approximately ±0.5°.

Figure 4 Model 504 “remote optics”
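To give a feel for what ±0.5° means on the screen: the on-screen span subtended by a visual angle θ at viewing distance D is 2·D·tan(θ/2). The viewing distance used below is an assumed, illustrative value, not a lab specification.

```python
import math

def visual_angle_to_mm(angle_deg, viewing_distance_mm):
    """On-screen extent subtended by a visual angle at a given viewing distance."""
    return 2.0 * viewing_distance_mm * math.tan(math.radians(angle_deg) / 2.0)

# At an assumed 600 mm viewing distance, the ±0.5° accuracy bound corresponds to a
# 1° circle roughly 10.5 mm across on the display.
print(visual_angle_to_mm(1.0, 600.0))
```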


Image Displays: Image display devices available in the Visual Perception Laboratory include an Apple Cinema LCD Display, a Pioneer 503CMX 50” plasma display, and a Sony G420 high-performance CRT.

Figure 5

RIT Wearable Eyetrackers

In addition to the commercially available eyetracking systems, RIT’s Visual Perception Laboratory continues to innovate in the design, fabrication, and application of novel eyetracking systems. The 2nd-generation RIT Wearable Eyetracker, shown in Figure 6 and Figure 7, provides a valuable new tool for investigating visual perception in a range of natural tasks outside the laboratory.

Figure 6


Figure 7

The wearable system can be used in the laboratory as well; it has proven to be a valuable tool in human factors research. The 3rd-generation RIT tracker, shown in Figure 8, is smaller and lighter still and has opened up an even broader range of natural tasks to investigation, from driving to sports such as squash (see Figure 9).

Figure 8

Figure 9


The 3rd-generation tracker is designed for off-line analysis. Raw video of the eye and the scene is captured using the backpack or portable systems shown in Figure 8, while calibration and analysis are carried out later in the laboratory, making data collection ‘in the field’ more efficient.

Figure 10 Off-line analysis
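One common way to carry out the later calibration step is to fit a low-order polynomial mapping from pupil-camera coordinates to scene-camera coordinates using the recorded calibration fixations, and then apply that mapping frame by frame to the rest of the session. The sketch below is a generic version of this idea, not the specific model used by the RIT tracker software.

```python
import numpy as np

def _design_matrix(pupil_xy):
    """Constant, linear, cross, and quadratic terms of the pupil position."""
    px, py = np.asarray(pupil_xy, dtype=float).T
    return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

def fit_gaze_mapping(pupil_xy, scene_xy):
    """Least-squares fit of a second-order polynomial mapping from pupil-camera
    coordinates to scene-camera coordinates, using calibration fixations."""
    coeffs, *_ = np.linalg.lstsq(_design_matrix(pupil_xy),
                                 np.asarray(scene_xy, dtype=float), rcond=None)
    return coeffs                          # shape (6, 2): one column per scene axis

def apply_gaze_mapping(coeffs, pupil_xy):
    """Map recorded pupil positions to scene-camera gaze estimates."""
    return _design_matrix(pupil_xy) @ coeffs
```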

Visualization: A number of visualization tools are available for representing observers’ gaze patterns. In addition to the video record showing gaze pattern over time, the eye movement patterns can be indicated in image overlays, as shown in Figure 11, or reported in image coordinates, as in Figure 12.

Figure 11


Figure 12
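An overlay of the kind shown in Figure 11 can be produced by drawing the fixation sequence on top of the stimulus frame. The sketch below uses matplotlib; the file name and fixation data are placeholders.

```python
import matplotlib.pyplot as plt
import matplotlib.image as mpimg

def plot_scanpath(image_path, fixations):
    """Overlay a scanpath on a stimulus image: lines show the order of fixations,
    circle size is proportional to fixation duration.
    `fixations` is a list of (x_px, y_px, duration_sec) tuples."""
    img = mpimg.imread(image_path)
    xs, ys, durs = zip(*fixations)
    plt.imshow(img)
    plt.plot(xs, ys, '-', color='yellow', linewidth=1)        # saccade path
    plt.scatter(xs, ys, s=[300 * d for d in durs],
                facecolors='none', edgecolors='red')          # fixations
    plt.axis('off')
    plt.show()

# plot_scanpath('stimulus.png', [(320, 240, 0.35), (610, 180, 0.22), (580, 400, 0.48)])
```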

The gaze patterns can also be represented by indicating the concentration of visual attention in two or three dimensions, as shown in Figure 13.

Figure 13
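One way to compute such a ‘concentration of attention’ representation is to accumulate fixation durations at their image locations and smooth the result with a Gaussian; the array can then be shown as a 2-D heat map or rendered as a 3-D surface. The kernel width below is an assumed value, not a laboratory parameter.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def attention_map(fixations, image_shape, sigma_px=30.0):
    """Duration-weighted fixation density, Gaussian-blurred to approximate the
    spatial extent of foveal attention.  `fixations` is a list of
    (x_px, y_px, duration_sec) tuples; `image_shape` is (rows, cols)."""
    heat = np.zeros(image_shape[:2], dtype=float)
    for x, y, dur in fixations:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < heat.shape[0] and 0 <= xi < heat.shape[1]:
            heat[yi, xi] += dur
    return gaussian_filter(heat, sigma=sigma_px)
```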


It is often desirable to compare different observers’ gaze behavior as they perform the same task. Figure 14 illustrates the first four fixations of ten observers completing a visual-search task.

Figure 14 First through fourth fixations of ten observers (panels labeled “first fixation,” “second fixation,” “third fixation,” and “fourth fixation”)
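One simple way to quantify the agreement visible in Figure 14 is, for each fixation index, the average distance of each observer’s fixation from the group centroid. The measure and data format below are illustrative only.

```python
import numpy as np

def fixation_dispersion(per_observer_fixations, n=4):
    """For each of the first `n` fixations, the mean distance (in pixels) of each
    observer's fixation from the group centroid.  `per_observer_fixations` is a
    list (one entry per observer) of lists of (x_px, y_px) fixation positions."""
    dispersion = []
    for i in range(n):
        pts = np.array([obs[i] for obs in per_observer_fixations if len(obs) > i],
                       dtype=float)
        centroid = pts.mean(axis=0)
        dispersion.append(np.linalg.norm(pts - centroid, axis=1).mean())
    return dispersion      # smaller values = more consistent gaze across observers
```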

Members of the Visual Perception Laboratory at the Rochester Institute of Technology make use of these state-of-the-art instruments and visualization methods to bring a better understanding of observer behavior in a wide range of visual perceptual tasks. More information is available at www.cis.rit.edu/vpl/ and www.cis.rit.edu/pelz/

Jeff B. Pelz, Associate Professor
Chester F. Carlson Center for Imaging Science
College of Science
Rochester Institute of Technology
54 Lomb Memorial Drive
Rochester, NY 14623
[email protected]
585 475-2783