
Page 1: Virtual Environments: Tracking and Interaction


Virtual Environments: Tracking and Interaction

Simon Julier, Department of Computer Science, University College London

http://www.cs.ucl.ac.uk/teaching/VE

Outline

• Problem Statement:
  – Models of Interaction
  – Tracking Requirements

• Tracking Systems:
  – Hardware
  – Sources of errors

• Interaction:
  – Basic interaction
  – Locomotion
  – Selection & Manipulation

Problem Statement

• Problem Statement:
  – Models of Interaction
  – Tracking Requirements

• Tracking Systems:
  – Hardware
  – Sources of errors

• Interaction:
  – Basic interaction
  – Locomotion
  – Selection & Manipulation

Page 2: Virtual Environments: Tracking and Interaction


Tracking and Interaction

(Diagram: the user interacts with a synthetic environment via interface devices and a computer; both user and devices sit in the real environment. Tracking and interaction happen at the interface between the user and the devices.)

Basic Interaction Tasks

• Locomotion or Travel
  – How to effect movement through the space

• Selection
  – How to indicate an object of interest

• Manipulation
  – How to move an object of interest

• Symbolic
  – How to enter text and other parameters

Models of Interaction

• “Extended Desktop” Model
  – The user needs tools to do 3D tasks

• “Virtual Reality” Model
  – The user is using their body as an interface to the world
  – The system responds to everything they do or say

Page 3: Virtual Environments: Tracking and Interaction


Extended Desktop Model

• Focus on analysing a task and creating devices that fit the task

• Study ergonomics of the device and applicability/suitability for the role

Limits of ED Model

• 3D tasks are quite complicated to perform

• Tasks can become very specialised

• Leads to a proliferation of real (and virtual) devices

Fakespace Cubic Mouse

Types of Device

Ascension Wanda

3DConnexion Spaceball

Polhemus Isotrak 3-Ball

Logitech 3D Mouse

3DConnexion Spacemouse

Inition 3DiStick

Page 4: Virtual Environments: Tracking and Interaction


Virtual Reality Model

• Need to track the user precisely and interpret what they do

• Focus is on users exploring the environment

• Tension between magical and mundane responses of the environment
  – Mundane: the world responds as if it were controlled by the laws of physics
  – Magical: everything else (casting spells, automatic doors, etc.)

Limits of VR Model

• Can’t track the user over very large areas
  – E.g. some form of locomotion metaphor will be required for long-distance travel (see later)
• Physical constraints of systems
• Limited precision and limited number of tracking points
• Lack of physical force feedback

Tracking System

• Problem Statement:
  – Models of Interaction
  – Tracking Requirements

• Tracking Systems:
  – Hardware
  – Sources of errors

• Interaction:
  – Basic interaction
  – Locomotion
  – Selection & Manipulation

Page 5: Virtual Environments: Tracking and Interaction


Connection Between Interaction and Tracking

• Irrespective of interaction model, user must be instrumented in some way to convey information to the system

• This is carried out using the tracking system

Requirements for Trackers

• Resolution
  – The smallest change in position the system can detect

• Accuracy
  – The size of the range within which the reported position is correct

• Sample Rate
  – The frequency at which the sensors are checked for new data; the sampling rate must be greater than the data rate

• Data Rate
  – The number of computed positions per second; the higher the rate, the more desirable the system

Requirements for Trackers

• Update rate
  – The rate at which new positions are reported to the host computer

• Lag
  – The delay between a movement being made and the new position being reported

• Range of operation
  – The area/volume in which the tracker can accurately report positions (e.g. distance, height); determined by wire length, signal strength, etc.

Page 6: Virtual Environments: Tracking and Interaction


Requirements for Trackers

• Robustness
  – The ability of the tracker to cope with uncertainty and noise (e.g. water, metal, keys)

• Fitness for tracking multiple objects
  – Ability to independently determine the positions of multiple objects; determined by the design of the system architecture
  – Ability to cope with interference caused by one tracked object on another, for example when one sensor is occluded by another sensor

Types of Tracking Technology

• Many types of tracker are available
  – From ultrasonic consumer devices ($10s) through to very precise mechanical trackers ($100,000s)
  – Not all trackers are suited to all applications
    • E.g. mechanical trackers aren’t that suitable for CAVEs, since you can see the device
  – Cost is still a big problem if you want to track at a fine enough scale for head-tracked virtual reality

The Ideal Tracker

A magical, ideal tracker would have these characteristics:

• Tiny (transistor size)
• Self-contained
• Complete (6 DoF)
• Accurate (1 mm position, 0.1 degree orientation)
• Fast (1000 Hz, <1 ms latency)
• Immune to occlusions (no line-of-sight requirement)
• Robust (no interference)
• No range limitation
• Cheap

Page 7: Virtual Environments: Tracking and Interaction


Tracking Technologies

• 5 main types: mechanical, inertial, acoustic, optical, magnetic
• Most can be classed as:
  – Outside-In: the user emits a signal that indicates their location to the system
  – Inside-Out: the system emits a signal to the user, who senses their own location

(Images: outside-in and inside-out configurations)

Mechanical Trackers

• First & simplest systems

• Use prior knowledge of rigid mechanical pieces and measurements from sensors

• Typically boom-type tracked displays with counterweights.

Mechanical Trackers

• Some example systems

Page 8: Virtual Environments: Tracking and Interaction


Mechanical Trackers

• Pros
  – Accurate
  – Low latency
  – Force feedback
  – No line-of-sight or magnetic interference problems

• Cons
  – Large & cumbersome
  – Limited range

Inertial Trackers

• 3 linear accelerometers measure the acceleration vector
• The vector is rotated into the world frame using the current rotation matrix (orientation), determined by gyroscopes
• Position is then obtained by double integration of the acceleration

Inertial Trackers

• Pros
  – Small (chip form), self-contained
  – Immune to occlusions
  – No interference
  – Low latency (typically <2 ms)
  – High sample rate

• Cons
  – Drift is the show stopper
  – An accelerometer bias of 1 milli-g gives ~4.5 m of drift after 30 s (worked through in the sketch below)
  – Close, but no silver bullet

• High potential as part of hybrid systems…
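
The drift figure quoted above can be reproduced with a few lines of dead reckoning. A minimal sketch, assuming a constant 1 milli-g accelerometer bias, 100 Hz sampling and no other error sources:

```python
import numpy as np

# Sketch of why inertial drift is "the show stopper": dead-reckon position by
# double-integrating acceleration that carries a constant 1 milli-g bias.
# Pure illustration; a real IMU also has gyro errors, noise and scale factors.

G = 9.81                 # m/s^2
bias = 1e-3 * G          # 1 milli-g accelerometer bias
dt = 0.01                # 100 Hz samples
velocity = 0.0
position = 0.0

for step in range(int(30.0 / dt)):      # integrate for 30 seconds
    velocity += bias * dt               # bias accumulates into velocity...
    position += velocity * dt           # ...and then into position

print(round(position, 2), "m of drift after 30 s")   # ~4.4 m, close to the 4.5 m on the slide
```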

Page 9: Virtual Environments: Tracking and Interaction


Acoustic Trackers

• Uses sound waves for transmission and sensing
• Involves emitting pulses at intervals
• SONAR is the best-known example: position is determined from the time of flight of a pulse (see the trilateration sketch below)
• Uses ultrasound
• Typically Outside-In (microphone sensors in the environment)
• Examples: Logitech Acoustic Tracker, Samba De Amigo maracas
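
A minimal sketch of the time-of-flight idea, assuming four microphones at known positions, a known speed of sound and synchronised pulse timing; all positions and values are illustrative:

```python
import numpy as np

# Time-of-flight trilateration: ranges to four microphones give the emitter position.
mics = np.array([[0.0, 0.0, 0.0],
                 [1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])     # microphone positions (metres)
c = 343.0                              # speed of sound in air, m/s (temperature dependent)

def locate(times_of_flight):
    """Estimate the emitter position from pulse times of flight (seconds)."""
    d = c * np.asarray(times_of_flight)              # ranges to each microphone
    # Linearise |x - p_i|^2 = d_i^2 by subtracting the first equation:
    # 2 (p_i - p_0) . x = |p_i|^2 - |p_0|^2 + d_0^2 - d_i^2
    A = 2.0 * (mics[1:] - mics[0])
    b = (np.sum(mics[1:]**2, axis=1) - np.sum(mics[0]**2) + d[0]**2 - d[1:]**2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Example: an emitter at (0.3, 0.4, 0.2) produces these times of flight
truth = np.array([0.3, 0.4, 0.2])
tofs = np.linalg.norm(mics - truth, axis=1) / c
print(locate(tofs))   # ~ [0.3, 0.4, 0.2]
```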

Acoustic Trackers

• Pros
  – Very small, so can be worn
  – Line of sight less of an issue than with optical systems
  – Better range than mechanical systems

• Cons
  – Size proportional to range
  – Environmental considerations (temperature, humidity)
  – Acoustic issues can cause a slow update rate (10 Hz) and high latency (5-100 ms)
  – Attenuation at the desirable high frequencies (which reduce interference)
  – Jingling of keys causes interference

Magnetic Trackers

• Measures the magnetic field
• Can be done by magnetometers (for DC fields) or by the current induced in a coil by an electromagnetic field (for AC)
• 3 sensors arranged orthogonally produce a 3D vector
• In tracking, a multi-coil source unit has each coil energised (excited) in turn; the measured responses give position and orientation
• Compass: uses the earth’s naturally occurring DC magnetic field to determine heading; can be used here
• Example: Ascension SpacePad

Page 10: Virtual Environments: Tracking and Interaction


Magnetic Trackers

• Pros
  – User-worn component is small
  – No line-of-sight issues (magnetic fields pass through the body)
  – One source unit can excite many sensor units
  – Very low latency (~5 ms)
  – Ability to track multiple users using a single source unit

• Cons
  – Field distortions (foreign objects, the natural magnetic field)
    • Requires some compensation
  – Jingling of keys (or anything magnetically conductive)
  – Need to wait for the excitation of each coil to subside before the next one, so the update rate is slow
  – Jitter increases with distance between emitter and sensor

Optical Trackers

• Measures reflected or emitted light
• Involves a source (active or passive) and a sensor
• Sensors can be analogue or digital
• Photo-sensing (light intensity) or image-forming (CCD)
• Triangulation with multiple sensors
• Can be either outside-in or inside-out

Optical Trackers

• Pros
  – Analogue sensors with an active light source give a high update rate and spatial precision
  – Passive targets with image-forming sensors can be used in an unmodified environment
  – Image-forming sensors provide closed-loop feedback of the real environment and the tracker

• Cons
  – Line of sight is critical
  – The target’s orientation is harder to determine

Page 11: Virtual Environments: Tracking and Interaction


Hybrid Trackers

• No single solution that suits all applications

– Many different approaches, each with advantages and limitations

– Can address the limitations by building hybrid systems which combine the advantages of each approach

• Inertial sensors have provided the basis for several successful hybrid systems due to their advantages

• Example: the VisTracker uses an opto-inertial hybrid

Hybrid Tracking Algorithms

• Hybrid tracking is an example of a data fusion algorithm:
  – Information from a set of disparate modalities
  – Fused together to provide a consistent estimate

• The most common implementation is to use a Kalman filter

Kalman Filtering

• The Kalman filter is a recursive minimum mean squared error estimator

• It uses a predict-update cycle:

• This makes it possible to combine lots of types of information in an asynchronous manner

(Diagram: Initialise → Predict → Update, with Predict and Update repeating in a cycle)
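
For reference, the predict–update cycle corresponds to the standard linear Kalman filter equations (state estimate x̂, covariance P, state-transition model F, process noise Q, measurement model H, measurement noise R, measurement z):

```latex
\text{Predict:}\quad \hat{x}_{k|k-1} = F_k\,\hat{x}_{k-1|k-1}, \qquad
P_{k|k-1} = F_k P_{k-1|k-1} F_k^{\top} + Q_k

\text{Update:}\quad K_k = P_{k|k-1} H_k^{\top}\left(H_k P_{k|k-1} H_k^{\top} + R_k\right)^{-1}, \qquad
\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k\left(z_k - H_k\,\hat{x}_{k|k-1}\right), \qquad
P_{k|k} = \left(I - K_k H_k\right) P_{k|k-1}
```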

Page 12: Virtual Environments: Tracking and Interaction


Fusing Multiple Measurements

(Timeline diagram: a camera measurement at time t triggers an update; the filter then predicts forward using the motion model to t+50 ms, where an inertial measurement triggers another update; a further prediction carries the state to t+100 ms for the next camera update.)
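
A minimal sketch of this asynchronous fusion pattern, assuming a 1-D constant-velocity state, a camera that observes position and (as a simplification) an inertial branch that supplies a velocity measurement; all names and noise values are illustrative:

```python
import numpy as np

x = np.array([0.0, 0.0])        # state: [position (m), velocity (m/s)]
P = np.eye(2)                   # state covariance
q = 0.5                         # process noise spectral density

def predict(dt):
    """Propagate the state forward by dt seconds with a constant-velocity model."""
    global x, P
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])
    x = F @ x
    P = F @ P @ F.T + Q

def update(z, H, R):
    """Fuse one measurement z with model H and noise R."""
    global x, P
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

H_cam = np.array([[1.0, 0.0]])   # camera observes position
H_imu = np.array([[0.0, 1.0]])   # simplified inertial branch observes velocity
R_cam = np.array([[0.01]])
R_imu = np.array([[0.05]])

# Measurements as (time in seconds, value, H, R), processed in time order
measurements = [(0.00, np.array([0.10]), H_cam, R_cam),
                (0.05, np.array([1.00]), H_imu, R_imu),
                (0.10, np.array([0.21]), H_cam, R_cam)]

t = 0.0
for t_meas, z, H, R in measurements:
    predict(t_meas - t)          # predict forward to the measurement time
    update(z, H, R)
    t = t_meas
print(x, P)
```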

Hybrid Trackers

• InterSense IS-900
  – Tracking system for VR walkthrough applications
  – Inertial (orientation & position) and ultrasonic (drift correction) hybrid tracker with highly accurate 6-degree-of-freedom tracking over a wide area
  – Features fast updates, low latency, filtering to reduce jitter, and advanced prediction algorithms to reduce latency, giving very smooth and precise tracking
  – Four tracked devices: a head tracker, a hand tracker, a wand (with four buttons and an integrated joystick) and a stylus (with two buttons)

Tracking Errors

• Static tracked object
  – Misalignment
  – Spatial distortion (inaccuracy)
  – Spatial jitter (noise)
  – Creep

• Dynamic tracked object
  – Lag (time delay; a complex function of the tracker and its subsystems)
  – Latency jitter (variations in latency)
  – Dynamic errors (other inaccuracies, e.g. from prediction algorithms)

Page 13: Virtual Environments: Tracking and Interaction


Tracking Errors of < 1 Degree Are Noticeable

Misalignment

(Diagram: coordinate frames W, B, S and M, with the transformations WM, WB, BS and SM between them)

• Referentials:
  – W: world
  – B: base (referential) of the tracker
  – S: sensor of the tracker
  – M: display (manipulator)

• Transformation (pose) AB:
  – The transformation that maps referential A into B
  – The pose of B with respect to A
  – A 4x4 homogeneous transformation matrix
  – AB = (BA)^-1 and AB = AC · CB

WM = WB · BS · SM
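
A small numerical sketch of composing the pose chain; the rotation/translation values and the helper `pose()` are purely illustrative:

```python
import numpy as np

# Compose the pose chain WM = WB . BS . SM using 4x4 homogeneous transforms.

def pose(rotation_z_deg, translation):
    """Build a 4x4 homogeneous transform from a z-rotation and a translation."""
    a = np.radians(rotation_z_deg)
    T = np.eye(4)
    T[:3, :3] = np.array([[np.cos(a), -np.sin(a), 0.0],
                          [np.sin(a),  np.cos(a), 0.0],
                          [0.0,        0.0,       1.0]])
    T[:3, 3] = translation
    return T

WB = pose(10.0, [2.0, 0.0, 1.5])   # tracker base in the world frame
BS = pose(-5.0, [0.1, 0.3, 0.0])   # sensor in the tracker-base frame
SM = pose( 0.0, [0.0, 0.0, 0.05])  # display relative to the sensor

WM = WB @ BS @ SM                  # display pose in the world frame
MW = np.linalg.inv(WM)             # and its inverse: MW = (WM)^-1

# Misalignment: an error in any one link (e.g. a miscalibrated SM) propagates
# directly into the world-space pose of the display.
print(WM)
```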

Spatial Distortion

• Repeatable errors at different poses in the tracking volumes

• Many factors including incorrect calibration and persistent environmental disturbances

Page 14: Virtual Environments: Tracking and Interaction


Spatial Jitter

• These are caused by noise in the sensors

• Even with same noise on sensors, the jitter on pose estimates can change with the pose

• Hybrid sensors can improve the performance

“A General Method for Comparing the Expected Performance of Tracking and Motion Capture Systems” - Danette Allen, Greg Welch

Creep

• Slow but steady changes in tracker output over time

• Caused by temperature drift or other similar “start up” transients in a system

Measurements from a stationary gyro. “Evaluation of a Solid State Gyroscope for Robotics Applications” - Barshan and Durrant-Whyte

System Latency

• Mine, M. Characterization of end-to-end delays in head-mounted displays. Tech. Rep. TR93-001, Department of Computer Science, The University of North Carolina, 1993.

• Definition: end-to-end delay
  – The total time required for the image displayed by the HMD to change in response to a movement of the user’s head

Page 15: Virtual Environments: Tracking and Interaction


Delays in HMD Pipeline

• Tracking system, comprising:
  – Physical sensing
  – Filtering on the tracking device
  – Transmission delays (RS232, Ethernet, etc.)

• Application delay
  – Collision detection, interaction events, etc.

• Image generation
  – At roughly the display refresh rate

• Display system
  – Time taken to transfer and display an image

from Mine (1993)

Measuring delay

• Mine constructed a system to measure delay in HMD systems
  – Measurement at several points in the pipeline

(Pipeline: Tracking → Application → Image generation → Display, with timestamps Tstart, Treport, Tdisplay and Tdisplay + 17 ms)

Page 16: Virtual Environments: Tracking and Interaction


Measuring delay

from Mine (1993)

Results

• Tracking delays
  – Best had delays of ~10 ms; worst of ~60 ms
  – More tracked units implies longer delay

• Application/image generation
  – 55 ms on average, although the application delay itself was minimal

• Display system delay
  – NTSC has a delay of 16.67 ms
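
Summing Mine’s reported components gives the rough order of magnitude of the end-to-end delay (the stages partly overlap and pipeline in practice, so this is only an approximation):

```latex
t_{\text{end-to-end}} \;\approx\; t_{\text{tracking}} + t_{\text{app/image}} + t_{\text{display}}
\;\approx\; 10 + 55 + 16.7 \;\approx\; 82\ \text{ms (best)}
\qquad\text{or}\qquad 60 + 55 + 16.7 \;\approx\; 132\ \text{ms (worst)}
```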

Tracking Summary

• Quite a complex and challenging problem
  – No real ideal solution

• Several tracking technologies exist, with different levels of suitability depending on the application; all of them have both pros and cons
  – The ultimate tracker will probably not be developed from a single technology, but as a hybrid of these technologies

• A VR application should provide the following:
  – High data rates for accurate mapping without lag
  – High tolerance to environmentally induced errors
  – Consistent registration between the physical and virtual environments
  – Good sociability, so that multiple users can move freely

Page 17: Virtual Environments: Tracking and Interaction


Interaction

• Problem Statement:
  – Models of Interaction
  – Tracking Requirements

• Tracking Systems:
  – Hardware
  – Sources of errors

• Interaction:
  – Basic interaction
  – Locomotion
  – Selection & Manipulation

Basic Interaction Tasks

• Locomotion or Travel
  – How to effect movement through the space

• Selection
  – How to indicate an object of interest

• Manipulation
  – How to move an object of interest

• Symbolic
  – How to enter text and other parameters

Direct Locomotion

• User walks from one part of the environment to another

• Intuitive, easy to use
• Requires a great deal of space

Page 18: Virtual Environments: Tracking and Interaction


Constrained Walking

• The user walks, but motion is constrained
  – VirtuSphere
  – Treadmills

• However, most forms can be very difficult to use
  – Mismatch in perceptual cues
  – Dynamics / inertia of the device make it hard to navigate effectively

CirculaFloor

• Floor consists of a set of movable tiles

• As the user walks forwards, tiles move in front of the user’s feet to allow near infinite movement

CirculaFloor

(Video demonstration)

Page 19: Virtual Environments: Tracking and Interaction


Walking-in-Place

• The user “walks in place”
• Movement is detected by gait analysis
• No perceptual mismatch

Redirected Walking in the CAVE

• Problems with walking in the CAVE:
  – You eventually hit the walls
  – You can turn and see the missing back wall

• One means of countering this is to rotate the environment
  – The user is redirected back towards the front wall

Redirected Walking in the CAVE

• Apply a small rotation to the scene to cause the user to turn towards the centre (as sketched below)
  – Sufficiently small that it is not consciously noticed
  – The subject responds to maintain balance

• Increase the rate when the user is navigating or rapidly turning their head

• Results:
  – The variance in the number of times the user saw the back wall decreased
  – Rates of simulator sickness were not increased
  – Some users did not notice the rotation
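
A minimal sketch of injecting an imperceptible per-frame rotation, increased while the head is turning; the gains and thresholds are illustrative assumptions, not the values used in the published study:

```python
import numpy as np

# Redirected walking sketch: add a small extra yaw rotation to the virtual scene
# each frame, scaled up while the user is turning their head, so the redirection
# stays below the perceptual threshold and steers them back towards the front wall.

BASELINE_GAIN = np.radians(0.1)      # rad/s of injected rotation when the head is still
TURN_GAIN = 0.05                     # extra injected rotation per rad/s of head turn

def redirection_yaw(head_yaw_rate, dt, user_yaw, target_yaw=0.0):
    """Return the extra scene yaw (radians) to apply this frame."""
    direction = np.sign(target_yaw - user_yaw)          # steer back towards target_yaw
    rate = BASELINE_GAIN + TURN_GAIN * abs(head_yaw_rate)
    return direction * rate * dt

# Example frame: user facing 30 degrees off-centre, turning their head at 45 deg/s, 60 Hz
delta = redirection_yaw(head_yaw_rate=np.radians(45), dt=1/60, user_yaw=np.radians(30))
print(np.degrees(delta))   # a small fraction of a degree per frame
```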

Page 20: Virtual Environments: Tracking and Interaction


Basic Interaction Tasks

• Locomotion or Travel
  – How to effect movement through the space

• Selection
  – How to indicate an object of interest

• Manipulation
  – How to move an object of interest

• Symbolic
  – How to enter text and other parameters

Locomotion

• User points (somehow) in the direction of motion

• User presses a button
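
A minimal sketch of pointing-directed locomotion, assuming the hand pose arrives as a 4x4 homogeneous transform and movement at a simple fixed speed; names are illustrative:

```python
import numpy as np

SPEED = 2.0   # metres per second while the button is held

def update_viewpoint(viewpoint_pos, hand_pose, button_pressed, dt):
    """Advance the viewpoint along the hand's pointing direction."""
    if not button_pressed:
        return viewpoint_pos
    forward = hand_pose[:3, 2]                  # hand z-axis taken as pointing direction
    forward = forward / np.linalg.norm(forward)
    return viewpoint_pos + SPEED * dt * forward

# Example: hand pointing along the world x-axis
hand_pose = np.eye(4)
hand_pose[:3, :3] = np.array([[0, 0, 1],
                              [0, 1, 0],
                              [-1, 0, 0]])      # hand z-axis = world +x
pos = np.array([0.0, 0.0, 0.0])
pos = update_viewpoint(pos, hand_pose, button_pressed=True, dt=1/60)
print(pos)   # moved ~3.3 cm along +x in one frame
```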

Selection and Manipulation

• The user points at an object with their hand
• The user selects it by pressing a button
• The user grabs it by pressing a 2nd button
  – The object is rigidly attached to the hand coordinate system (see the sketch below)
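
A minimal sketch of the grab step, assuming 4x4 homogeneous poses: on grab the object’s pose is cached in the hand frame, and while held it is re-derived from the current hand pose so the object moves rigidly with the hand.

```python
import numpy as np

def on_grab(hand_pose_world, object_pose_world):
    """Record the object's pose relative to the hand at grab time."""
    return np.linalg.inv(hand_pose_world) @ object_pose_world   # object in hand frame

def while_held(hand_pose_world, object_in_hand):
    """Recompute the object's world pose as the hand moves."""
    return hand_pose_world @ object_in_hand

# Example: grab, then translate the hand 10 cm along x; the object follows rigidly.
hand0 = np.eye(4); obj0 = np.eye(4); obj0[:3, 3] = [0.5, 0.0, 0.0]
grabbed = on_grab(hand0, obj0)
hand1 = np.eye(4); hand1[:3, 3] = [0.1, 0.0, 0.0]
print(while_held(hand1, grabbed)[:3, 3])   # -> [0.6, 0., 0.]
```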

Page 21: Virtual Environments: Tracking and Interaction


Selection Only

• Occlusion selection

• Similar to selection with a mouse

• Put hand over object (occlude it) to select it
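
A minimal sketch of occlusion selection, assuming objects are approximated by bounding spheres and the selection ray runs from the eye through the tracked hand; names and the scene are illustrative:

```python
import numpy as np

def occlusion_select(eye, hand, objects):
    """objects: list of (name, centre, radius). Returns the nearest occluded object."""
    d = hand - eye
    d = d / np.linalg.norm(d)                    # ray direction: eye -> hand
    best = None
    for name, centre, radius in objects:
        oc = centre - eye
        t = np.dot(oc, d)                        # closest approach along the ray
        if t < 0:
            continue                             # object is behind the eye
        dist2 = np.dot(oc, oc) - t * t           # squared ray-to-centre distance
        if dist2 <= radius * radius and (best is None or t < best[1]):
            best = (name, t)
    return best[0] if best else None

eye = np.array([0.0, 1.6, 0.0])
hand = np.array([0.1, 1.5, -0.5])
objects = [("teapot", np.array([0.4, 1.2, -2.0]), 0.2),
           ("chair",  np.array([-1.0, 0.5, -3.0]), 0.5)]
print(occlusion_select(eye, hand, objects))      # -> "teapot"
```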

Locomotion

• Travel in Immersive Virtual Environments: An Evaluation of Viewpoint Motion Control Techniques, Bowman, Koller and Hodges

• One of the first rigorous studies of some of the trade-offs between different travel techniques

Taxonomy of Travel

Bowman, Koller and Hodges

Page 22: Virtual Environments: Tracking and Interaction


Quality Factors

1. Speed (appropriate velocity)
2. Accuracy (proximity to the desired target)
3. Spatial Awareness (the user’s implicit knowledge of their position and orientation within the environment during and after travel)
4. Ease of Learning (the ability of a novice user to use the technique)
5. Ease of Use (the complexity or cognitive load of the technique from the user’s point of view)
6. Information Gathering (the user’s ability to actively obtain information from the environment during travel)
7. Presence (the user’s sense of immersion or “being within” the environment)

Experiment 1

• Absolute motion task
  – Gaze- vs. point-directed steering AND constrained vs. unconstrained motion

• Note the immediate trade-offs between pointing and gaze
  – Bowman expected gaze to be better:
    • Neck muscles are more stable
    • More immediate feedback

• Eight subjects, each doing four blocks of 80 trials (5 repetitions x 4 target distances x 4 target sizes)

Experiment 1

• No difference between techniques

• Significant factors were target distance and size

Bowman, Koller and Hodges

Page 23: Virtual Environments: Tracking and Interaction



Experiment 2

• Relative motion task
• No prior expectation
  – Though there is an obvious one
• Needs both forward and reverse motion
• Nine subjects, four sets of 20 trials

Bowman, Koller and Hodges

Experiment 2

• Obvious difference
• You can’t point at the target and look at the departure point simultaneously

Bowman, Koller and Hodges

Page 24: Virtual Environments: Tracking and Interaction


Summary of 1st Two Experiments

Bowman, Koller and Hodges

Experiment 3

• Testing spatial awareness based on four travel variations
  – Constant speed (slow)
  – Constant speed (fast)
  – Variable speed (smooth acceleration)
  – Jump (instant translation)

• The concern is that jumps and other fast transitions confuse users

Experiment 3

• However, there was no main effect

• This is still worth further study

Bowman, Koller and Hodges

Page 25: Virtual Environments: Tracking and Interaction


Other Locomotion Techniques

• Direct walking
• Constrained movement
• Redirected walking

Selection and Manipulation

• Moving Objects In Space: Exploiting Proprioception In Virtual-Environment Interaction, Mine, Brooks Jr. and Sequin

• One of the first papers to discuss a range of selection and manipulation tasks

Body-Relative Interaction

• Provides
  – A physical frame of reference in which to work
  – A more direct and precise sense of control
  – “Eyes-off” interaction

• Enables
  – Direct object manipulation (a sense of the position of the object)
  – Physical mnemonics (objects fixed relative to the body)
  – Gestural actions (invoking commands)

Page 26: Virtual Environments: Tracking and Interaction


Working within Arm’s Reach

• Takes advantage of proprioception
• Provides a direct mapping between hand motion and object motion
• Provides finer angular precision of motion

Ray-Based Interaction

• Ray-based
  – The ray is centred on the user’s hand
  – All manipulations are relative to hand motion

• Translation in the beam direction is hard
• Rotation in local object coordinates is nearly impossible

Mark Mine, http://www.cs.unc.edu/~mine/isaac.html

Object-Centred Interaction

• Object-centred
  – Select with a ray as before

• Local movements of the hand are copied into the object’s local coordinates

Mark Mine, http://www.cs.unc.edu/~mine/isaac.html

Page 27: Virtual Environments: Tracking and Interaction


Go-Go Hand Interaction

• Arm stretches to reach object

• Amplifies local movements

Stretch Go-Go Hand Technique, Bowman & Hodges, based on the Go-Go technique of Poupyrev, Billinghurst, Weghorst and Ichikawa
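
The Go-Go mapping is usually described as linear within a threshold distance and quadratic beyond it; a minimal sketch with illustrative constants:

```python
# Go-Go non-linear arm mapping (after Poupyrev et al.): within a threshold
# distance D the virtual hand matches the real hand; beyond it the virtual arm
# grows quadratically. D and K below are illustrative, not the published values.

D = 0.4   # metres: within this range the mapping is 1:1
K = 6.0   # gain of the quadratic stretch

def virtual_hand_distance(real_distance):
    """Map the real hand's distance from the body to the virtual hand's distance."""
    if real_distance < D:
        return real_distance
    return real_distance + K * (real_distance - D) ** 2

for r in (0.2, 0.4, 0.6, 0.8):
    print(r, "->", round(virtual_hand_distance(r), 2))
# 0.2 -> 0.2, 0.4 -> 0.4, 0.6 -> 0.84, 0.8 -> 1.76
```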

World in Miniature (WIM) Interaction

• A smaller version of the world is created and superimposed on the full-scale world
• The user controls the WIM using a handheld ball
• The user can interact with the environment by selecting either the full-scale object or its copy in the WIM

World in Miniature, Stoakley and Pausch

Scaled-World Grab

• Automatically scale the world so that the selected object is within arm’s reach
  – Near and far objects are easily moved
  – The user doesn’t always notice the scaling
  – Dramatic effects with slight head movement
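
A minimal sketch of the scaling step, assuming the world is scaled uniformly about the head so the selected object lands at a fixed arm-reach distance; names and constants are illustrative:

```python
import numpy as np

ARM_REACH = 0.6   # metres

def scaled_world_grab(head_pos, object_pos, point):
    """Return where a world point ends up after scaling the world about the head."""
    s = ARM_REACH / np.linalg.norm(object_pos - head_pos)   # uniform scale factor
    return head_pos + s * (point - head_pos)

head = np.array([0.0, 1.7, 0.0])
obj = np.array([0.0, 1.7, -6.0])          # selected object 6 m away
print(scaled_world_grab(head, obj, obj))  # object now 0.6 m in front of the head
```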

Page 28: Virtual Environments: Tracking and Interaction


Mine, Brooks Jr, Sequin

Scaled-World Grab for Locomotion

• User transports himself by grabbing an object in the desired travel direction and pulling himself towards it

• User can view the point of interest from all sides very simply

• For exploring nearby objects, virtual walking is more suitable; for travelling much further, invoking a separate scaling operation or switching to an alternative movement mode is better

Physical Mnemonics

• Storing virtual objects and controls relative to the user’s body:
  1. Pull-down menus
  2. Hand-held widgets
  3. Field-of-view-relative mode switching

Page 29: Virtual Environments: Tracking and Interaction


Pull-Down Menus

• Problems with virtual menus
  – Heads-up menus are difficult to manage
  – Menus fixed in the world often get lost

• Could activate a menu with...
  – A virtual button (too small)
  – A physical button (low acceptability)

• Instead, “hide” menus around the body, e.g. just above the field of view

Hand-Held Widgets

• Hold controls in hands, rather than on objects

• Use relative motion of the hands to effect widget changes

Mine, Brooks Jr, Sequin

FOV-Relative Mode Switching

• Change behaviour depending on whether a limb is visible
  – Hand visible: use occlusion selection
  – Hand not visible: use ray selection

Page 30: Virtual Environments: Tracking and Interaction


Gestural Actions

• Head-butt zoom
• Look-at menus
• Two-handed flying
• Over-the-shoulder deletion

Mine, Brooks Jr, Sequin

Experiment 1

• Align docking cube with target cube as quickly as possible

• Comparing three manipulation techniques
  – Object in hand
  – Object at a fixed distance
  – Object at a variable distance (scaled by arm extension)

Experiment 1

• 18 subjects
• In hand was significantly faster

Mine, Brooks Jr, Sequin

Page 31: Virtual Environments: Tracking and Interaction


Experiment 2

• Virtual widget comparison
• Comparing
  – Widget in hand
  – Widget fixed in space
• 18 subjects (as before)
• Performance measured by accuracy, not time

Experiment 2

• Widget in hand was significantly better

Mine, Brooks Jr, Sequin

Putting it All Together

(Video demonstration)

Page 32: Virtual Environments: Tracking and Interaction


Summary

• Tracking systems provide a way to model the user (VR model) or to provide direct input to a control system (ED model)

• A lot of work has been done, and is being done, in 3D interaction
  – We covered locomotion and selection & manipulation

• However, it is still quite tedious to use most 3D user interfaces
  – Lack of precision is probably the main problem

• Nevertheless, people are able to interact