
Vision for Mobile Robot Navigation: A Survey (February 2002)
Guilherme N. DeSouza & Avinash C. Kak

Presentation by: Job Zondag, 27 February 2009

Outline: Types of Navigation

Navigation
- Indoor
  - Map-Based (Structured)
    - Absolute localization
    - Incremental localization
    - Landmark tracking
  - Map-Building
  - Mapless (Unstructured)
    - Optic flow
    - Appearance based
    - Object recognition
- Outdoor
  - Structured
  - Unstructured


Indoor Navigation: Map-Based Navigation

- The vision system needs to incorporate some knowledge of what the robot is supposed to see:
  - CAD models (geometrical maps)
  - occupancy maps
  - VFF: Virtual Force Fields
  - topological maps
  - sequences of images
- Vision-based localization steps:
  - acquire sensory information
  - detect landmarks
  - establish matches between observation and expectation
  - calculate position

Indoor Navigation: Map-Based Navigation

- Absolute or global localization: the robot's initial pose is unknown.
- Incremental localization: the robot's initial pose is approximately known; the goal is to refine the location coordinates.
- Landmark tracking: keep track of landmarks in the consecutive images recorded as the robot moves.

Absolute or Global Localization

- Atiya and Hager (1993)

Incremental Localization: Geometrical Representation of Space

- Initial position is approximately known
- Keep updating the (uncertainty in the) position of the robot
- FINALE, Kosaka & Kak (1992)
  - geometrical representation of space
  - statistical model of uncertainty in the location of the robot (Gaussian distribution)

Incremental Localization: Geometrical Representation of Space

- Using a geometrical representation of space
- Propagation of positional uncertainty through commanded motions (see the sketch below)
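As an illustration of how positional uncertainty can grow with each commanded motion, here is a minimal sketch in the spirit of FINALE's Gaussian uncertainty model; the motion model, noise values, and variable names are assumptions for illustration, not taken from the paper:

```python
import numpy as np

# Robot pose (x, y, heading) and its covariance (Gaussian uncertainty model).
pose = np.array([0.0, 0.0, 0.0])
cov = np.diag([0.01, 0.01, 0.005])          # initial positional uncertainty

def propagate(pose, cov, distance, d_theta, motion_noise=np.diag([0.02, 0.02, 0.01])):
    """Propagate the pose and its covariance through one commanded motion."""
    x, y, theta = pose
    new_pose = np.array([x + distance * np.cos(theta),
                         y + distance * np.sin(theta),
                         theta + d_theta])
    # Jacobian of the motion model with respect to the pose.
    F = np.array([[1.0, 0.0, -distance * np.sin(theta)],
                  [0.0, 1.0,  distance * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    # Uncertainty grows: old covariance pushed through the motion, plus motion noise.
    new_cov = F @ cov @ F.T + motion_noise
    return new_pose, new_cov

pose, cov = propagate(pose, cov, distance=1.0, d_theta=0.1)
print(np.trace(cov))   # overall uncertainty increases after each commanded motion
```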

Incremental Localization: Geometrical Representation of Space

- Projecting the robot's positional uncertainty into the camera image
- Kalman filtering (a minimal update step is sketched below)
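To complement the propagation step above, the sketch below shows a generic Kalman measurement update that shrinks the pose covariance once a predicted landmark is matched in the image. It is a textbook update, not FINALE's actual formulation, and the observation model and noise values are assumptions for illustration:

```python
import numpy as np

def kalman_update(pose, cov, z, h, H, R):
    """Generic Kalman update: fuse a landmark observation z with the prediction h(pose).

    pose, cov : current pose estimate and covariance
    z         : observed landmark position in the image (e.g. pixel coordinates)
    h         : predicted observation for the current pose estimate
    H         : Jacobian of the observation model with respect to the pose
    R         : measurement noise covariance
    """
    S = H @ cov @ H.T + R                          # innovation covariance
    K = cov @ H.T @ np.linalg.inv(S)               # Kalman gain
    new_pose = pose + K @ (z - h)                  # correct the pose with the innovation
    new_cov = (np.eye(len(pose)) - K @ H) @ cov    # uncertainty shrinks after the update
    return new_pose, new_cov

# Toy example: a 1-pixel-accurate observation of a landmark's image column.
pose = np.array([1.0, 0.5, 0.1])
cov = np.diag([0.05, 0.05, 0.02])
H = np.array([[120.0, 0.0, 300.0]])               # assumed linearized observation model
pose, cov = kalman_update(pose, cov, z=np.array([412.0]),
                          h=np.array([405.0]), H=H, R=np.array([[1.0]]))
print(np.trace(cov))                               # smaller than before the update
```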

Incremental Localization: Topological Representation of Space

- NEURO-NAV, Meng & Kak (1992)
- Graph representation of the layout of the hallway
- Two modules (using neural networks):
  - Hallway Follower
  - Landmark Detector
- Supervisory rule-based controller

Incremental Localization: Topological Representation of Space

Incremental Localization: Topological Representation of Space

- Corridor-following:
  - neural networks trained using backpropagation when a human supervisor module takes control of the navigation
- Results (1993):
  - 86% correct steering
  - 10% incorrect steering
  - 4% no decision
- FUZZY-NAV, Kak et al. (1995)

Landmark Tracking

- Possible when the following are known:
  - approximate location of the robot
  - identity of the landmarks
- Landmarks:
  - artificial (circles, barcodes, tape)
  - natural (doors, windows, trees, etc.)
- Most often: template matching (a minimal sketch follows below)
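As an illustration of the template matching most landmark trackers rely on, below is a minimal normalized cross-correlation sketch in plain NumPy; the array shapes and the toy example are assumptions for illustration, and a real system would typically use an optimized routine (e.g. OpenCV's matchTemplate):

```python
import numpy as np

def match_template(image, template):
    """Slide a template over a grayscale image and return the best match position
    and its normalized cross-correlation score (1.0 = perfect match)."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    best_score, best_pos = -1.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum() * (t * t).sum())
            if denom == 0:
                continue
            score = (p * t).sum() / denom          # normalized cross-correlation
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy example: the landmark template is an exact crop of the image.
image = np.random.rand(60, 80)
template = image[20:30, 40:55].copy()
pos, score = match_template(image, template)
print(pos, round(score, 3))    # expected: (20, 40) with a score of ~1.0
```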


Map-Building

- A model of the world is not always easy to generate
- First attempt: Moravec (1981), Stanford Cart
  - world representation: 3D features plotted in a grid of 2 m² cells
  - 20 meters in 5 hours
- Moravec & Elfes (1985): occupancy grid

Map-Building

- Occupancy-grid-based approaches: cells carry a probability of being occupied (see the sketch below)
  - rich in geometrical detail
  - reliability depends on the accuracy of the robot's odometry and on sensor uncertainties
  - not computationally efficient for large or complex spaces
- Topological approaches
  - difficult to recognize previously visited nodes
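For illustration, here is a minimal occupancy-grid sketch where each cell holds a probability of being occupied and is updated from idealized "occupied"/"free" sensor readings; the log-odds formulation and the probability values are common textbook choices, not specifics from the survey:

```python
import numpy as np

# 2D occupancy grid stored as log-odds; 0.0 corresponds to probability 0.5 (unknown).
grid = np.zeros((50, 50))

L_OCC = np.log(0.7 / 0.3)    # evidence added when a sensor reading says "occupied"
L_FREE = np.log(0.3 / 0.7)   # evidence added when a reading says "free"

def update_cell(grid, row, col, occupied):
    """Accumulate one sensor observation for a single cell."""
    grid[row, col] += L_OCC if occupied else L_FREE

def probability(grid):
    """Convert log-odds back to occupancy probabilities in [0, 1]."""
    return 1.0 - 1.0 / (1.0 + np.exp(grid))

# Toy example: three "occupied" readings for one cell, one "free" reading for another.
for _ in range(3):
    update_cell(grid, 10, 12, occupied=True)
update_cell(grid, 10, 11, occupied=False)
p = probability(grid)
print(round(p[10, 12], 2), round(p[10, 11], 2))   # high vs. low occupancy probability
```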


Mapless Navigation: Optical Flow

- Santos-Victor et al. (1993)
  - robot: Robee
  - mimics the visual behavior of bees: the centering reflex when flying through a hallway (sketched below)
  - lateral position of the eyes: motion-derived features instead of depth information
- Sustained behavior: when the robot runs into a section of the corridor that is deficient in wall texture, it is desirable that the robot keeps driving on.
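To make the centering reflex concrete, here is a minimal sketch that steers away from the side with the larger average optical-flow magnitude (the side that appears closer); the flow arrays, gain, and left/right split are assumptions for illustration, not details from Santos-Victor et al.:

```python
import numpy as np

def centering_steering(flow_left, flow_right, gain=0.5):
    """Bee-like centering reflex: compare the average flow magnitude seen on the
    left and right and steer away from the side with the larger flow
    (larger apparent motion usually means that wall is closer)."""
    mag_left = np.abs(flow_left).mean()
    mag_right = np.abs(flow_right).mean()
    total = mag_left + mag_right
    if total < 1e-6:
        return 0.0            # no texture, no flow: keep driving straight
    return gain * (mag_right - mag_left) / total   # >0 steer left, <0 steer right

# Toy example: larger flow on the right means the right wall is closer, so steer left.
left = np.full((10, 10), 0.2)
right = np.full((10, 10), 0.8)
print(round(centering_steering(left, right), 2))   # positive value: steer left
```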

Mapless Navigation: Appearance-Based Matching

- Store images or templates of the environment and associate those images with commands or controls that will lead the robot to its final destination
- Gaussier et al. (1997)
  - neural networks: map perception to action
  - 270-degree image of the environment
  - local views (subwindows) at x-positions of maximum intensity values

Mapless Navigation: Appearance-Based Matching

- Gaussier et al. (1997)
  - 'local views' define a place in the environment
  - each place is associated with a direction (azimuth) towards the goal
  - a neural network learns to associate views/places with a direction (a simple stand-in is sketched below)
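As a rough stand-in for the learned view-to-direction association, the sketch below stores (view descriptor, goal azimuth) pairs and returns the azimuth of the most similar stored view; the descriptors and similarity measure are assumptions for illustration and far simpler than the neural network used by Gaussier et al.:

```python
import numpy as np

class PlaceMemory:
    """Associate stored view descriptors with a direction (azimuth) towards the goal."""

    def __init__(self):
        self.views = []     # 1D descriptor vectors (e.g. downsampled local views)
        self.azimuths = []  # goal direction (radians) recorded at each place

    def learn(self, view, azimuth):
        self.views.append(np.asarray(view, dtype=float))
        self.azimuths.append(float(azimuth))

    def recall(self, view):
        """Return the azimuth associated with the most similar stored view."""
        view = np.asarray(view, dtype=float)
        distances = [np.linalg.norm(view - v) for v in self.views]
        return self.azimuths[int(np.argmin(distances))]

# Toy example with two "places" described by 4-element intensity profiles.
memory = PlaceMemory()
memory.learn([0.1, 0.9, 0.2, 0.1], azimuth=0.5)    # place A: goal is to the left
memory.learn([0.8, 0.1, 0.1, 0.7], azimuth=-0.3)   # place B: goal is to the right
print(memory.recall([0.15, 0.85, 0.25, 0.1]))       # closest to place A -> 0.5
```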

Mapless Navigation: Appearance-Based Matching

- Ohno et al. (1996)
  - VSRR: View-Sequenced Route Representation
  - correlate the video input with database images to determine the position of the robot
  - use the displacement between the live view and the template image to compute the real-world displacement and the required steering actions (see the sketch below)
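As a minimal illustration of recovering the horizontal displacement between the current view and a stored template, the sketch below compares the column-intensity profiles of the two images over a range of shifts and turns the best shift into a steering correction; the profile-based matching and the steering gain are assumptions for illustration, not the method of Ohno et al.:

```python
import numpy as np

def horizontal_displacement(view, template, max_shift=20):
    """Estimate how many pixels the current view content is shifted to the right
    of the stored template, using column-averaged intensity profiles."""
    profile_v = view.mean(axis=0)
    profile_t = template.mean(axis=0)
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        # shifting the view profile back by s should align it with the template
        err = np.sum((np.roll(profile_v, s) - profile_t) ** 2)
        if err < best_err:
            best_err, best_shift = err, s
    return -best_shift

def steering_from_displacement(shift_pixels, gain=0.01):
    """Map the image displacement to a steering correction (sign only illustrative)."""
    return -gain * shift_pixels

# Toy example: the current view is the template shifted 7 pixels to the right.
template = np.tile(np.sin(np.linspace(0, 6.0, 120)), (40, 1))
view = np.roll(template, 7, axis=1)
shift = horizontal_displacement(view, template)
print(shift, round(steering_from_displacement(shift), 3))   # 7 and the correction
```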

Mapless Navigation: Object Recognition

- Kim & Nevatia (1995)
  - symbolic navigation approach, e.g. "go to the desk in front of you"
  - establish landmarks from the command
  - s-map: "squeezed 3D into 2D space map"
  - GPS-like path planner


Outdoor Navigation

- Comparable to indoor navigation: obstacle avoidance, landmark detection, map building/updating, position estimation
- Normally no a priori map of the environment
- Structured: e.g. road following
- Unstructured: outdoor environment with no regular properties, e.g. planetary terrain navigation
- Illumination


Outdoor Navigation: Structured Environments

- Road-following car: NAVLAB 1
  - 3D vision for obstacle detection and avoidance
  - color vision for road following
- Pixel classification: determine the probability that each pixel belongs to the representation of the road (a simple color-based sketch follows below)
  - color: road regions tend to appear more blue
  - texture: road regions tend to appear much smoother than non-road regions
- Hough-like transform: determine the road vanishing point and orientation
- Reclassify pixels, taking into account the determined road edges
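As an illustration of color-based pixel classification, the sketch below scores every pixel by how dominant its blue channel is and thresholds the result into a rough road mask; the scoring rule and threshold are assumptions for illustration and much simpler than NAVLAB's classifier, which also uses texture and re-classification against the detected road edges:

```python
import numpy as np

def road_mask(rgb_image, threshold=0.36):
    """Rough road/non-road classification of an RGB image with values in [0, 1]:
    score each pixel by the relative weight of its blue channel."""
    r = rgb_image[..., 0]
    g = rgb_image[..., 1]
    b = rgb_image[..., 2]
    total = r + g + b + 1e-6
    blueness = b / total                  # fraction of intensity in the blue channel
    return blueness > threshold           # True where the pixel looks road-like

# Toy example: top half greenish vegetation, bottom half bluish asphalt.
image = np.zeros((4, 4, 3))
image[:2] = [0.2, 0.6, 0.2]               # green-dominant (non-road)
image[2:] = [0.3, 0.3, 0.45]              # blue-dominant (road)
print(road_mask(image).astype(int))
```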

Outdoor Navigation: Structured Environments

- ALVINN: Autonomous Land Vehicle In a Neural Network (first reported in 1989)
- Idea: learn driving by watching a human driver
- Neural network trained with backpropagation

Outdoor Navigation: Structured Environments

- Gaussian distribution of target activations over the output nodes (computed in the sketch below):

  x_i = e^(-d_i^2 / 10)

  where x_i is the activation level of output node i and d_i is the distance between the ith node and the node corresponding to the correct steering angle.
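For illustration, the sketch below builds that target activation pattern for a row of output nodes; the number of nodes is an assumption (ALVINN used different output sizes in different versions), but the formula is the one on the slide:

```python
import numpy as np

def target_activations(correct_node, n_nodes=30):
    """Gaussian 'hill' of target activations centered on the output node that
    represents the correct steering angle: x_i = exp(-d_i^2 / 10)."""
    i = np.arange(n_nodes)
    d = i - correct_node            # distance (in nodes) from the correct steering node
    return np.exp(-d ** 2 / 10.0)

x = target_activations(correct_node=12)
print(x.argmax(), round(x[12], 2), round(x[15], 2))   # peak at node 12, falling off nearby
```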

Outdoor Navigation: Structured Environments

- Training with synthetic images
- Training "on the fly"
  - no experience of situations that require correction
  - forgetting due to long straight roads
- Solution: adding distorted images

Outdoor Navigation: Structured Environments

- ALVINN-VC (Virtual Camera)
  - allows the system to detect road changes and intersections before they get too close to the vehicle
- IRRE: Input Reconstruction Reliability Estimation
  - use the neural network's internal representation to reconstruct the original image
  - correlate this reconstruction with the actual input to measure the network's reliability (a minimal sketch follows below)
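As a minimal illustration of the IRRE idea, the sketch below correlates a reconstructed image with the actual input and uses the correlation coefficient as a reliability score; the reconstruction itself is faked with noise here, since producing it from the network's hidden units is the part IRRE actually contributes:

```python
import numpy as np

def irre_reliability(input_image, reconstructed_image):
    """Correlation between the network's reconstruction and the actual input:
    close to 1.0 when the input resembles the training data, lower otherwise."""
    a = input_image.ravel()
    b = reconstructed_image.ravel()
    return float(np.corrcoef(a, b)[0, 1])

rng = np.random.default_rng(0)
road_image = rng.random((30, 32))

good_reconstruction = road_image + 0.05 * rng.standard_normal(road_image.shape)
poor_reconstruction = rng.random((30, 32))          # unfamiliar input, reconstruction fails

print(round(irre_reliability(road_image, good_reconstruction), 2))   # high reliability
print(round(irre_reliability(road_image, poor_reconstruction), 2))   # low reliability
```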


Outdoor Navigation: Unstructured Environments

- Outdoor environment with no regular properties
  - wandering / exploring
  - goal position: need for some map-building and localization algorithm
- Vehicle-centered coordinate frame
- External reference (e.g. an external camera)
- Global positioning reference (e.g. mountain peaks, the sun)

Outdoor Navigation: Unstructured Environments

- Mars Pathfinder project
- Launched in December 1996, landed in July 1997

Outdoor Navigation: Unstructured Environments

- Human operators specified waypoints in 3D views of the landing site once a day
- Dead-reckoning-based positioning
- Moving speed: 15 cm/s
- Hazard detection every 6.5 cm
- Maximum travel distance: 10 m/day

Outdoor Navigation: Illumination

- Problem: differences in contrast and texture due to variations in illumination
- Use of color to compensate
- Lorigo et al. (1997)
  - exploring robot: figure out the position of obstacles
  - vertical slices: histogram of intensity values (RGB, HSV, or BW)
  - compare with a "safe window" (a minimal sketch follows below)
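To illustrate the safe-window comparison, the sketch below histograms the intensity values of each vertical image slice and flags slices whose histogram differs too much from the histogram of a reference "safe window" (typically the ground directly in front of the robot); the number of slices, bins, and the threshold are assumptions for illustration, not the values used by Lorigo et al.:

```python
import numpy as np

def slice_histograms(image, n_slices=8, n_bins=16):
    """Split a grayscale image (values in [0, 1]) into vertical slices and
    return one normalized intensity histogram per slice."""
    slices = np.array_split(image, n_slices, axis=1)
    hists = []
    for s in slices:
        h, _ = np.histogram(s, bins=n_bins, range=(0.0, 1.0))
        hists.append(h / h.sum())
    return np.array(hists)

def obstacle_slices(image, safe_window, threshold=0.5, n_slices=8, n_bins=16):
    """Flag vertical slices whose histogram differs too much from the safe window's."""
    safe_hist, _ = np.histogram(safe_window, bins=n_bins, range=(0.0, 1.0))
    safe_hist = safe_hist / safe_hist.sum()
    diffs = np.abs(slice_histograms(image, n_slices, n_bins) - safe_hist).sum(axis=1)
    return diffs > threshold      # True where a slice likely contains an obstacle

# Toy example: mostly ground-like texture, with one bright obstacle on the right.
rng = np.random.default_rng(1)
image = 0.3 + 0.05 * rng.standard_normal((40, 80))
image[:, 70:] = 0.9                       # bright obstacle in the last slice
safe_window = image[30:, :20]             # ground directly in front of the robot
print(obstacle_slices(image.clip(0, 1), safe_window.clip(0, 1)).astype(int))
```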

Questions?