
Visual Odometry for Ground Vehicle Applications

David Nister, Oleg Naroditsky, James Bergen

Sarnoff Corporation, CN5300, Princeton, NJ 08530

CPSC 643, Presentation 4

Journal of Field Robotics, Vol. 23, No. 1, pp. 3-20, 2006.

Mostly Related Works

• Stereo Visual Odometry – Agrawal 2007
  – Integrates with IMU/GPS
  – Bundle adjustment

• Monocular Visual Odometry – Campbell 2005
  – 3-DOF camera
  – Optical flow method

• A Visual Odometry System – Olson 2003
  – With an absolute orientation sensor
  – Forstner interest operator in the left image, matched from left to right
  – Uses approximate prior knowledge
  – Iteratively selects landmark points

• Previous Work of this Paper – Nister 2004
  – Estimates ego-motion using a hand-held camera
  – Real-time algorithm based on RANSAC

Other Related Works

• Simultaneous Localization and Mapping (SLAM)

• Robot Navigation with Omnidirectional Camera

Motivation

Olson:

• With an absolute orientation sensor

• Forstner interest operator in the left image, matched from left to right

• Uses approximate prior knowledge

• Iteratively selects landmark points

Nister:

• Uses pure visual information

• Uses Harris corner detection in all images, tracking feature to feature

• No prior knowledge

• RANSAC-based estimation in real time

Feature Detection

Harris Corner Detection

Search for the local maxima of the corner strength

s(x, y) = d - k \, t^2,

where d and t are the determinant and trace of

A(x, y) = \sum_{(a, b) \in W} w(a, b) \begin{pmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{pmatrix},

I_x, I_y are the derivatives of the input image, w is the weight function, W is the window area, and k is a constant.

Feature Detection

Four Sweeps to Calculate s(x, y)

• Compute I_x I_x, I_x I_y, and I_y I_y using the derivative filters [-1, 0, 1] and [-1, 0, 1]^T.

• Calculate the horizontal sum with the filter [1, 4, 6, 4, 1].

• Calculate the vertical sum with the filter [1, 4, 6, 4, 1]^T.

• Calculate the corner strength s(x, y) (a code sketch follows below).
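The four sweeps amount to separable filtering. Below is a minimal Python sketch of this computation, assuming NumPy/SciPy; the constant k, the use of scipy.ndimage, and the local-maximum search radius are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.ndimage import convolve1d, maximum_filter

def harris_strength(image, k=0.06):
    """Corner strength s(x, y) computed with the four separable sweeps."""
    img = image.astype(np.float64)

    # Sweep 1: derivatives with [-1, 0, 1] and its transpose, then the
    # products Ix*Ix, Ix*Iy, Iy*Iy.
    deriv = np.array([-1.0, 0.0, 1.0])
    Ix = convolve1d(img, deriv, axis=1)
    Iy = convolve1d(img, deriv, axis=0)
    Ixx, Ixy, Iyy = Ix * Ix, Ix * Iy, Iy * Iy

    # Sweeps 2 and 3: horizontal then vertical summation with the
    # binomial filter [1, 4, 6, 4, 1].
    w = np.array([1.0, 4.0, 6.0, 4.0, 1.0])

    def window_sum(a):
        return convolve1d(convolve1d(a, w, axis=1), w, axis=0)

    Sxx, Sxy, Syy = window_sum(Ixx), window_sum(Ixy), window_sum(Iyy)

    # Sweep 4: s = det(A) - k * trace(A)^2.
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace * trace

def detect_corners(image, radius=5):
    """Keep local maxima of the strength within a (2*radius + 1)^2 window."""
    s = harris_strength(image)
    local_max = maximum_filter(s, size=2 * radius + 1)
    ys, xs = np.nonzero((s == local_max) & (s > 0))
    return np.stack([xs, ys], axis=1)  # (x, y) pixel coordinates
```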

Feature Detection

Detected Feature Points

Superimposed feature tracks through images

Feature Matching

Two Directional Matching

• Calculate the normalized correlation over an 11 x 11 region of the two consecutive input images I_1 and I_2:

c = \frac{n \sum I_1 I_2 - \sum I_1 \sum I_2}{\sqrt{\left(n \sum I_1^2 - (\sum I_1)^2\right)\left(n \sum I_2^2 - (\sum I_2)^2\right)}},

where n is the number of pixels in the region.

• Match the feature points in the circular search area that have the maximum correlation in both directions (a code sketch follows below).
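A minimal Python sketch of this two-directional (mutually consistent) matching; the 11 x 11 patch size matches the formula above, while the circular search radius, the helper names, and the assumption that features are integer (x, y) tuples are illustrative.

```python
import numpy as np

def ncc(p1, p2):
    """Normalized correlation between two equally sized patches."""
    n = p1.size
    s1, s2 = p1.sum(), p2.sum()
    num = n * (p1 * p2).sum() - s1 * s2
    den = np.sqrt((n * (p1 ** 2).sum() - s1 ** 2) *
                  (n * (p2 ** 2).sum() - s2 ** 2))
    return num / den if den > 0 else -1.0

def patch(img, x, y, half=5):
    """11 x 11 patch (for half=5) centered on an integer pixel (x, y)."""
    return img[y - half:y + half + 1, x - half:x + half + 1].astype(np.float64)

def best_match(img_a, img_b, pt, candidates, radius=30):
    """Best-correlating candidate in img_b inside the circular search area."""
    x, y = pt
    best, best_c = None, -np.inf
    for (cx, cy) in candidates:
        if (cx - x) ** 2 + (cy - y) ** 2 > radius ** 2:
            continue
        pa, pb = patch(img_a, x, y), patch(img_b, cx, cy)
        if pa.shape != pb.shape:
            continue  # skip features too close to the image border
        c = ncc(pa, pb)
        if c > best_c:
            best, best_c = (cx, cy), c
    return best

def mutual_matches(img1, img2, feats1, feats2):
    """Keep a pair only if each feature is the other's best match."""
    matches = []
    for f1 in feats1:
        f2 = best_match(img1, img2, f1, feats2)
        if f2 is not None and best_match(img2, img1, f2, feats1) == tuple(f1):
            matches.append((tuple(f1), f2))
    return matches
```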

Robust Estimation

The Monocular Scheme

• Separate the matched points into groups of five.

• Treat each group as a 5-point relative pose problem.

• Use RANSAC to select well-matched groups.

• Estimate the camera motion using the selected groups (a code sketch follows below).

• Put the current estimate into the coordinate frame of the previous one.
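A rough Python sketch of the monocular frame-to-frame step, using OpenCV's 5-point + RANSAC essential-matrix estimator and pose recovery as a stand-in for the paper's own implementation; the intrinsic matrix K, the RANSAC thresholds, the relative scale, and the pose-composition convention are assumptions.

```python
import numpy as np
import cv2

def relative_pose(pts_prev, pts_curr, K):
    """Estimate R, t between two frames from Nx2 arrays of matched points."""
    E, inlier_mask = cv2.findEssentialMat(
        pts_prev, pts_curr, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    # t is recovered only up to scale with a single camera.
    _, R, t, _ = cv2.recoverPose(E, pts_prev, pts_curr, K, mask=inlier_mask)
    return R, t, inlier_mask

def compose(T_prev, R, t, scale=1.0):
    """Chain the new estimate into the coordinate frame of the previous pose.

    T_prev is the previous camera-to-world transform (4x4); (R, t) map points
    from the previous camera frame into the current one.
    """
    T_rel = np.eye(4)
    T_rel[:3, :3] = R
    T_rel[:3, 3] = scale * t.ravel()
    return T_prev @ np.linalg.inv(T_rel)
```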

Robust Estimation

The Stereo Scheme

• Match the feature points between the stereo images, then triangulate them into 3D points.

• Estimate the camera motion using RANSAC and the 3D points in consecutive frames (a code sketch follows below).
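A rough Python sketch of the two stereo steps, using OpenCV's triangulation and a RANSAC 3D-to-2D pose solver as stand-ins for the paper's pose estimation; the projection matrices P_left / P_right, the intrinsics K, and the RANSAC parameters are assumed inputs and illustrative values.

```python
import numpy as np
import cv2

def triangulate(P_left, P_right, pts_left, pts_right):
    """Triangulate matched stereo features (2xN pixel arrays) into Nx3 points."""
    pts_h = cv2.triangulatePoints(P_left, P_right, pts_left, pts_right)
    return (pts_h[:3] / pts_h[3]).T

def stereo_motion(points_3d, pts_next, K):
    """Robustly estimate the motion to the next frame from 3D-to-2D matches.

    points_3d: Nx3 points triangulated from the previous stereo pair;
    pts_next:  Nx2 image positions of the same features in the next left image.
    """
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        points_3d.astype(np.float64), pts_next.astype(np.float64), K, None,
        iterationsCount=500, reprojectionError=2.0)
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> rotation matrix
    return R, tvec, inliers
```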

Experiments

• Different Platforms

• Speed and Accuracy

• Visual Odometry vs. Differential GPS

• Visual Odometry vs. Inertial Navigation System (INS)

• Visual Odometry vs. Wheel Recorder

Conclusion and Future Work

Conclusion

• A real-time ego-motion estimation system.

• Works with both a monocular camera and a stereo head.

• Results are accurate and robust.

Future Work

• Integrate visual odometry with Kalman filter.

• Use sampling methods with multimodal distributions.
