
Page 1: Multiple View Geometry

Multiple View Geometry

Marc Pollefeys, University of North Carolina at Chapel Hill

Modified by Philippos Mordohai

Page 2: Multiple View Geometry

2

Outline

• Parameter estimation
• Robust estimation (RANSAC)

• Chapter 3 of “Multiple View Geometry in Computer Vision” by Hartley and Zisserman

Page 3: Multiple View Geometry

3

Parameter estimation

• 2D homography: given a set of (xi, xi'), compute H (xi' = H xi)

• 3D-to-2D camera projection: given a set of (Xi, xi), compute P (xi = P Xi)

• Fundamental matrix: given a set of (xi, xi'), compute F (xi'T F xi = 0)

• Trifocal tensor: given a set of (xi, xi', xi''), compute T

Page 4: Multiple View Geometry

4

Number of measurements required

• At least as many independent equations as degrees of freedom required

• Example:

$$\mathbf{x}' = \lambda H\mathbf{x}: \qquad \begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix} = \lambda \begin{pmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$

2 independent equations per point, 8 degrees of freedom: 4 points × 2 equations ≥ 8.

Page 5: Multiple View Geometry

5

Approximate solutions

• Minimal solution: 4 points yield an exact solution for H
• More points
  – No exact solution, because measurements are inexact ("noise")
  – Search for the "best" H according to some cost function
  – Algebraic or geometric/statistical cost

Page 6: Multiple View Geometry

6

Gold Standard algorithm

• Cost function that is optimal for some assumptions

• The computational algorithm that minimizes it is called the "Gold Standard" algorithm

• Other algorithms can then be compared to it

Page 7: Multiple View Geometry

7

Direct Linear Transformation (DLT)

$\mathbf{x}_i' = H\mathbf{x}_i$ is an equation between homogeneous vectors, so it can be written as $\mathbf{x}_i' \times H\mathbf{x}_i = \mathbf{0}$.

Writing H in terms of its rows,
$$H\mathbf{x}_i = \begin{pmatrix} \mathbf{h}^{1T}\mathbf{x}_i \\ \mathbf{h}^{2T}\mathbf{x}_i \\ \mathbf{h}^{3T}\mathbf{x}_i \end{pmatrix}$$
and with $\mathbf{x}_i' = (x_i', y_i', w_i')^T$,
$$\mathbf{x}_i' \times H\mathbf{x}_i = \begin{pmatrix} y_i'\,\mathbf{h}^{3T}\mathbf{x}_i - w_i'\,\mathbf{h}^{2T}\mathbf{x}_i \\ w_i'\,\mathbf{h}^{1T}\mathbf{x}_i - x_i'\,\mathbf{h}^{3T}\mathbf{x}_i \\ x_i'\,\mathbf{h}^{2T}\mathbf{x}_i - y_i'\,\mathbf{h}^{1T}\mathbf{x}_i \end{pmatrix} = \mathbf{0}$$
which can be rearranged as
$$\begin{bmatrix} \mathbf{0}^T & -w_i'\,\mathbf{x}_i^T & y_i'\,\mathbf{x}_i^T \\ w_i'\,\mathbf{x}_i^T & \mathbf{0}^T & -x_i'\,\mathbf{x}_i^T \\ -y_i'\,\mathbf{x}_i^T & x_i'\,\mathbf{x}_i^T & \mathbf{0}^T \end{bmatrix} \begin{pmatrix} \mathbf{h}^1 \\ \mathbf{h}^2 \\ \mathbf{h}^3 \end{pmatrix} = \mathbf{0}, \qquad \text{i.e.} \quad A_i\mathbf{h} = \mathbf{0}$$

Page 8: Multiple View Geometry

8

Direct Linear Transformation (DLT)

• Equations are linear in h: $A_i\mathbf{h} = \mathbf{0}$
• Only 2 out of the 3 rows are linearly independent (indeed, 2 equations per point):
$$x_i'\,A_i^{1} + y_i'\,A_i^{2} + w_i'\,A_i^{3} = \mathbf{0}^T$$
so the third row can be dropped:
$$\begin{bmatrix} \mathbf{0}^T & -w_i'\,\mathbf{x}_i^T & y_i'\,\mathbf{x}_i^T \\ w_i'\,\mathbf{x}_i^T & \mathbf{0}^T & -x_i'\,\mathbf{x}_i^T \end{bmatrix} \begin{pmatrix} \mathbf{h}^1 \\ \mathbf{h}^2 \\ \mathbf{h}^3 \end{pmatrix} = \mathbf{0}$$
(only drop the third row if $w_i' \neq 0$)
• Holds for any homogeneous representation, e.g. $(x_i', y_i', 1)$

Page 9: Multiple View Geometry

9

Direct Linear Transformation (DLT)

• Solving for H: stack the constraints from 4 points,
$$A = \begin{bmatrix} A_1 \\ A_2 \\ A_3 \\ A_4 \end{bmatrix}, \qquad A\mathbf{h} = \mathbf{0}$$
The size of A is 8×9 (or 12×9 if all three rows are kept), but its rank is 8.

The trivial solution $\mathbf{h} = \mathbf{0}_9$ is not interesting.

The 1-D null space yields the solution of interest; pick, for example, the solution with $\|\mathbf{h}\| = 1$.

Page 10: Multiple View Geometry

10

Direct Linear Transformation (DLT)

• Over-determined solution: with n points,
$$A = \begin{bmatrix} A_1 \\ A_2 \\ \vdots \\ A_n \end{bmatrix}, \qquad A\mathbf{h} = \mathbf{0}$$
No exact solution because of inexact measurements, i.e. "noise".

Find an approximate solution:
– An additional constraint is needed to avoid $\mathbf{h} = \mathbf{0}$, e.g. $\|\mathbf{h}\| = 1$
– $A\mathbf{h} = \mathbf{0}$ is not possible exactly, so minimize $\|A\mathbf{h}\|$

Page 11: Multiple View Geometry

11

Singular Value Decomposition

$$A = U\Sigma V^T, \qquad \|A\mathbf{X}\| = \|U\Sigma V^T\mathbf{X}\| = \|\Sigma V^T\mathbf{X}\|, \qquad \|\mathbf{X}\| = \|V^T\mathbf{X}\|$$

Homogeneous least-squares: minimize $\|A\mathbf{X}\|$ subject to $\|\mathbf{X}\| = 1$; the solution is $\mathbf{X} = \mathbf{v}_n$, the last column of V.

Page 12: Multiple View Geometry

12

DLT algorithm

ObjectiveGiven n≥4 2D to 2D point correspondences {xi↔xi’}, determine the 2D homography matrix H such that xi’=Hxi

Algorithm
(i) For each correspondence xi ↔ xi' compute Ai. Usually only the first two rows are needed.
(ii) Assemble the n 2×9 matrices Ai into a single 2n×9 matrix A.
(iii) Obtain the SVD of A. The solution for h is the last column of V.
(iv) Determine H from h (a minimal code sketch follows below).
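A minimal numpy sketch of the basic DLT described above (the function name is mine; point normalization, which matters in practice, is omitted here):

```python
import numpy as np

def dlt_homography(x, xp):
    """Basic (unnormalized) DLT: estimate H such that xp ~ H x.
    x, xp: (n, 2) arrays of corresponding points, n >= 4, w = w' = 1 assumed."""
    n = x.shape[0]
    A = np.zeros((2 * n, 9))
    for i in range(n):
        X = np.array([x[i, 0], x[i, 1], 1.0])                      # x_i in homogeneous form
        xpi, ypi = xp[i]
        A[2 * i]     = np.concatenate([np.zeros(3), -X, ypi * X])  # row [0^T, -w'x^T, y'x^T]
        A[2 * i + 1] = np.concatenate([X, np.zeros(3), -xpi * X])  # row [w'x^T, 0^T, -x'x^T]
    # h is the right singular vector of A with the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]    # fix the scale for readability (assumes H33 != 0)
```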

Page 13: Multiple View Geometry

13

Inhomogeneous solution

Since h can only be computed up to scale, pick $h_j = 1$, e.g. $h_9 = 1$, and solve for the remaining 8-vector $\tilde{\mathbf{h}}$. Each correspondence then gives

$$\begin{bmatrix} 0 & 0 & 0 & -x_i w_i' & -y_i w_i' & -w_i w_i' & x_i y_i' & y_i y_i' \\ x_i w_i' & y_i w_i' & w_i w_i' & 0 & 0 & 0 & -x_i x_i' & -y_i x_i' \end{bmatrix} \tilde{\mathbf{h}} = \begin{pmatrix} -w_i y_i' \\ w_i x_i' \end{pmatrix}$$

Solve using Gaussian elimination (4 points) or linear least-squares (more than 4 points).

However, if $h_9 = 0$ this approach fails, and it also gives poor results if $h_9$ is close to zero. Therefore, it is not recommended.

Note that $h_9 = H_{33} = 0$ if the origin is mapped to infinity:
$$\mathbf{l}_\infty^T H\,\mathbf{x}_0 = (0\ \ 0\ \ 1)\, H\, (0\ \ 0\ \ 1)^T = H_{33} = 0$$

Page 14: Multiple View Geometry

14

Degenerate configurations

[Figure: two four-point configurations, case (A) and case (B), with x1, x2, x3 collinear — which homography H (or H') do the correspondences determine?]

Constraints: $\mathbf{x}_i' \times H\mathbf{x}_i = \mathbf{0}$, for i = 1, 2, 3, 4.

Define $H^* = \mathbf{x}_4'\,\mathbf{l}^T$, with $\mathbf{l}$ the line through $\mathbf{x}_1, \mathbf{x}_2, \mathbf{x}_3$. Then
$$H^*\mathbf{x}_4 = \mathbf{x}_4'(\mathbf{l}^T\mathbf{x}_4) = k\,\mathbf{x}_4', \qquad H^*\mathbf{x}_i = \mathbf{x}_4'(\mathbf{l}^T\mathbf{x}_i) = \mathbf{0}, \quad i = 1, 2, 3$$
so $H^*$ satisfies all four constraints. $H^*$ is a rank-1 matrix and thus not a homography.

If $H^*$ is the unique solution, then no homography maps $\mathbf{x}_i \to \mathbf{x}_i'$ (case B). If a further solution H exists, then so does $\alpha H^* + \beta H$ (case A): a 2-D null space instead of a 1-D null space.

Page 15: Multiple View Geometry

15

Solutions from lines, etc.

2D homographies from 2D lines: $\mathbf{l}_i = H^T\mathbf{l}_i'$, which again leads to $A\mathbf{h} = \mathbf{0}$. Minimum of 4 lines.

3D homographies (15 dof): minimum of 5 points or 5 planes.

2D affinities (6 dof): minimum of 3 points or lines.

A conic provides 5 constraints.

Mixed configurations?

Page 16: Multiple View Geometry

16

Cost functions

• Algebraic distance
• Geometric distance
• Reprojection error
• Comparison
• Geometric interpretation
• Sampson error

Page 17: Multiple View Geometry

17

Algebraic distance

The DLT minimizes $\|A\mathbf{h}\|$. $\mathbf{e} = A\mathbf{h}$ is the residual vector, and $\mathbf{e}_i$ is the partial vector for each correspondence $\mathbf{x}_i \leftrightarrow \mathbf{x}_i'$ (the algebraic error vector):

$$d_{\mathrm{alg}}(\mathbf{x}_i', H\mathbf{x}_i)^2 = \|\mathbf{e}_i\|^2 = \left\| \begin{bmatrix} \mathbf{0}^T & -w_i'\,\mathbf{x}_i^T & y_i'\,\mathbf{x}_i^T \\ w_i'\,\mathbf{x}_i^T & \mathbf{0}^T & -x_i'\,\mathbf{x}_i^T \end{bmatrix} \mathbf{h} \right\|^2$$

Algebraic distance: $d_{\mathrm{alg}}(\mathbf{x}_1, \mathbf{x}_2)^2 = a_1^2 + a_2^2$, where $\mathbf{a} = (a_1, a_2, a_3)^T = \mathbf{x}_1 \times \mathbf{x}_2$.

$$\sum_i d_{\mathrm{alg}}(\mathbf{x}_i', H\mathbf{x}_i)^2 = \sum_i \|\mathbf{e}_i\|^2 = \|A\mathbf{h}\|^2 = \|\mathbf{e}\|^2$$

Not geometrically/statistically meaningful, but given good normalization it works fine and is very fast (use for initialization).

Page 18: Multiple View Geometry

18

Geometric distance

measured coordinates: $\mathbf{x}$; estimated coordinates: $\hat{\mathbf{x}}$; true coordinates: $\bar{\mathbf{x}}$

Error in one image (e.g. a calibration pattern):
$$\hat{H} = \arg\min_H \sum_i d(\mathbf{x}_i', H\bar{\mathbf{x}}_i)^2$$

Symmetric transfer error:
$$\hat{H} = \arg\min_H \sum_i d(\mathbf{x}_i, H^{-1}\mathbf{x}_i')^2 + d(\mathbf{x}_i', H\mathbf{x}_i)^2$$

Reprojection error:
$$(\hat{H}, \hat{\mathbf{x}}_i, \hat{\mathbf{x}}_i') = \arg\min_{H,\,\hat{\mathbf{x}}_i,\,\hat{\mathbf{x}}_i'} \sum_i d(\mathbf{x}_i, \hat{\mathbf{x}}_i)^2 + d(\mathbf{x}_i', \hat{\mathbf{x}}_i')^2 \qquad \text{subject to } \hat{\mathbf{x}}_i' = \hat{H}\hat{\mathbf{x}}_i$$

$d(\cdot,\cdot)$ is the Euclidean distance (in the image).
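As a concrete reference for the symmetric transfer error above, a small numpy sketch (the function name is mine; it assumes finite points, i.e. a nonzero third homogeneous coordinate after transfer):

```python
import numpy as np

def symmetric_transfer_error(H, x, xp):
    """Sum over correspondences of d(x, H^-1 x')^2 + d(x', H x)^2.
    x, xp: (n, 2) arrays of matched points in inhomogeneous coordinates."""
    def transfer(M, pts):
        ph = np.column_stack([pts, np.ones(len(pts))]) @ M.T   # apply M to homogeneous points
        return ph[:, :2] / ph[:, 2:3]                          # back to inhomogeneous coords
    Hinv = np.linalg.inv(H)
    d_fwd = np.sum((transfer(H, x) - xp) ** 2, axis=1)         # d(x', Hx)^2
    d_bwd = np.sum((transfer(Hinv, xp) - x) ** 2, axis=1)      # d(x, H^-1 x')^2
    return float(np.sum(d_fwd + d_bwd))
```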

Page 19: Multiple View Geometry

19

Reprojection error

Symmetric transfer error: $d(\mathbf{x}, H^{-1}\mathbf{x}')^2 + d(\mathbf{x}', H\mathbf{x})^2$

Reprojection error: $d(\mathbf{x}, \hat{\mathbf{x}})^2 + d(\mathbf{x}', \hat{\mathbf{x}}')^2$, with $\hat{\mathbf{x}}' = \hat{H}\hat{\mathbf{x}}$

Page 20: Multiple View Geometry

20

Comparison of geometric and algebraic distances

Error in one image: $\mathbf{x}_i' = (x_i', y_i', w_i')^T$, $\ \hat{\mathbf{x}}_i = (\hat{x}_i, \hat{y}_i, \hat{w}_i)^T = H\bar{\mathbf{x}}_i$

$$\mathbf{e}_i = \begin{pmatrix} y_i'\hat{w}_i - w_i'\hat{y}_i \\ w_i'\hat{x}_i - x_i'\hat{w}_i \end{pmatrix} = \begin{bmatrix} \mathbf{0}^T & -w_i'\,\bar{\mathbf{x}}_i^T & y_i'\,\bar{\mathbf{x}}_i^T \\ w_i'\,\bar{\mathbf{x}}_i^T & \mathbf{0}^T & -x_i'\,\bar{\mathbf{x}}_i^T \end{bmatrix} \begin{pmatrix} \mathbf{h}^1 \\ \mathbf{h}^2 \\ \mathbf{h}^3 \end{pmatrix} = A_i\mathbf{h}$$

$$d_{\mathrm{alg}}(\mathbf{x}_i', \hat{\mathbf{x}}_i)^2 = (y_i'\hat{w}_i - w_i'\hat{y}_i)^2 + (w_i'\hat{x}_i - x_i'\hat{w}_i)^2$$

$$d(\mathbf{x}_i', \hat{\mathbf{x}}_i) = \left[ (y_i'/w_i' - \hat{y}_i/\hat{w}_i)^2 + (x_i'/w_i' - \hat{x}_i/\hat{w}_i)^2 \right]^{1/2} = d_{\mathrm{alg}}(\mathbf{x}_i', \hat{\mathbf{x}}_i)\,/\,(\hat{w}_i\,w_i')$$

with $\hat{w}_i = \mathbf{h}^{3T}\bar{\mathbf{x}}_i$. Typically $w_i' = 1$, but $\hat{w}_i \neq 1$, except for affinities.

For affinities the DLT can therefore minimize the geometric distance.

Possibility for an iterative algorithm.

Page 21: Multiple View Geometry

21

Geometric interpretation of reprojection error

Analogous to conic fitting:
$$d_{\mathrm{alg}}(\mathbf{x}, C)^2 = \mathbf{x}^T C\,\mathbf{x} \qquad \text{vs.} \qquad d(\mathbf{x}, C)^2 \ \ \text{(geometric distance to the conic)}$$

Page 22: Multiple View Geometry

22

Sampson error

Between algebraic and geometric error.

The vector $\hat{X}$ that minimizes the geometric error $\|X - \hat{X}\|^2$ is the closest point on the variety $\nu_H$ to the measurement $X$.

Sampson error: 1st-order approximation of the variety $\mathcal{C}_H(X) = A\mathbf{h} = \mathbf{0}$:
$$\mathcal{C}_H(\hat{X}) = \mathcal{C}_H(X + \delta_X) \approx \mathcal{C}_H(X) + \frac{\partial \mathcal{C}_H}{\partial X}\,\delta_X$$
Requiring $\mathcal{C}_H(\hat{X}) = \mathbf{0}$ gives $J\delta_X = -\mathbf{e}$, with $J = \partial\mathcal{C}_H/\partial X$ and $\mathbf{e} = \mathcal{C}_H(X)$.

Find the vector $\delta_X$ that minimizes $\|\delta_X\|$ subject to $J\delta_X = -\mathbf{e}$.

Page 23: Multiple View Geometry

23

Find the vector $\delta_X$ that minimizes $\|\delta_X\|$ subject to $J\delta_X = -\mathbf{e}$.

Use Lagrange multipliers: minimize
$$\|\delta_X\|^2 - 2\lambda^T(J\delta_X + \mathbf{e})$$
Setting the derivatives to zero:
$$2\delta_X^T - 2\lambda^T J = \mathbf{0}^T \ \Rightarrow\ \delta_X = J^T\lambda$$
$$J\delta_X + \mathbf{e} = \mathbf{0} \ \Rightarrow\ JJ^T\lambda = -\mathbf{e} \ \Rightarrow\ \lambda = -(JJ^T)^{-1}\mathbf{e}$$
$$\delta_X = -J^T(JJ^T)^{-1}\mathbf{e}$$
$$\|\delta_X\|^2 = \delta_X^T\delta_X = \mathbf{e}^T(JJ^T)^{-1}\mathbf{e}$$

Page 24: Multiple View Geometry

24

Sampson error

Between algebraic and geometric error.

The vector $\hat{X}$ that minimizes the geometric error $\|X - \hat{X}\|^2$ is the closest point on the variety $\nu_H$ to the measurement $X$.

Sampson error: 1st-order approximation of $\mathcal{C}_H(X) = A\mathbf{h} = \mathbf{0}$; minimizing $\|\delta_X\|$ subject to $J\delta_X = -\mathbf{e}$ gives

$$\|\delta_X\|^2 = \delta_X^T\delta_X = \mathbf{e}^T(JJ^T)^{-1}\mathbf{e} \qquad \text{(Sampson error)}$$

Page 25: Multiple View Geometry

25

Sampson approximation

A few points:
• For a 2D homography, $X = (x, y, x', y')$
• $\mathbf{e} = \mathcal{C}_H(X)$ is the algebraic error vector
• $J = \partial\mathcal{C}_H/\partial X$ is a 2×4 matrix, e.g. $J_{11} = \partial\mathcal{C}_H^{(1)}/\partial x = -w_i'h_{21} + y_i'h_{31}$
• Similar to the algebraic error $\mathbf{e}^T\mathbf{e}$; in fact the same up to the Mahalanobis weighting, $\mathbf{e}^T(JJ^T)^{-1}\mathbf{e}$
• The Sampson error is independent of linear reparametrization (it cancels out between $\mathbf{e}$ and $J$)
• Must be summed over all points: $\sum_i \mathbf{e}_i^T(J_iJ_i^T)^{-1}\mathbf{e}_i$
• Close to the geometric error, but with far fewer unknowns

Page 26: Multiple View Geometry

26

Statistical cost function and Maximum Likelihood Estimation

• Optimal cost function related to the noise model
• Assume zero-mean isotropic Gaussian noise (assume outliers removed):
$$\Pr(\mathbf{x}) = \frac{1}{2\pi\sigma^2}\, e^{-d(\mathbf{x},\bar{\mathbf{x}})^2/(2\sigma^2)}$$

Error in one image:
$$\Pr(\{\mathbf{x}_i'\} \mid H) = \prod_i \frac{1}{2\pi\sigma^2}\, e^{-d(\mathbf{x}_i',\,H\bar{\mathbf{x}}_i)^2/(2\sigma^2)}$$
$$\log \Pr(\{\mathbf{x}_i'\} \mid H) = -\frac{1}{2\sigma^2}\sum_i d(\mathbf{x}_i', H\bar{\mathbf{x}}_i)^2 + \text{constant}$$

Maximum Likelihood Estimate: minimize $\sum_i d(\mathbf{x}_i', H\bar{\mathbf{x}}_i)^2$

Page 27: Multiple View Geometry

27

Statistical cost function and Maximum Likelihood Estimation

• Optimal cost function related to the noise model
• Assume zero-mean isotropic Gaussian noise (assume outliers removed):
$$\Pr(\mathbf{x}) = \frac{1}{2\pi\sigma^2}\, e^{-d(\mathbf{x},\bar{\mathbf{x}})^2/(2\sigma^2)}$$

Error in both images:
$$\Pr(\{\mathbf{x}_i, \mathbf{x}_i'\} \mid H) = \prod_i \frac{1}{2\pi\sigma^2}\, e^{-\left(d(\mathbf{x}_i,\bar{\mathbf{x}}_i)^2 + d(\mathbf{x}_i',\,H\bar{\mathbf{x}}_i)^2\right)/(2\sigma^2)}$$

Maximum Likelihood Estimate: minimize
$$\sum_i d(\mathbf{x}_i, \hat{\mathbf{x}}_i)^2 + d(\mathbf{x}_i', \hat{\mathbf{x}}_i')^2$$
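As an illustration of the ML refinement step (Levenberg-Marquardt, as also used in the automatic algorithm later in this deck), a hedged scipy sketch; for brevity it minimizes the symmetric transfer error as a stand-in for the full reprojection cost, which would also treat the corrected points as unknowns. The function name is mine:

```python
import numpy as np
from scipy.optimize import least_squares

def refine_homography(H0, x, xp):
    """Refine an initial homography (e.g. from the DLT) by nonlinear least
    squares with Levenberg-Marquardt, minimizing the symmetric transfer error."""
    def transfer(M, pts):
        ph = np.column_stack([pts, np.ones(len(pts))]) @ M.T
        return ph[:, :2] / ph[:, 2:3]
    def residuals(h):
        H = h.reshape(3, 3)
        Hinv = np.linalg.inv(H)
        return np.concatenate([(transfer(H, x) - xp).ravel(),      # forward transfer residuals
                               (transfer(Hinv, xp) - x).ravel()])  # backward transfer residuals
    res = least_squares(residuals, H0.ravel() / np.linalg.norm(H0), method='lm')
    H = res.x.reshape(3, 3)
    return H / H[2, 2]
```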

Page 28: Multiple View Geometry

28

Mahalanobis distance

• General Gaussian case: measurement $X$ with covariance matrix $\Sigma$
$$\|X - \bar{X}\|_\Sigma^2 = (X - \bar{X})^T \Sigma^{-1} (X - \bar{X})$$

Error in two images (independent):
$$\|X - \bar{X}\|_\Sigma^2 + \|X' - \bar{X}'\|_{\Sigma'}^2$$

Varying covariances:
$$\sum_i \|X_i - \bar{X}_i\|_{\Sigma_i}^2 + \|X_i' - \bar{X}_i'\|_{\Sigma_i'}^2$$

Page 29: Multiple View Geometry

29

Outline

• Parameter estimation
• Robust estimation (RANSAC)

• Chapter 3 of “Multiple View Geometry in Computer Vision” by Hartley and Zisserman

Page 30: Multiple View Geometry

30

Robust estimation

• What if the set of matches contains gross outliers?

Page 31: Multiple View Geometry

31

RANSAC

Objective
Robust fit of a model to a data set S which contains outliers.

Algorithm
(i) Randomly select a sample of s data points from S and instantiate the model from this subset.
(ii) Determine the set of data points Si which are within a distance threshold t of the model. The set Si is the consensus set of the sample and defines the inliers of S.
(iii) If the size of Si is greater than some threshold T, re-estimate the model using all the points in Si and terminate.
(iv) If the size of Si is less than T, select a new subset and repeat the above.
(v) After N trials the largest consensus set Si is selected, and the model is re-estimated using all the points in the subset Si. (A sketch for the homography case follows below.)
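A hedged sketch of this loop for the homography case, reusing the `dlt_homography` sketch from earlier as the minimal solver and the forward transfer distance d(x', Hx) as the error; the threshold and trial count below are illustrative, not prescriptive:

```python
import numpy as np

def ransac_homography(x, xp, n_trials=1000, t=1.25, rng=None):
    """RANSAC for a 2D homography (minimal sample size s = 4).
    x, xp: (n, 2) arrays of putative correspondences."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    best_inliers = np.zeros(n, dtype=bool)
    for _ in range(n_trials):
        sample = rng.choice(n, size=4, replace=False)       # (i) random minimal sample
        H = dlt_homography(x[sample], xp[sample])            # instantiate the model
        ph = np.column_stack([x, np.ones(n)]) @ H.T          # transfer all points by H
        d = np.linalg.norm(ph[:, :2] / ph[:, 2:3] - xp, axis=1)
        inliers = d < t                                      # (ii) consensus set
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # (v) re-estimate from the largest consensus set
    return dlt_homography(x[best_inliers], xp[best_inliers]), best_inliers
```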

Page 32: Multiple View Geometry

32

How many samples?

• Choose t so that the probability of an inlier is α (e.g. 0.95)
  – Or choose it empirically
• Choose N so that, with probability p, at least one random sample is free from outliers, e.g. p = 0.99:
$$(1 - (1 - e)^s)^N = 1 - p \qquad\Rightarrow\qquad N = \log(1 - p)\,/\,\log\!\left(1 - (1 - e)^s\right)$$

Required N for sample size s and proportion of outliers e:

  s \ e    5%   10%   20%   25%   30%   40%    50%
  2         2    3     5     6     7    11     17
  3         3    4     7     9    11    19     35
  4         3    5     9    13    17    34     72
  5         4    6    12    17    26    57    146
  6         4    7    16    24    37    97    293
  7         4    8    20    33    54   163    588
  8         5    9    26    44    78   272   1177
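The formula above in a few lines of Python; the function name is mine, and the example value matches the table (s = 4, e = 50% gives 72):

```python
import math

def ransac_trials(e, s, p=0.99):
    """Number of samples N so that, with probability p, at least one
    sample of size s is free of outliers, given outlier fraction e."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))

print(ransac_trials(e=0.5, s=4))   # -> 72
```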

Page 33: Multiple View Geometry

33

Acceptable consensus set?

• Typically, terminate when the size of the consensus set reaches the expected number of inliers:
$$T = (1 - e)\,n$$

Page 34: Multiple View Geometry

34

Adaptively determining the number of samples

e is often unknown a priori, so pick a worst case, e.g. 50%, and adapt if more inliers are found (e.g. 80% inliers would yield e = 0.2):

– N = ∞, sample_count = 0
– While N > sample_count, repeat:
  • Choose a sample and count the number of inliers
  • Set e = 1 − (number of inliers)/(total number of points)
  • Recompute N from e: $N = \log(1 - p)\,/\,\log\!\left(1 - (1 - e)^s\right)$
  • Increment sample_count by 1
– Terminate
(A sketch of this loop follows below.)
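A sketch of the adaptive variant, differing from the fixed-N loop above only in how the number of trials is updated (again reusing the hypothetical `dlt_homography` helper):

```python
import numpy as np

def adaptive_ransac_inliers(x, xp, s=4, p=0.99, t=1.25, rng=None):
    """RANSAC with the number of samples N adapted from the best inlier
    ratio seen so far, instead of being fixed in advance."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    N, sample_count = float('inf'), 0
    best_inliers = np.zeros(n, dtype=bool)
    while N > sample_count:
        sample = rng.choice(n, size=s, replace=False)
        H = dlt_homography(x[sample], xp[sample])
        ph = np.column_stack([x, np.ones(n)]) @ H.T
        d = np.linalg.norm(ph[:, :2] / ph[:, 2:3] - xp, axis=1)
        inliers = d < t
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
            e = 1.0 - inliers.sum() / n                     # updated outlier fraction
            N = 0 if e == 0 else np.log(1 - p) / np.log(1 - (1 - e) ** s)
        sample_count += 1
    return best_inliers
```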

Page 35: Multiple View Geometry

35

Robust Maximum Likelihood Estimation

The previous MLE algorithm considers a fixed set of inliers.

Better: a robust cost function (which reclassifies points):
$$D = \sum_i \rho\!\left(d_{\perp i}^2\right) \qquad\text{with}\qquad \rho(e^2) = \begin{cases} e^2 & e^2 < t^2 \ \ \text{(inlier)} \\ t^2 & e^2 \geq t^2 \ \ \text{(outlier)} \end{cases}$$
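A minimal sketch of this truncated quadratic cost (names are mine):

```python
import numpy as np

def robust_cost(d, t):
    """Truncated quadratic: squared errors are capped at t**2, so each
    outlier contributes a constant penalty instead of growing without bound."""
    d2 = np.asarray(d, dtype=float) ** 2
    return float(np.sum(np.where(d2 < t ** 2, d2, t ** 2)))
```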

Page 36: Multiple View Geometry

36

Other robust algorithms

• RANSAC maximizes the number of inliers
• LMedS minimizes the median error
  – No threshold needed
  – Needs an estimate of the percentage of outliers
• Not recommended: case deletion, iterative least-squares, etc.

Page 37: Multiple View Geometry

37

Automatic computation of H

Objective
Compute the homography between two images.

Algorithm
(i) Interest points: compute interest points in each image.
(ii) Putative correspondences: compute a set of interest point matches based on some similarity measure.
(iii) RANSAC robust estimation: repeat for N samples
  (a) Select 4 correspondences and compute H.
  (b) Calculate the distance d for each putative match.
  (c) Compute the number of inliers consistent with H (d < t).
  Choose the H with the most inliers.
(iv) Optimal estimation: re-estimate H from all inliers by minimizing the ML cost function with Levenberg-Marquardt.
(v) Guided matching: determine more matches using prediction by the computed H.
Optionally iterate the last two steps until convergence. (An end-to-end sketch using an off-the-shelf library follows below.)
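A hedged end-to-end sketch using OpenCV, in which ORB keypoints and brute-force descriptor matching stand in for the interest points and correlation-based matching of these slides, and cv2.findHomography with the RANSAC flag covers steps (iii)-(iv); guided matching (v) is omitted:

```python
import cv2
import numpy as np

def auto_homography(img1, img2):
    """Compute a homography between two grayscale images."""
    orb = cv2.ORB_create(500)                                   # ~500 interest points per image
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)  # putative correspondences
    matches = matcher.match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches])
    dst = np.float32([k2[m.trainIdx].pt for m in matches])
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 1.25)  # RANSAC + refinement
    return H, inlier_mask
```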

Page 38: Multiple View Geometry

38

Determine putative correspondences

• Compare interest points using a similarity measure:
  – SAD, SSD, ZNCC on a small neighborhood (a ZNCC sketch follows below)
• If motion is limited, only consider interest points with similar coordinates
• More advanced approaches exist, based on invariance…
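For reference, a small numpy sketch of the ZNCC similarity measure mentioned above (the function name is mine):

```python
import numpy as np

def zncc(patch_a, patch_b):
    """Zero-mean normalized cross-correlation of two equally sized patches;
    returns a score in [-1, 1], with 1 meaning a perfect match."""
    a = patch_a.astype(np.float64).ravel()
    b = patch_b.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0
```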

Page 39: Multiple View Geometry

39

Example: robust computation

Interest points: 500 per image (640×480 images)

Putative correspondences: 268 (best match, SSD < 20, search range ±320)

Outliers: 117 (t = 1.25 pixel; 43 iterations)

Inliers: 151

Final inliers: 262 (2 MLE-inlier cycles; d = 0.23 → d = 0.19; 10 Levenberg-Marquardt iterations)

Adaptive number of samples N as inliers accumulate:

  #inliers   1 − e    adaptive N
  6          2%       20M
  10         3%       2.5M
  44         16%      6,922
  58         21%      2,291
  73         26%      911
  151        56%      43