Computer Vision. Applications and Algorithms in CV. Tutorial 3: Grouping & Fitting of Primitives



Primitives detection

Problem: In automated analysis of digital images, a sub-problem that often arises is the detection of simple shapes such as straight lines, circles or ellipses.


Solution?

Edge detection


Finding straight lines

An edge detector can be used as a pre-processing stage to obtain points (pixels) that lie on the desired curve in image space.

Due to imperfections in either the image data or the edge detector, there may be missing points on the desired curves, as well as spatial deviations between the ideal line/circle/ellipse and the noisy edge points obtained from the edge detector.

It is often non-trivial to group the extracted edge features into an appropriate set of lines, circles or ellipses.
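For concreteness, a minimal Matlab sketch of this pre-processing stage (the grayscale image img is an assumed input; edge requires the Image Processing Toolbox):

    edges = edge(img, 'Canny');     % binary edge map
    [ys, xs] = find(edges);         % row/column coordinates of the edge pixels to group and fit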


Grouping & Fitting methods

1. Global optimization / search for parameters
• Least squares fit
• Total least squares fit
• Robust least squares
• Iterative closest point (ICP)
• Etc.


2. Hypothesize and test
• Hough transform
• Generalized Hough transform
• RANSAC
• Etc.

2 approaches for grouping & fitting


Least square line fitting

Data: (x_1, y_1), ..., (x_n, y_n)

Line equation: y_i = m x_i + b. Find (m, b) to minimize

E = \sum_{i=1}^{n} (y_i - m x_i - b)^2

In matrix form, with

A = \begin{bmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{bmatrix}, \quad
p = \begin{bmatrix} m \\ b \end{bmatrix}, \quad
y = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix},

the objective becomes

E = \|Ap - y\|^2 = y^T y - 2 (Ap)^T y + (Ap)^T (Ap)

Setting the derivative to zero:

\frac{dE}{dp} = 2 A^T A p - 2 A^T y = 0
\;\Rightarrow\; A^T A p = A^T y
\;\Rightarrow\; p = (A^T A)^{-1} A^T y

Matlab: p = A \ y;
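As a concrete illustration, a minimal Matlab sketch of this fit (x and y are assumed to be column vectors holding the data points):

    A = [x, ones(numel(x), 1)];   % one row [x_i 1] per data point
    p = A \ y;                    % least-squares solution of A*p = y
    m = p(1);  b = p(2);          % slope and intercept

The backslash operator solves the same least-squares problem, typically via a QR factorization rather than by forming A^T A explicitly.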


Question: Will least squares work in this case?
• It fails completely for vertical lines.

Question: Is least squares invariant to rotation?
• No: the same edge gives different parameters (m1, b1) and (m2, b2) under rotation.


Total least squares

Find the line ax + by + c = 0 that minimizes the sum of squared perpendicular distances from the data points (x_i, y_i).

Distance from a point (x_i, y_i) to the line ax + by + c = 0:

d = \frac{|a x_i + b y_i + c|}{\sqrt{a^2 + b^2}}

Taking the unit normal n = (a, b) with a^2 + b^2 = 1, find (a, b, c) to minimize

E = \sum_{i=1}^{n} (a x_i + b y_i + c)^2 \quad \text{s.t.} \quad a^2 + b^2 = 1

Setting the derivative with respect to c to zero:

\frac{\partial E}{\partial c} = 2 \sum_{i=1}^{n} (a x_i + b y_i + c) = 0
\;\Rightarrow\; c = -\frac{a}{n} \sum_i x_i - \frac{b}{n} \sum_i y_i = -a \bar{x} - b \bar{y}

Substituting c back:

E = \sum_{i=1}^{n} \big( a (x_i - \bar{x}) + b (y_i - \bar{y}) \big)^2 = \|A p\|^2, \quad
A = \begin{bmatrix} x_1 - \bar{x} & y_1 - \bar{y} \\ \vdots & \vdots \\ x_n - \bar{x} & y_n - \bar{y} \end{bmatrix}, \quad
p = \begin{bmatrix} a \\ b \end{bmatrix}

So the problem is to minimize p^T A^T A p subject to p^T p = 1.

Solution: p is the eigenvector of A^T A corresponding to the smallest eigenvalue.
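A minimal Matlab sketch of the total least squares fit (again assuming column vectors x and y):

    xc = x - mean(x);  yc = y - mean(y);   % center the data
    A = [xc, yc];
    [V, D] = eig(A' * A);                  % 2x2 eigen-decomposition
    [~, k] = min(diag(D));                 % index of the smallest eigenvalue
    ab = V(:, k);                          % unit normal (a, b)
    a = ab(1);  b = ab(2);
    c = -a * mean(x) - b * mean(y);        % line: a*x + b*y + c = 0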


Recap: Two Common Optimization Problems

Problem statement: least-squares solution to Ax = b, i.e. minimize \|Ax - b\|^2
Solution: x = (A^T A)^{-1} A^T b    (Matlab: x = A \ b)

Problem statement: minimize x^T A^T A x subject to x^T x = 1
Solution: [v, \lambda] = eig(A^T A); with \lambda_1 \le \dots \le \lambda_n, take x = v_1, the eigenvector of the smallest eigenvalue

LSQ & TLSQ analytic solutions

(Figure: least squares fit to the red points.)


Least square line fitting:: robustness to noise

(Figure: least squares fit with an outlier.)

Squared error heavily penalizes outliers


Least square line fitting:: conclusions

Good

• Clearly specified objective

• Optimization is easy (for least squares)

Bad

• Not appropriate for non-convex objectives: may get stuck in local minima

• Sensitive to outliers: bad matches, extra points

• Doesn't allow you to get multiple good fits: detecting multiple objects, lines, etc.


Hough transform

The purpose of the Hough transform is to address the problem of primitive detection by performing an explicit voting procedure over a set of parameterized image objects.


Hough transform:: Detecting straight lines

A line y = mx + b can be described as the set of all points lying on it: (x1, y1), (x2, y2), ...

Instead, the straight line can be represented as a single point (b, m) in parameter space.

Question: is {b, m} (intercept and slope) really the best parameter space?


Hough transform:: Line parameterization

The Hough transform is computed in polar coordinates.

The parameter ρ represents the distance between the line and the origin.

The parameter θ is the angle of the vector from the origin to the closest point on the line.


Using this parameterization, the equation of the line can be written as

y = -\frac{\cos\theta}{\sin\theta} x + \frac{\rho}{\sin\theta}

which can be rearranged to

\rho = x \cos\theta + y \sin\theta


Hough transform:: Example

Question: Is there a straight line connecting the 3 dots?


Hough Solution:

1. For each data point, a number of lines are plotted through it, each at a different angle θ (shown as solid lines).
2. For each solid line from step 1, calculate ρ, the perpendicular distance to the origin (shown as dashed lines).
3. For each data point, organize the resulting (θ, ρ) values in a table.

(Table: ρ values obtained for point 1, point 2 and point 3 at each angle θ.)


Hough Solution (continued):

4. Plot the (θ, ρ) curve obtained for each data point in Hough space.
5. The intersection of all curves indicates a line passing through all data points (remember: each point in Hough space represents a line in the original image space).
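To make the voting procedure concrete, here is a rough Matlab sketch of the accumulator construction (the edge-point coordinates xs, ys and the 1-pixel / 1-degree bin sizes are assumptions for illustration):

    thetas = -90:1:89;                          % angle bins in degrees
    rho_max = ceil(hypot(max(xs), max(ys)));
    rhos = -rho_max:1:rho_max;                  % distance bins
    H = zeros(numel(rhos), numel(thetas));      % accumulator array
    for i = 1:numel(xs)
        for j = 1:numel(thetas)
            rho = xs(i) * cosd(thetas(j)) + ys(i) * sind(thetas(j));
            k = round(rho) + rho_max + 1;       % index of the nearest rho bin
            H(k, j) = H(k, j) + 1;              % cast a vote
        end
    end
    [~, idx] = max(H(:));                       % strongest line = largest vote count
    [ki, kj] = ind2sub(size(H), idx);
    best_rho = rhos(ki);  best_theta = thetas(kj);

The Image Processing Toolbox functions hough, houghpeaks and houghlines implement an optimized version of the same idea.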

(Figure: edge features and the corresponding votes in the accumulator array for this example.)


Hough transform:: other shapes


Hough transform:: effect of noise

(Figure: edge features and Hough-space votes in the presence of noise.)

Hough transform:: random points

(Figure: random edge features and the corresponding Hough-space votes.)


Random points:: Dealing with noise

Grid resolution tradeoff:
• Too coarse: large vote counts are obtained when too many different lines correspond to a single bucket.
• Too fine: lines are missed because points that are not exactly collinear cast votes for different buckets.

Increment neighboring bins: smoothing in the accumulator array (see the sketch after this list).

Try to get rid of irrelevant features: take only edge points with significant gradient magnitude.
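One possible way to realize the last two remedies, as a rough Matlab sketch (H is a hypothetical accumulator array from a previous voting step, img the input image, and the 0.5 threshold is an arbitrary example value):

    Hs = conv2(H, ones(3) / 9, 'same');    % spread each vote over the neighboring bins
    [gx, gy] = gradient(double(img));      % image gradient
    gmag = hypot(gx, gy);                  % gradient magnitude
    strong = gmag > 0.5 * max(gmag(:));    % keep only edge points with significant gradient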


Hough transform:: conclusions

Good:
1. Robust to outliers: each point votes separately.
2. Fairly efficient (often faster than trying all sets of parameters).
3. Provides multiple good fits.

Bad:
1. Some sensitivity to noise.
2. Bin size trades off between noise tolerance, precision, and speed/memory.
3. Not suitable for more than a few parameters (grid size grows exponentially).

RANSAC

RANSAC (RANdom SAmple Consensus) is a learning technique to estimate the parameters of a model by random sampling of the observed data.


RANSAC:: algorithm

Algorithm:
1. Sample (randomly) the number of points required to fit the model.
2. Solve for the model parameters using the sampled points.
3. Score the model by the fraction of inliers within a preset threshold of it.

Repeat steps 1-3 until the best model is found with high confidence.
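A minimal Matlab sketch of the algorithm for line fitting (x and y are assumed column vectors; the iteration count N and inlier threshold t are example values, and vertical distance is used as a simplified inlier test):

    N = 100;  t = 2;                       % example parameter values
    n = numel(x);
    best_in = 0;  best_p = [0; 0];
    for it = 1:N
        idx = randperm(n, 2);              % 1. sample the 2 points a line needs
        A = [x(idx), ones(2, 1)];
        p = A \ y(idx);                    % 2. solve for (m, b) from the sample
        d = abs(y - (p(1) * x + p(2)));    % distance of every point to the model
        num_in = sum(d < t);               % 3. score by the number of inliers
        if num_in > best_in                % keep the best model seen so far
            best_in = num_in;  best_p = p;
        end
    end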


RANSAC:: line fitting example

(Figures: the three steps applied repeatedly to a 2-D point set; one sampled model has 6 inliers, a later one has 14 inliers.)


RANSAC:: parameter selection

s: initial number of points (typically the minimum number needed to fit the model)
N: number of RANSAC iterations
p: probability of choosing at least one all-inlier sample in some iteration
w: probability that a chosen point is an inlier
w^s: probability that all s sampled points are inliers
1 - w^s: probability that at least one of the s points is an outlier
(1 - w^s)^N: probability that the algorithm never selects a set of s inliers

Requiring (1 - w^s)^N = 1 - p gives

N = \frac{\log(1 - p)}{\log(1 - w^s)}
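For example, a short Matlab sketch of this computation (the values of p, w and s below are illustrative assumptions, not taken from the slides):

    p = 0.99;                              % desired confidence
    w = 0.5;                               % assumed inlier ratio
    s = 2;                                 % sample size for a line
    N = ceil(log(1 - p) / log(1 - w^s));   % evaluates to 17 iterations for these values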


RANSAC:: conclusions

Good:
• Robust to outliers.
• Applicable to a larger number of parameters than the Hough transform.
• Parameters are easier to choose than for the Hough transform.

Bad:
• Computational time grows quickly with the fraction of outliers and the number of parameters.
• Not good for getting multiple fits.
