Digital Image Processing Lecture 24: Object Recognition June 13, 2005 Prof. Charlene Tsai *From Gonzalez Chapter 12


TRANSCRIPT

Page 1: Digital Image Processing Lecture 24: Object Recognition
June 13, 2005
Prof. Charlene Tsai
*From Gonzalez Chapter 12

Page 2: Terminology

A pattern: an arrangement of descriptors (those discussed in the previous two lectures), commonly written as a pattern vector $\mathbf{x} = (x_1, x_2, \ldots, x_n)^T$.

A feature: another name for a descriptor in pattern recognition.

A pattern class $\omega_j$: a family of patterns that share some common properties.

Page 3: Example

(Figure: scatter plot of pattern vectors $\mathbf{x} = (x_1, x_2)^T$, with petal length and petal width as the two features, for three iris classes $\omega_1$, $\omega_2$, $\omega_3$.)

Is the feature selection good enough?

Page 4: Decision-Theoretic Methods

Assuming $W$ pattern classes $\omega_1, \omega_2, \ldots, \omega_W$, we want to find $W$ decision functions $d_1(\mathbf{x}), d_2(\mathbf{x}), \ldots, d_W(\mathbf{x})$ with the property that if pattern $\mathbf{x}$ belongs to class $\omega_i$, then

$$d_i(\mathbf{x}) > d_j(\mathbf{x}), \qquad j = 1, 2, \ldots, W;\ j \neq i$$

The decision boundary separating classes $\omega_i$ and $\omega_j$ is the set of $\mathbf{x}$ for which

$$d_{ij}(\mathbf{x}) = d_i(\mathbf{x}) - d_j(\mathbf{x}) = 0$$
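
As a quick sketch of this decision rule in code (not from the lecture; the function name `classify` and the two toy decision functions are made up for illustration), the following Python snippet assigns a pattern to the class whose decision function gives the largest value:

```python
import numpy as np

def classify(x, decision_functions):
    """Assign pattern x to the class whose decision function d_j(x) is largest.

    decision_functions: one callable d_j per class.
    Returns the 0-based index i such that d_i(x) >= d_j(x) for all j.
    """
    scores = [d(x) for d in decision_functions]
    return int(np.argmax(scores))

# Two toy linear decision functions (coefficients are illustrative only).
d1 = lambda x: float(np.dot(x, [1.0, 0.5]) - 2.0)
d2 = lambda x: float(np.dot(x, [-0.5, 1.0]) + 1.0)

x = np.array([3.0, 1.0])
print(classify(x, [d1, d2]))  # prints 0, since d1(x) = 1.5 > d2(x) = 0.5
```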

Page 5: Common Approaches

Matching
  - Minimum distance classifier
  - Matching by correlation (skip)
Optimum statistical classifiers
  - Bayes classifier for Gaussian pattern classes
Neural networks

Page 6: Matching - Minimum Distance Classifier

Techniques based on matching represent each class by a prototype pattern vector. An unknown pattern is assigned to the class whose prototype is closest to it in terms of a predefined metric. For the minimum distance classifier (MDC), the metric is the Euclidean distance.

Page 7: MDC

The prototype of each pattern class is the mean vector of that class:

$$\mathbf{m}_j = \frac{1}{N_j} \sum_{\mathbf{x} \in \omega_j} \mathbf{x}, \qquad j = 1, 2, \ldots, W$$

where $N_j$ is the number of pattern vectors in class $\omega_j$.

The distance metric is the Euclidean distance:

$$D_j(\mathbf{x}) = \|\mathbf{x} - \mathbf{m}_j\|, \qquad j = 1, 2, \ldots, W$$

where $\|\cdot\|$ is the Euclidean norm.
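
As a concrete sketch of these two formulas (the helper names `class_means` and `mdc_distances` are illustrative, not part of the lecture), the following NumPy snippet computes the class prototypes from labeled training patterns and then the Euclidean distances $D_j(\mathbf{x})$ for a new pattern:

```python
import numpy as np

def class_means(X, labels):
    """Prototype m_j = mean of the training patterns belonging to class j.

    X: (N, n) array of pattern vectors; labels: (N,) array of class labels 0..W-1.
    Returns a (W, n) array whose j-th row is m_j.
    """
    return np.vstack([X[labels == c].mean(axis=0) for c in np.unique(labels)])

def mdc_distances(x, means):
    """Euclidean distances D_j(x) = ||x - m_j|| to every class prototype."""
    return np.linalg.norm(means - x, axis=1)

# Toy training data chosen so the means come out near the slide values.
X = np.array([[4.0, 1.2], [4.6, 1.4], [1.4, 0.2], [1.6, 0.4]])
labels = np.array([0, 0, 1, 1])
means = class_means(X, labels)                     # approx. [[4.3, 1.3], [1.5, 0.3]]
print(mdc_distances(np.array([3.0, 1.0]), means))  # D_1(x) and D_2(x)
```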

Page 8: MDC

Assign $\mathbf{x}$ to class $\omega_j$ if $D_j(\mathbf{x})$ is the smallest. Selecting the smallest $D_j(\mathbf{x})$ is equivalent to selecting the largest $d_j(\mathbf{x})$, the decision function:

$$d_j(\mathbf{x}) = \mathbf{x}^T \mathbf{m}_j - \frac{1}{2}\,\mathbf{m}_j^T \mathbf{m}_j, \qquad j = 1, 2, \ldots, W$$

The decision boundary between classes $\omega_i$ and $\omega_j$ becomes:

$$d_{ij}(\mathbf{x}) = d_i(\mathbf{x}) - d_j(\mathbf{x}) = \mathbf{x}^T(\mathbf{m}_i - \mathbf{m}_j) - \frac{1}{2}(\mathbf{m}_i - \mathbf{m}_j)^T(\mathbf{m}_i + \mathbf{m}_j) = 0$$
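
The equivalent linear decision functions are equally easy to sketch in code. This minimal Python example (again with illustrative helper names) evaluates $d_j(\mathbf{x})$ for every class and picks the largest, using the two class means that appear on the next slide:

```python
import numpy as np

def mdc_decision_functions(x, means):
    """d_j(x) = x^T m_j - 0.5 * m_j^T m_j, evaluated for every class prototype."""
    return means @ x - 0.5 * np.sum(means * means, axis=1)

def mdc_classify(x, means):
    """Assign x to the class with the largest decision function value."""
    return int(np.argmax(mdc_decision_functions(x, means)))

# Class means from the slides: m1 = (4.3, 1.3)^T, m2 = (1.5, 0.3)^T.
means = np.array([[4.3, 1.3], [1.5, 0.3]])
x = np.array([3.0, 1.0])
print(mdc_decision_functions(x, means))  # approx. [4.11, 3.63]
print(mdc_classify(x, means))            # 0, i.e. class 1
```

This gives the same assignment as comparing the distances $D_j(\mathbf{x})$ directly, but each $d_j(\mathbf{x})$ is a simple linear function of $\mathbf{x}$.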

Page 9: MDC - Decision Boundary

The decision boundary is the perpendicular bisector of the line segment joining $\mathbf{m}_i$ and $\mathbf{m}_j$. In 2D the bisector is a line; in 3D it is a plane.

(Figure: the boundary between two classes with means m1 = (4.3, 1.3)^T and m2 = (1.5, 0.3)^T.)
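
Plugging these two means into the boundary formula from the previous page gives the explicit boundary line (the arithmetic below is worked out here, not shown on the slide):

$$\mathbf{m}_1 - \mathbf{m}_2 = (2.8,\ 1.0)^T, \qquad \mathbf{m}_1 + \mathbf{m}_2 = (5.8,\ 1.6)^T$$

$$d_{12}(\mathbf{x}) = \mathbf{x}^T(\mathbf{m}_1 - \mathbf{m}_2) - \tfrac{1}{2}(\mathbf{m}_1 - \mathbf{m}_2)^T(\mathbf{m}_1 + \mathbf{m}_2) = 2.8\,x_1 + 1.0\,x_2 - 8.92 = 0$$

Patterns with $d_{12}(\mathbf{x}) > 0$ fall on the $\omega_1$ side of the boundary, and patterns with $d_{12}(\mathbf{x}) < 0$ on the $\omega_2$ side.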

Page 10: Comments

The MDC is the simplest matching method: a class is described entirely by its mean vector. It works well when
  - the class means are widely separated, and
  - the spread within each class is relatively small.
Unfortunately, we don't often encounter this scenario in practice.

Page 11: Quiz

Q1: Compute the decision functions of a minimum distance classifier for the patterns shown on the next page.

Q2: Compute and sketch the decision surfaces implemented by the decision functions in Q1.

Page 12:

(Figure for the quiz: three class means, m1 = (4.3, 1.3)^T, m2 = (1.5, 0.3)^T, and m3 = (5.5, 2.1)^T.)