Face Recognition Jeremy Wyatt

Post on 21-Dec-2015


Page 1: Face Recognition Jeremy Wyatt. Plan Eigenfaces: the idea Eigenvectors and Eigenvalues Co-variance Learning Eigenfaces from training sets of faces Recognition

Face Recognition

Jeremy Wyatt

Page 2

Plan

• Eigenfaces: the idea

• Eigenvectors and Eigenvalues

• Co-variance

• Learning Eigenfaces from training sets of faces

• Recognition and reconstruction

Page 3

Eigenfaces: the idea

• Think of a face as being a weighted combination of some “component” or “basis” faces

• These basis faces are called eigenfaces

[Figure: an example face shown with its six eigenface weights: -8029, 2900, 1751, 1445, 4238, 6193]

Page 4

Eigenfaces: representing faces

• These basis faces can be differently weighted to represent any face

• So we can use different vectors of weights to represent different faces

[Figure: two faces with their weight vectors: (-8029, 2900, 1751, 1445, 4238, 6193) and (-1183, -2088, -4336, -669, -4221, 10549)]

Page 5

Learning Eigenfaces

Q: How do we pick the set of basis faces?

A: We take a set of real training faces

Then we find (learn) a set of basis faces which best represent the differences between them

We’ll use a statistical criterion for measuring this notion of “best representation of the differences between the training faces”

We can then store each face as a set of weights for those basis faces

Page 6

Using Eigenfaces: recognition & reconstruction

• We can use the eigenfaces in two ways

• 1: we can store and then reconstruct a face from a set of weights

• 2: we can recognise a new picture of a familiar face

Page 7

Learning Eigenfaces

• How do we learn them?

• We use a method called Principal Components Analysis (PCA)

• To understand this we will need to understand

– What an eigenvector is

– What covariance is

• But first we will look at what is happening in PCA qualitatively

Page 8

Subspaces

• Imagine that our face is simply a (high dimensional) vector of pixels

• We can think more easily about 2d vectors

• Here we have data in two dimensions

• But we only really need one dimension to represent it

Page 9

Finding Subspaces

• Suppose we take a line through the space

• And then take the projection of each point onto that line

• This could represent our data in “one” dimension
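The projection trick above can be sketched in a few lines of numpy; the points and the line direction here are made up for illustration:

```python
import numpy as np

# A few 2-D data points (made-up for illustration)
points = np.array([[2.0, 1.9], [1.0, 1.1], [3.0, 2.8], [0.5, 0.6]])

# A unit vector giving the direction of a line through the origin
u = np.array([1.0, 1.0]) / np.sqrt(2.0)

# Projecting each point onto the line gives one coordinate per point:
# the "one-dimensional" representation of the 2-D data
coords = points @ u
print(coords.shape)               # (4,)

# The projected points back in 2-D (the foot of each perpendicular)
projected = np.outer(coords, u)
```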

Page 10

Finding Subspaces

• Some lines will represent the data in this way well, some badly

• This is because the projection onto some lines spreads the data out (preserves its variation) well, and the projection onto other lines does so badly

Page 11

Finding Subspaces

• Rather than a line we can perform roughly the same trick with a vector

• Now we have to scale the vector to obtain any point on the line

[Figure: a vector scaled by different amounts to reach different points along the line]

Page 12

Eigenvectors

• An eigenvector is a vector v that obeys the following rule:

A v = λ v

where A is a matrix and λ is a scalar (called the eigenvalue)

e.g. one eigenvector of A = [2 3; 2 1] is v = (3, 2), since

[2 3; 2 1] (3, 2)ᵀ = (12, 8)ᵀ = 4 (3, 2)ᵀ

so for this eigenvector of this matrix the eigenvalue is λ = 4
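The worked example can be checked numerically; this sketch uses numpy to verify the eigenvector and to recover both eigenvalues of the slide's matrix:

```python
import numpy as np

# The matrix and claimed eigenvector from the slide
A = np.array([[2.0, 3.0],
              [2.0, 1.0]])
v = np.array([3.0, 2.0])

# A v = (12, 8) = 4 * v, so v is an eigenvector with eigenvalue 4
print(A @ v)                      # [12.  8.]

# numpy recovers every eigenvalue directly; for this A they are 4 and -1
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))              # [-1.  4.]
```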

Page 13

Eigenvectors

• We can think of matrices as performing transformations on vectors (e.g. rotations, reflections)

• We can think of the eigenvectors of a matrix as being special vectors (for that matrix) that are only scaled, not rotated, by that matrix

• Different matrices have different eigenvectors

• Only square matrices have eigenvectors

• Not all square matrices have (real) eigenvectors, e.g. a 2d rotation matrix

• An n by n matrix has at most n distinct eigenvalues, and so at most n linearly independent eigenvectors

• For a symmetric matrix (such as a covariance matrix), eigenvectors with distinct eigenvalues are orthogonal (i.e. perpendicular)

Page 14

Covariance

• Which single vector can be used to spread these points out as much as possible?

• This vector turns out to be a vector expressing the direction of the correlation

• Here I have two variables, x1 and x2

• They co-vary (x2 tends to change in roughly the same direction as x1)

Page 15

Covariance

• The covariances can be expressed as a matrix, e.g.

C = [ .617  .615
      .615  .717 ]

• The diagonal elements are the variances, e.g. Var(x1)

• The covariance of two variables is:

cov(x1, x2) = (1/(n−1)) Σᵢ₌₁ⁿ (x1,i − x̄1)(x2,i − x̄2)
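As a sketch of the formula, here is the covariance computed on example data (not from the slides, but chosen so the resulting matrix comes out close to the C shown above):

```python
import numpy as np

# Ten paired measurements of x1 and x2 (illustrative data only)
x1 = np.array([2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1])
x2 = np.array([2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9])

n = len(x1)
# cov(x1, x2) = 1/(n-1) * sum_i (x1_i - mean(x1)) * (x2_i - mean(x2))
cov_12 = np.sum((x1 - x1.mean()) * (x2 - x2.mean())) / (n - 1)
print(round(cov_12, 3))           # 0.615

# np.cov builds the full matrix: variances on the diagonal, cov off it
C = np.cov(np.vstack([x1, x2]))
print(np.round(C, 3))             # [[0.617 0.615]
                                  #  [0.615 0.717]]
```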

Page 16

Eigenvectors of the covariance matrix

• The covariance matrix has eigenvectors

covariance matrix:  C = [ .617  .615
                          .615  .717 ]

eigenvectors:  v1 = (−.735, .678),  v2 = (.678, .735)

eigenvalues:  λ1 = 0.049,  λ2 = 1.284

• Eigenvectors with larger eigenvalues correspond to directions in which the data varies more

• Finding the eigenvectors and eigenvalues of the covariance matrix for a set of data is termed principal components analysis
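The numbers above can be checked with numpy's eigh, which is intended for symmetric matrices such as a covariance matrix:

```python
import numpy as np

# The covariance matrix from the slide
C = np.array([[0.617, 0.615],
              [0.615, 0.717]])

# eigh returns eigenvalues in ascending order, eigenvectors as columns
eigenvalues, eigenvectors = np.linalg.eigh(C)
print(np.round(eigenvalues, 3))           # [0.05  1.284]

# The eigenvector paired with the larger eigenvalue points along the
# direction of greatest variance (its overall sign is arbitrary)
principal = eigenvectors[:, np.argmax(eigenvalues)]
print(np.round(np.abs(principal), 3))     # [0.678 0.735]
```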

Page 17

Expressing points using eigenvectors

• Suppose you think of your eigenvectors as specifying a new vector space

• i.e. I can reference any point in terms of those eigenvectors

• A point’s position in this new coordinate system is what we earlier referred to as its “weight vector”

• For many data sets you can cope with fewer dimensions in the new space than in the old space

Page 18

Eigenfaces

• All we are doing in the face case is treating the face as a point in a high-dimensional space, and then treating the training set of face pictures as our set of points

• To train:

– We calculate the covariance matrix of the faces

– We then find the eigenvectors of that covariance matrix

• These eigenvectors are the eigenfaces or basis faces

• Eigenfaces with bigger eigenvalues will explain more of the variation in the set of faces, i.e. will be more distinguishing
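The two training steps can be sketched as below; the 16x16 random "images" are stand-ins for a real training set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in training set: 20 "face images" of 16x16 pixels, flattened to rows
n_faces, h, w = 20, 16, 16
faces = rng.random((n_faces, h * w))

# Centre the data by subtracting the mean face
mean_face = faces.mean(axis=0)
A = faces - mean_face

# Step 1: covariance matrix of the pixel values across the training faces
C = A.T @ A / (n_faces - 1)

# Step 2: eigenvectors of that covariance matrix
eigenvalues, eigenvectors = np.linalg.eigh(C)

# The eigenfaces: the eigenvectors with the largest eigenvalues
k = 5
order = np.argsort(eigenvalues)[::-1]
eigenfaces = eigenvectors[:, order[:k]].T
print(eigenfaces.shape)           # (5, 256)
```

With real images the pixel dimension is far larger than the number of faces, so Turk and Pentland instead compute eigenvectors of the much smaller n_faces x n_faces matrix A Aᵀ and map them back, rather than decomposing the full pixel covariance.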

Page 19

Eigenfaces: image space to face space

• When we see an image of a face we can transform it to face space

• There are k = 1…n eigenfaces v_k

• The ith face in image space is a vector x_i

• The corresponding weight is w_k = v_k · x_i

• We calculate the corresponding weight for every eigenface

Page 20

Recognition in face space

• Recognition is now simple. We find the Euclidean distance d between our face and all the other stored faces in face space:

d(w¹, w²) = sqrt( Σᵢ₌₁ⁿ (wᵢ¹ − wᵢ²)² )

• The closest face in face space is the chosen match
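This nearest-neighbour matching can be sketched as below; the stored weight vectors and names are made up for illustration:

```python
import numpy as np

# Stored weight vectors for known faces (one row each), with made-up labels
stored = np.array([[ 1.0, 0.5, -2.0],
                   [ 0.2, 1.1,  0.7],
                   [-1.5, 0.3,  0.9]])
labels = ["alice", "bob", "carol"]        # hypothetical identities
w_new = np.array([0.3, 1.0, 0.8])         # weights of the new face

# Euclidean distance in face space to each stored face
d = np.sqrt(((stored - w_new) ** 2).sum(axis=1))

# The closest stored face is the chosen match
match = labels[int(np.argmin(d))]
print(match)                              # bob
```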

Page 21

Reconstruction

• The more eigenfaces you have the better the reconstruction, but you can have high quality reconstruction even with a small number of eigenfaces

[Figure: reconstructions of a face using 82, 70, 50, 30, 20 and 10 eigenfaces]
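A sketch of that trade-off, again using a random orthonormal basis as a stand-in for eigenfaces (a real system would also add the mean face back into the reconstruction):

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-ins: a full orthonormal basis of 64 "eigenfaces" for 64-pixel images
n_pixels = 64
eigenfaces = np.linalg.qr(rng.standard_normal((n_pixels, n_pixels)))[0].T
x = rng.random(n_pixels)

w = eigenfaces @ x                        # a weight for every eigenface

def reconstruct(w, k):
    """Rebuild the image from only the first k weights."""
    return w[:k] @ eigenfaces[:k]

# More eigenfaces -> lower reconstruction error; all 64 is (near) exact
err_10 = np.linalg.norm(x - reconstruct(w, 10))
err_64 = np.linalg.norm(x - reconstruct(w, 64))
print(err_10 > err_64)                    # True
```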

Page 22

Summary

• Statistical approach to visual recognition

• Also used for object recognition

• Problems (e.g. sensitivity to lighting, pose and alignment)

• Reference: M. Turk and A. Pentland (1991). Eigenfaces for recognition, Journal of Cognitive Neuroscience, 3(1): 71–86.