
Chapter 7 Supervised Hebbian Learning

Outline
- Linear Associator
- The Hebb Rule
- Pseudoinverse Rule
- Application

Linear Associator

$$a = Wp, \qquad a_i = \sum_{j=1}^{R} w_{ij}\, p_j$$
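As a minimal sketch of the linear associator, the NumPy snippet below computes $a = Wp$. The dimensions ($R = 3$ inputs, $S = 2$ outputs) and the weight and input values are made-up illustrations; only the matrix-vector product itself comes from the slides.

```python
import numpy as np

# Linear associator: a = W p  (R inputs, S outputs).
# Dimensions and values here are illustrative assumptions.
W = np.array([[0.5, -0.25, 0.25],
              [0.5,  0.25, -0.25]])   # S x R weight matrix
p = np.array([1.0, -1.0, 1.0])        # input pattern of length R

a = W @ p                             # a_i = sum_j w_ij * p_j
print(a)
```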

Hebb's Postulate

"When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."

D. O. Hebb, 1949

[Figure: presynaptic cell A driving postsynaptic cell B]

Hebb Rule (1/2)

In supervised form, the update for training pair $(p_q, t_q)$ uses the target in place of the actual output:
$$w_{ij}^{new} = w_{ij}^{old} + t_{iq}\, p_{jq}$$

Hebb Rule (2/2)

In matrix form:
$$W^{new} = W^{old} + t_q p_q^T$$

Batch Operation

Starting from $W = 0$ and presenting all $Q$ training pairs:
$$W = \sum_{q=1}^{Q} t_q p_q^T = TP^T, \qquad T = \begin{bmatrix} t_1 & t_2 & \cdots & t_Q \end{bmatrix}, \quad P = \begin{bmatrix} p_1 & p_2 & \cdots & p_Q \end{bmatrix}$$

Performance Analysis (1/2)

If the input patterns are orthonormal ($p_q^T p_k = 1$ when $q = k$, and $0$ otherwise), then
$$a = Wp_k = \left( \sum_{q=1}^{Q} t_q p_q^T \right) p_k = t_k,$$
so every stored pattern is recalled exactly.

Performance Analysis (2/2)

If the input patterns have unit length but are not orthogonal, then
$$a = Wp_k = t_k + \sum_{q \ne k} t_q \left( p_q^T p_k \right),$$
and the second term is an error produced by the correlation between prototype patterns.
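The sketch below illustrates the batch Hebb rule and the orthonormal case of the performance analysis. The prototype patterns (standard basis vectors) and the targets are illustrative choices, not values from the slides.

```python
import numpy as np

# Batch Hebb rule: W = T P^T, where the columns of P are the prototype
# inputs p_q and the columns of T are the targets t_q.
# These particular vectors are assumptions for illustration (orthonormal).
P = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])            # p_1, p_2 as columns (orthonormal)
T = np.array([[1.0, -1.0]])           # targets t_1 = 1, t_2 = -1

W = T @ P.T                           # Hebb rule in batch form

# Orthonormal inputs: W p_k reproduces t_k exactly.
for k in range(P.shape[1]):
    print(W @ P[:, k], T[:, k])
```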

Example (orthonormal)

Example (not orthogonal)

Pseudoinverse Rule (1/3)

When exact recall is required, choose $W$ so that $Wp_q = t_q$ for every pair, i.e. $WP = T$.

Pseudoinverse Rule (2/3)

A weight matrix that minimizes the error $\|T - WP\|^2$ (and solves $WP = T$ exactly when a solution exists) is given by
$$W = TP^{+}$$

Pseudoinverse Rule (3/3)

$P^{+}$ is the Moore-Penrose pseudoinverse. The pseudoinverse of a real matrix $P$ is the unique matrix that satisfies
$$PP^{+}P = P, \quad P^{+}PP^{+} = P^{+}, \quad (PP^{+})^T = PP^{+}, \quad (P^{+}P)^T = P^{+}P.$$
When $P$ has more rows than columns and its columns are linearly independent,
$$P^{+} = (P^{T}P)^{-1}P^{T}.$$

Relationship to the Hebb Rule

The Hebb rule gives $W = TP^{T}$, while the pseudoinverse rule gives $W = TP^{+}$. When the prototype patterns are orthonormal, $P^{+} = (P^{T}P)^{-1}P^{T} = P^{T}$, so the two rules produce the same weight matrix (see the check below).
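A quick numerical check of this relationship, using an illustrative orthonormal pattern matrix that is not taken from the slides:

```python
import numpy as np

# For orthonormal prototype columns, pinv(P) equals P^T, so the
# pseudoinverse rule W = T P^+ reduces to the Hebb rule W = T P^T.
P = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])            # orthonormal columns (assumed example)

print(np.allclose(np.linalg.pinv(P), P.T))   # True
```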

Example

Training set:
$$\left\{ p_1 = \begin{bmatrix} -1 \\ 1 \\ -1 \end{bmatrix},\; t_1 = -1 \right\}, \qquad \left\{ p_2 = \begin{bmatrix} 1 \\ 1 \\ -1 \end{bmatrix},\; t_2 = 1 \right\}$$

$$W = TP^{+} = \begin{bmatrix} -1 & 1 \end{bmatrix} \begin{bmatrix} -1 & 1 \\ 1 & 1 \\ -1 & -1 \end{bmatrix}^{+}$$

$$P^{+} = (P^{T}P)^{-1}P^{T} = \begin{bmatrix} 3 & 1 \\ 1 & 3 \end{bmatrix}^{-1} \begin{bmatrix} -1 & 1 & -1 \\ 1 & 1 & -1 \end{bmatrix} = \begin{bmatrix} -0.5 & 0.25 & -0.25 \\ 0.5 & 0.25 & -0.25 \end{bmatrix}$$

$$W = TP^{+} = \begin{bmatrix} -1 & 1 \end{bmatrix} \begin{bmatrix} -0.5 & 0.25 & -0.25 \\ 0.5 & 0.25 & -0.25 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \end{bmatrix}$$

$$Wp_1 = \begin{bmatrix} 1 & 0 & 0 \end{bmatrix} \begin{bmatrix} -1 \\ 1 \\ -1 \end{bmatrix} = -1 = t_1, \qquad Wp_2 = \begin{bmatrix} 1 & 0 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \\ -1 \end{bmatrix} = 1 = t_2$$
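The computation above can be reproduced numerically; the snippet below uses the same $P$ and $T$ as the example and assumes only NumPy as the tool.

```python
import numpy as np

# Numerical check of the pseudoinverse-rule example above.
P = np.array([[-1.0, 1.0],
              [ 1.0, 1.0],
              [-1.0, -1.0]])          # p_1, p_2 as columns
T = np.array([[-1.0, 1.0]])           # t_1 = -1, t_2 = 1

P_plus = np.linalg.inv(P.T @ P) @ P.T # (P^T P)^-1 P^T
W = T @ P_plus                        # pseudoinverse rule: W = T P^+

print(P_plus)                         # [[-0.5, 0.25, -0.25], [0.5, 0.25, -0.25]]
print(W)                              # [[1, 0, 0]]
print(W @ P)                          # recovers the targets [-1, 1]
```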

Autoassociative Memory

In an autoassociative memory the desired output equals the input ($t_q = p_q$), so the Hebb rule gives $W = PP^{T}$ and the network reconstructs stored patterns from degraded inputs.

Tests

[Figures: recall of stored patterns from inputs that are 50% occluded, 67% occluded, and corrupted by noise (7 pixels changed)]
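A minimal sketch of the occlusion test: a single made-up ±1 pattern is stored with the Hebb rule, half of it is blanked out, and a sign nonlinearity (standing in for a symmetric hard-limit output) snaps the response back to ±1. The pattern, its length, and the amount of occlusion are illustrative, not the figures from the slides.

```python
import numpy as np

# Autoassociative memory sketch: targets equal the inputs (t_q = p_q),
# so the Hebb rule gives W = P P^T.  The stored pattern is an assumed
# +/-1 vector; the probe has half of its elements set to 0 (occluded).
p = np.array([1, -1, 1, 1, -1, 1, -1, 1], dtype=float)
W = np.outer(p, p)                    # W = p p^T (single stored pattern)

probe = p.copy()
probe[:4] = 0.0                       # occlude 50% of the pattern

recalled = np.sign(W @ probe)         # hard-limit-style recall
print(np.array_equal(recalled, p))    # True: the pattern is recovered
```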

Variations of Hebbian Learning

Basic Rule:
$$W^{new} = W^{old} + t_q p_q^T$$

Learning Rate:
$$W^{new} = W^{old} + \alpha\, t_q p_q^T$$

Smoothing:
$$W^{new} = W^{old} + \alpha\, t_q p_q^T - \gamma\, W^{old} = (1 - \gamma)\, W^{old} + \alpha\, t_q p_q^T$$

Delta Rule:
$$W^{new} = W^{old} + \alpha\, (t_q - a_q)\, p_q^T$$

Unsupervised:
$$W^{new} = W^{old} + \alpha\, a_q p_q^T$$
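As an illustration of the delta rule variation, the sketch below trains $W$ iteratively on the patterns from the pseudoinverse example; the learning rate $\alpha = 0.1$ and the number of passes are arbitrary choices for the demonstration.

```python
import numpy as np

# Delta rule: W_new = W_old + alpha * (t_q - a_q) * p_q^T.
# Patterns and targets reuse the pseudoinverse example; alpha and the
# number of passes are assumptions for illustration.
P = np.array([[-1.0, 1.0], [1.0, 1.0], [-1.0, -1.0]])
T = np.array([[-1.0, 1.0]])
alpha = 0.1

W = np.zeros((1, 3))
for _ in range(100):                  # repeated passes over the data
    for q in range(P.shape[1]):
        p_q = P[:, [q]]               # column vector (3 x 1)
        t_q = T[:, [q]]               # target        (1 x 1)
        a_q = W @ p_q                 # current response
        W = W + alpha * (t_q - a_q) @ p_q.T

print(W @ P)                          # approaches the targets [-1, 1]
```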

Solved Problems