Introduction of Multilayer Perceptron in Python


Ko, Youngjoong

Dept. of Computer Engineering,

Dong-A University

Introduction of Multilayer Perceptron in Python

1. Non-linear Classification

2. XOR problem

3. Architecture of Multilayer Perceptron (MLP)

4. Forward Computation

5. Activation Functions

6. Learning in MLP with Back-propagation Algorithm

7. Python Code and Practice

2

Contents

Many cases are impossible to classify linearly

Linear model: perceptron

Non-linear models: decision trees, nearest-neighbor models

Goal: derive a non-linear learning model starting from the perceptron

3

Non-linear Classification

Limitation of perceptron performance on the XOR problem

At most 75% accuracy: a linear separator can classify at most three of the four XOR points correctly (demonstrated in the sketch below)

This limitation can be overcome by using two perceptrons

4

XOR Problem
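A quick demonstration of this limit (a sketch, not from the slides): the perceptron update rule never fits all four XOR points, because no linear separator can.

    import numpy as np

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, 1, 1, -1])          # XOR with +/-1 labels
    w, b = np.zeros(2), 0.0
    for _ in range(100):                   # perceptron training epochs
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:     # misclassified: update
                w += yi * xi
                b += yi
    acc = np.mean(np.sign(X @ w + b) == y)
    print(acc)                             # never 1.0; the ceiling is 0.75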

Two Steps for Solution

Map the original feature space into a new space

Classify linearly in the new space

5

XOR Problem

Example of a multilayer perceptron as a solution (a code sketch follows this slide)

6

XOR Problem
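To make the solution concrete, here is a minimal sketch (not from the slides) of a hand-wired two-layer network with step activations that computes XOR; all weight values and names are illustrative assumptions.

    import numpy as np

    def step(s):
        # Hard threshold: 1 if the net input is non-negative, else 0
        return (s >= 0).astype(int)

    # Hidden layer: unit 1 computes OR(x1, x2), unit 2 computes AND(x1, x2)
    U = np.array([[1.0, 1.0],
                  [1.0, 1.0]])
    b_u = np.array([-0.5, -1.5])   # thresholds for OR and AND

    # Output: OR minus 2*AND, i.e. "x1 or x2, but not both" = XOR
    v = np.array([1.0, -2.0])
    b_v = -0.5

    for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        h = step(U @ np.array(x) + b_u)
        y = step(v @ h + b_v)
        print(x, "->", y)          # prints 0, 1, 1, 0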

Multilayer Perceptron (MLP) in Neural Networks

Chains together a collection of perceptrons

Two layers (not three layers)

Don’t count the inputs as a real layer

Two layers of trained weights

Each edge corresponds to a different weight

Input -> hidden, hidden -> output

7

Architecture of Multilayer Perceptron

Multilayer Perceptron (MLP) in Neural Networks

Input layer, hidden layer, and output layer

Weights: u and v

8

Architecture of Multilayer Perceptron

Functions in MLP

9

Forward Computation
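A standard two-layer forward computation, written in the deck's notation (u: input-to-hidden weights, v: hidden-to-output weights) and assuming the tanh hidden activation introduced on the following slides:

    z_j = \tanh\left( \sum_i u_{ji} x_i \right), \qquad \hat{y} = \sum_j v_j z_j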

Another way to understand MLP forward propagation

10

Forward Computation
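A minimal NumPy sketch of this forward pass, consistent with the equations above (the names U and v and the layer sizes are assumptions):

    import numpy as np

    def forward(x, U, v):
        # Hidden layer: one tanh unit per row of U
        z = np.tanh(U @ x)
        # Single linear output unit (an activation can be added if needed)
        return v @ z

    rng = np.random.default_rng(0)
    U = rng.uniform(-0.1, 0.1, size=(3, 2))  # 2 inputs -> 3 hidden units
    v = rng.uniform(-0.1, 0.1, size=3)       # 3 hidden units -> 1 output
    print(forward(np.array([1.0, 0.0]), U, v))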

11

Activation Functions

Hyperbolic tangent function

Popular link function

Derivative: d/dx tanh(x) = 1 − tanh²(x)

Sigmoid functions

12

Activation Functions
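A quick numerical check of this derivative (a sketch, not from the slides):

    import numpy as np

    x = np.linspace(-3.0, 3.0, 7)
    analytic = 1.0 - np.tanh(x) ** 2                       # 1 - tanh^2(x)
    h = 1e-5
    numeric = (np.tanh(x + h) - np.tanh(x - h)) / (2 * h)  # central difference
    print(np.allclose(analytic, numeric))                  # True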

Simple Two-layer MLP

13

Activation Functions

A small two-layer perceptron that solves the XOR problem

14

Activation Functions

[Figure: the small two-layer XOR network, with edge weights of −1 and 1]

15

Learning in MLP

16

Learning in MLP


17

Learning in MLP

Back-propagation Algorithm

18

Learning in MLP

19

Learning in MLP


20

Learning in MLP


Understanding back-propagation on a simple example

A two-layer MLP with no activation function in the output layer

21

Learning in MLP

[Figure: the example network; inputs x1 and x2 feed units f1(s)–f5(s), with edges labeled by weights w1, w2, w3 and v1, v2]

Understanding back-propagation on a simple example

22

Learning in MLP

Understanding back-propagation on a simple example

23

Learning in MLP

[Figure: the backward pass on the example network; output errors e1 and e2 are propagated back through the weights v to hidden unit 3]

e_3 = v_{13} e_1 + v_{23} e_2

Understanding back-propagation on a simple example

24

Learning in MLP
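In vector form, the error propagation on the slide above is e_hidden = Vᵀ e_out; a small NumPy check (the weight values are invented for illustration):

    import numpy as np

    # V[k, j] is the weight from hidden unit j to output unit k
    V = np.array([[0.2, -0.4, 0.1],
                  [0.5,  0.3, -0.2]])
    e_out = np.array([0.7, -0.1])    # output errors e1, e2
    e_hidden = V.T @ e_out           # component j: sum_k v_{kj} * e_k
    print(e_hidden)                  # e3, e4, e5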

[Figure: the backward pass continued; the same rule gives the error at hidden unit 4]

e_4 = v_{14} e_1 + v_{24} e_2

Understanding back-propagation on a simple example

25

Learning in MLP

[Figure: the example network with an error term e1–e5 attached to each unit f1(e)–f5(e)]

Understanding back-propagation on a simple example

26

Learning in MLP

[Figure: the example network with all errors e1–e5 in place, ready for the weight-update step]

Back-propagation Algorithm

27

Learning in MLP

Simple Example in MLP Learning

28

Learning in MLP

29

Learning in MLP

Back-propagation

Line 8:

Line 9:

Line 10:

30

Learning in MLP

Back-propagation

Line 11:

Line 12:

Line 13:

31

Learning in MLP
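A minimal sketch of one full back-propagation update for a two-layer network like the running example (tanh hidden units, a linear output unit, squared-error loss); the names and the bias-via-constant-input trick are assumptions, not the deck's notation.

    import numpy as np

    def backprop_step(x, y, U, v, eta=0.2):
        # Forward pass
        z = np.tanh(U @ x)                        # hidden activations
        y_hat = v @ z                             # linear output unit
        # Backward pass for squared error L = 0.5 * (y_hat - y)^2
        e = y_hat - y                             # output error
        grad_v = e * z                            # dL/dv
        grad_U = np.outer(e * v * (1 - z**2), x)  # dL/dU via chain rule
        # Gradient-descent updates
        return U - eta * grad_U, v - eta * grad_v

    # Usage: XOR with a constant 1 appended to each input as a bias
    data = [((0, 0, 1), -1), ((0, 1, 1), 1), ((1, 0, 1), 1), ((1, 1, 1), -1)]
    rng = np.random.default_rng(0)
    U = rng.uniform(-0.1, 0.1, size=(4, 3))       # 4 hidden units
    v = rng.uniform(-0.1, 0.1, size=4)
    for _ in range(5000):
        for x, y in data:
            U, v = backprop_step(np.array(x, dtype=float), y, U, v)
    for x, _ in data:
        print(x[:2], "->", round(float(v @ np.tanh(U @ np.array(x, float))), 2))
    # Typically converges near -1, 1, 1, -1; a bad initialization can stall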

32

Learning in MLP

Typical complaints: many hyperparameters to set

# of layers

# of hidden units per layer

The gradient descent learning rate

The initialization

The stopping iteration or weight regularization

Random Initialization

Small random weights (say, uniform between -0.1 and 0.1)

By training a collection of networks, each with a different random initialization, we can often obtain better solutions

33

More Considerations
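A sketch of this initialization recipe (the function name and layer sizes are assumptions):

    import numpy as np

    def init_weights(n_hidden, n_inputs, seed):
        # Small random weights, uniform in [-0.1, 0.1], as suggested above
        rng = np.random.default_rng(seed)
        U = rng.uniform(-0.1, 0.1, size=(n_hidden, n_inputs))
        v = rng.uniform(-0.1, 0.1, size=n_hidden)
        return U, v

    # Train one network per seed and keep the best; here we just build
    # five differently-initialized starting points.
    restarts = [init_weights(3, 2, seed) for seed in range(5)]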


Initialization Tip

34

More Considerations

When is the proper number of iterations for early stopping? A common answer is to monitor error on held-out data and stop once it stops improving.

35

More Considerations
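A hedged sketch of that early-stopping rule: stop once held-out error has not improved for a fixed number of checks. The error curve here is a synthetic stand-in, not real training output.

    import numpy as np

    rng = np.random.default_rng(0)

    def heldout_error(epoch):
        # Synthetic U-shaped validation error: falls, then overfits and rises
        return (epoch - 30) ** 2 / 900.0 + rng.normal(0.0, 0.01)

    best_err, best_epoch, patience, bad = float("inf"), 0, 5, 0
    for epoch in range(100):
        err = heldout_error(epoch)
        if err < best_err:
            best_err, best_epoch, bad = err, epoch, 0
        else:
            bad += 1
            if bad >= patience:
                break    # no improvement for `patience` checks
    print("stopped at epoch", epoch, "; best was epoch", best_epoch)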

36

Python Code and Practice

You should install Python 2.7 and NumPy

Download from: http://nlpmlir.blogspot.kr/2016/02/multilayer-perceptron.html

Homework

37

• Il-Seok Oh. Pattern Recognition. Kyobo Book Centre.

• Sangkeun Jung. “Introduction to Deep Learning.” Natural Language Processing Tutorial, 2015.

• http://ciml.info/

References

Thank you for your attention!

http://web.donga.ac.kr/yjko/

Youngjoong Ko
