TRANSCRIPT
Soft Computing
Colloquium 2
Selection of a neural network,
hybrid neural networks.
Objectives
• Why so many models of neural networks (NN)?
• Classes of tasks and classes of NN
• Hybrid neural networks
• Hybrid model based on MLP and ART-2
• Paths to improving neural networks
Questions submitted for discussion
• Paths to improving neural networks:
  – development of growing neural networks with feedback and delays,
  – development of the theory of spiking neurons and the building of associative memory based on them,
  – development of a neural network in which logical (verbal) inference emerges from associative memory during learning.
Why so many models of neural networks (NN)?
Models of neural networks simulate separate aspects of the brain's work (e.g. associative memory), but how the brain works as a whole is unknown to us.
Questions:
1) What is consciousness?
2) What is the role of emotions?
3) How are different areas of the brain coordinated?
4) How are associative links transformed and used in logical inference and calculations?
Classes of tasks:
• prediction
• classification
• data association
• data conceptualization
• data filtering
• neuromathematics
Classes of Neural Networks:
• Multi-layer networks
  – Multi-Layer Perceptron (MLP)
    • Supervised learning
  – Radial Basis Function networks (RBF networks)
    • Supervised learning
  – Recurrent neural networks (Elman, Jordan)
    • Supervised learning
    • Reinforcement learning
  – Counterpropagation network
    • Supervised learning
• One-layer networks
  – Self-Organizing Map (SOM)
    • Unsupervised learning
  – Adaptive Resonance Theory (ART)
    • Unsupervised learning
  – Hamming network
    • Supervised learning
• Fully interconnected networks
  – Hopfield network
    • Supervised learning
  – Boltzmann machine
    • Supervised learning
  – Bidirectional associative memory (BAM)
    • Supervised learning
• Spiking networks
  • Supervised learning
  • Unsupervised learning
  • Reinforcement learning
Counterpropagation network
Network Selector Table
Hybrid Neural Networks
• A hybrid neural network includes (see the sketch below):
  – a main neural network,
  – other neural networks for
    • preprocessing,
    • postprocessing.
• Some models of neural networks consist of several layers working in different ways, so such neural networks may be viewed as hybrid neural networks built from more elementary networks.
• Some authors call hybrid those models which combine the paradigms of neural networks and knowledge engineering.
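As a rough illustration of this composition, here is a minimal sketch (not from the slides; all names and layer sizes are hypothetical) of a preprocessing network whose output feeds a main network:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Elementwise logistic activation.
    return 1.0 / (1.0 + np.exp(-x))

# Preprocessing network: compresses a raw 100-dimensional input to 10 features.
W_pre = rng.normal(scale=0.1, size=(10, 100))
# Main network: maps the 10 preprocessed features to 3 outputs.
W_main = rng.normal(scale=0.1, size=(3, 10))

def hybrid_forward(x):
    features = sigmoid(W_pre @ x)      # output of the preprocessing network
    return sigmoid(W_main @ features)  # the main network consumes it

print(hybrid_forward(rng.random(100)).shape)  # (3,)
```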
Hybrid neural network based on the Multi-Layer Perceptron and Adaptive Resonance Theory models (A. Gavrilov, 2005)
• Aims to keep the capabilities of ART (plasticity and stability)
• Adds to ART the MLP's capability, during learning, to derive complex secondary features from primary features (to approximate any function)
Disadvantages of the ART-2 model for image recognition
• It uses a metric over the primary features of images to recognize a class or create a new one,
• Transformations of graphic images (shift, rotation, and others) strongly affect the distance between input vectors (illustrated below),
• So it is unsuitable for the control system of a mobile robot.
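A small numeric illustration of the second point (my own example, not from the slides): shifting a checkerboard image by one pixel flips every pixel, so the Euclidean distance between the original and the shifted copy exceeds even the distance to a completely blank image.

```python
import numpy as np

# 10x10 checkerboard: pixel (i, j) is 1 where i + j is odd.
img = (np.indices((10, 10)).sum(axis=0) % 2).astype(float)
shifted = np.roll(img, 1, axis=1)  # the same pattern shifted right by one pixel

print(np.linalg.norm(img - shifted))          # 10.0: every pixel differs
print(np.linalg.norm(img - np.zeros_like(img)))  # ~7.07: distance to a blank image
```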
Architecture of the hybrid neural network
[Figure: the input vector (x1, x2, …, xn) feeds the input layer (input variables); a hidden layer of the perceptron follows; the output layer of the perceptron serves as the input layer of ART-2; the output layer (clusters) produces the output vector (y1, y2, …, ym).]
Algorithm of learning without a teacher
• Set the initial weights of the neurons; Nout := 0;
• Input an example image and calculate the outputs of the perceptron;
• If Nout = 0, form a new cluster (output neuron);
• If Nout > 0, calculate the distances between the weight vectors of ART-2 and the output vector of the perceptron, select the minimum of them (selecting the winning output neuron), and decide whether or not to create a new cluster;
• If a new cluster is not created, calculate new values for the weights of the winning output neuron and recalculate the weights of the perceptron with the error back-propagation algorithm (see the sketch below).
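A minimal NumPy sketch of this loop, under simplifying assumptions: Euclidean distance in the ART-2 layer, the rational sigmoid from the parameters slide, and a hypothetical training target that pulls the perceptron output toward the winning centroid. The original model's details may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
a, lr = 1.0, 1.0                        # sigmoid parameter and learning step

def f(x):
    # Rational sigmoid f(x) = x / (a + |x|).
    return x / (a + np.abs(x))

n_in, n_hidden, n_out = 100, 20, 10     # scaled-down layer sizes
W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))
centroids = []                          # ART-2 weight vectors (clusters)
R = 0.5                                 # cluster radius (vigilance)

def learn(x):
    h = f(W1 @ x)                       # hidden layer of the perceptron
    y = f(W2 @ h)                       # perceptron output = ART-2 input
    if not centroids:                   # Nout = 0: form the first cluster
        centroids.append(y.copy())
        return 0
    d = [np.linalg.norm(y - c) for c in centroids]
    k = int(np.argmin(d))               # winning output neuron
    if d[k] > R:                        # too far from every cluster: new one
        centroids.append(y.copy())
        return len(centroids) - 1
    # Otherwise move the winner toward y, then adapt the perceptron with one
    # back-propagation step toward the winning centroid (hypothetical target).
    centroids[k] += 0.5 * (y - centroids[k])
    err = centroids[k] - y
    df_y = a / (a + np.abs(W2 @ h)) ** 2   # derivative of the rational sigmoid
    delta = err * df_y
    W2 += lr * np.outer(delta, h)
    df_h = a / (a + np.abs(W1 @ x)) ** 2
    W1 += lr * np.outer((W2.T @ delta) * df_h, x)
    return k

for _ in range(5):
    learn(rng.random(n_in))
print(len(centroids), "clusters formed")
```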
Illustration of the algorithm
[Figure: five example points (1 to 5) and a cluster of radius R1.]
Images and parameters used in experiments
Number of input neurons (pixels): 10000 (100×100).
Number of neurons in the hidden layer of the perceptron: 20.
Number of output neurons of the perceptron (in the input layer of ART-2), Nout: 10.
The cluster radius R was set in different ways in the experiments (see the helper sketch below):
1) adapted and then fixed,
2) calculated for every image by the formula S/(2Nout), where S is the average input signal and Nout is the number of output neurons of the perceptron,
3) calculated as 2Dmin, where Dmin is the minimal distance between the input vector of ART-2 and the weight vectors for the previous image.
The activation function of the perceptron neurons is a rational sigmoid with parameter a = 1, the learning step of the perceptron is 1, and the number of iterations of perceptron weight recalculation ranges from 1 to 10.
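The two computed variants of the radius R can be written as short helper functions (hypothetical names; the formulas are the ones given above):

```python
import numpy as np

def radius_variant2(x, n_out):
    # Variant 2: R = S / (2 * Nout), where S is the average input signal.
    S = float(np.mean(x))
    return S / (2 * n_out)

def radius_variant3(y, centroids):
    # Variant 3: R = 2 * Dmin, where Dmin is the minimal distance between
    # the current ART-2 input vector and the existing weight vectors.
    dmin = min(np.linalg.norm(y - c) for c in centroids)
    return 2 * dmin

print(radius_variant2(np.random.random(10000), 10))  # ~0.025 for uniform input
```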
Series of images 1
Program for experiments
[Figure: number of the recognized cluster (0 to 18) versus image number (1 to 61) for a sequence of images from series 1, 2, 1, 2; dark points correspond to the 2nd way of calculating the vigilance, light points to the 1st.]
[Figure: distance between the output vector of the MLP and the centroid of the recognized cluster (0 to 0.004) versus image number (1 to 9), for a sequence of images from series 1 at different numbers of iterations of the EBP algorithm: 1, 3, 5, 7, 9.]
Paths to improving neural networks
• Development of growing neural networks with feedback and delays
• Development of the theory of spiking neural networks and the building of associative memory based on them
• Development of a neural network in which logical (verbal) inference emerges from associative memory during learning