[Page 1]
Data Acquisition → Preprocessing → Prediction model
[Page 2]
ANNs are universal approximators
- There is a theorem which says that any multidimensional function y = f(x) can be approximated with an ANN (see the numerical sketch after this list):
◦ Balázs Csanád Csáji (2001), "Approximation with Artificial Neural Networks", Faculty of Sciences, Eötvös Loránd University, Hungary
◦ Cybenko, G. (1989), "Approximation by superpositions of a sigmoidal function", Mathematics of Control, Signals, and Systems, 2(4), 303–314
◦ Hornik, K. (1991), "Approximation Capabilities of Multilayer Feedforward Networks", Neural Networks, 4(2), 251–257
- The problem is the structure of the ANN – how many neurons do we need?
- How should the neurons be connected?
- How should the neurons be trained?
- How to set up other parameters, like the learning rate?
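As a concrete illustration of the theorem, here is a minimal sketch (not from the slides; all hyperparameters are illustrative) that trains a single-hidden-layer network with plain gradient descent to approximate y = sin(x):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

H = 20                                   # number of hidden neurons
W1 = rng.normal(0, 1, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 1, (H, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(5000):
    a = np.tanh(x @ W1 + b1)             # hidden layer activations
    y_hat = a @ W2 + b2                  # linear output layer
    err = y_hat - y
    # backpropagation of the mean squared error
    gW2 = a.T @ err / len(x); gb2 = err.mean(0)
    da = err @ W2.T * (1 - a**2)         # tanh'(z) = 1 - tanh(z)^2
    gW1 = x.T @ da / len(x); gb1 = da.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("final MSE:", float((err**2).mean()))   # should end up small
```

With enough hidden neurons the fit on the interval can be made arbitrarily good, which is exactly what the approximation theorems promise.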
[Page 3]
Source: https://viblo.asia/p/overview-of-artificial-neural-networks-and-its-applications-ORNZqwQb50n
[Page 4]
The McCulloch-Pitts neuron:

$z = \sum_{i=1}^{m} w_i x_i + b$

Where: wi – i-th weight; xi – i-th input (dendrite); z – neuron output (axon); b – initial activation (bias)
[Page 5]
Adding a nonlinearity:

$z = f\left(\sum_{i=1}^{m} w_i x_i + b\right)$

Where: wi – i-th weight; xi – i-th input (dendrite); z – neuron output (axon); b – initial activation (bias); f – nonlinear activation function
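A minimal sketch of this formula in code, assuming tanh as the example activation (all names and values here are illustrative, not from the slides):

```python
import numpy as np

def neuron(x, w, b, f=np.tanh):
    """A single artificial neuron: weighted sum of inputs plus bias,
    passed through a nonlinear activation f (tanh as an example)."""
    return f(np.dot(w, x) + b)

x = np.array([0.5, -1.0, 2.0])   # inputs (dendrites)
w = np.array([0.1, 0.4, -0.3])   # weights
b = 0.2                          # bias (initial activation)
print(neuron(x, w, b))           # neuron output (axon)
```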
[Page 6]
Binary: uni-polar or bi-polar (Rosenblatt's perceptron)
Continuous: sigmoidal
Rectifier: $f(z) = z^+ = \max(0, z)$
Softplus (a smooth variant of the rectifier): $f(z) = \log(1 + \exp(z))$
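The listed activations can be written out directly; a short sketch with illustrative test inputs:

```python
import numpy as np

def unipolar(z):                 # binary, uni-polar: {0, 1}
    return np.where(z >= 0, 1.0, 0.0)

def bipolar(z):                  # binary, bi-polar: {-1, 1}
    return np.where(z >= 0, 1.0, -1.0)

def sigmoid(z):                  # continuous, sigmoidal
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):                     # rectifier: f(z) = max(0, z)
    return np.maximum(0.0, z)

def softplus(z):                 # smooth rectifier: log(1 + exp(z))
    return np.log1p(np.exp(z))

z = np.array([-2.0, 0.0, 2.0])
for f in (unipolar, bipolar, sigmoid, relu, softplus):
    print(f.__name__, f(z))
```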
[Page 7]
Radial neurons
◦ Gaussian: $f(z) = \exp\left(-\frac{z^2}{2\sigma^2}\right)$
◦ Polynomial
◦ Hardy's: $f(z) = \sqrt{z^2 + \sigma^2}$ and the inverse form $f(z) = \frac{1}{\sqrt{z^2 + \sigma^2}}$

Where z is the distance between the input vector x and the neuron's center t, e.g. $z = \|x - t\| = \sqrt{\sum_i (x_i - t_i)^2}$
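A sketch of a single Gaussian radial neuron, assuming an illustrative center t and width sigma:

```python
import numpy as np

def gaussian_rbf(x, t, sigma=1.0):
    """A Gaussian radial neuron: the activation depends on the distance
    z = ||x - t|| between the input x and the neuron's center t."""
    z = np.linalg.norm(x - t)
    return np.exp(-z**2 / (2 * sigma**2))

x = np.array([1.0, 2.0])
t = np.array([0.0, 0.0])     # center (illustrative value)
print(gaussian_rbf(x, t))    # near 1 close to the center, falls toward 0 far away
```

Note the contrast with the McCulloch-Pitts neuron: the activation depends on the distance to a center rather than on a weighted sum of the inputs.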
[Page 8]
Architecture types:
Feed-forward networks – the signal flows in only one direction (when predicting)
Recurrent networks – the signal is looped from the output back to the input
Cellular networks
[Page 9]
Input layer
First hidden layer
H’th hidden layer
Output layer
Where: i – index of an element in the input layer (i = 1..m); j – index of an element in the output layer (j = 1..n); h – index of a hidden layer (h = 1..H); $w^h_{k_{h-1} k_h}$ – the weight of the connection between elements $k_{h-1}$ and $k_h$, i.e. between layers (h-1) and h
[Page 10]
The output of a single layer is:

$u_h = N_h(u_{h-1}) = F_h(W_h u_{h-1})$

Where: Nh(uh) – the neural processing operator of the h-th layer; Wh – the weight matrix between layers (h-1) and h

The entire network can be expressed with the following formula:

$y = N_{out}(N_H(\ldots N_2(N_1(u)) \ldots))$

Where: y – the output vector, y = [y1, y2, …, yn]^T; u – the input vector, u = [u1, u2, …, um]^T; Fout – the activation function of the output layer; Nout – the neural processing operator of the output layer; Wout – the weight matrix between layer H and the output layer
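A sketch of this layered formula, assuming tanh hidden activations, a linear output layer, and biases folded into the weight matrices for brevity (all shapes are illustrative):

```python
import numpy as np

def forward(u, weights, f=np.tanh, f_out=lambda z: z):
    """Forward pass matching the formula above: each hidden layer
    computes u_h = F_h(W_h u_{h-1}); the output layer applies F_out."""
    for W in weights[:-1]:
        u = f(W @ u)                  # hidden layers N_1 ... N_H
    return f_out(weights[-1] @ u)     # output layer N_out

rng = np.random.default_rng(0)
# illustrative shapes: m=3 inputs, two hidden layers of 5, n=2 outputs
weights = [rng.normal(0, 1, (5, 3)),
           rng.normal(0, 1, (5, 5)),
           rng.normal(0, 1, (2, 5))]
u = np.array([0.1, -0.2, 0.3])        # input vector u = [u1, ..., um]^T
print(forward(u, weights))            # output vector y = [y1, ..., yn]^T
```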
[Page 11]
N1 – N4 – blocks representing groups of neurons
[Page 12]
Unrolling a recurrent neural network
Source: http://colah.github.io/posts/2015-08-Understanding-LSTMs/
The output depends on the history and the current input
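A minimal sketch of one recurrent step, assuming a simple Elman-style cell with tanh (weights and sizes are illustrative):

```python
import numpy as np

def rnn_step(h_prev, x, W_h, W_x, b):
    """One step of a simple recurrent cell: the new hidden state mixes
    the history h_prev with the current input x."""
    return np.tanh(W_h @ h_prev + W_x @ x + b)

rng = np.random.default_rng(0)
W_h = rng.normal(0, 0.5, (4, 4))     # recurrent weights (the loop)
W_x = rng.normal(0, 0.5, (4, 2))     # input weights
b = np.zeros(4)

h = np.zeros(4)                      # initial history
for x in rng.normal(0, 1, (5, 2)):   # unrolled over 5 time steps
    h = rnn_step(h, x, W_h, W_x, b)
print(h)                             # depends on all 5 inputs, not just the last
```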
[Page 13]
E.g. the Hopfield network:

$y_i(k+1) = f\left(\sum_{j} w_{ij}\, y_j(k) + b_i\right)$

Where: k – the iteration number; bi – the external control signal; wij – the connection weights between input and output
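A sketch of this update rule with a bipolar sign activation, storing one pattern via the Hebbian rule (the pattern and sizes are illustrative):

```python
import numpy as np

def hopfield_step(y, W, b):
    """One synchronous Hopfield update:
    y_i(k+1) = sign(sum_j w_ij * y_j(k) + b_i)."""
    return np.where(W @ y + b >= 0, 1.0, -1.0)

# store one bipolar pattern with the Hebbian rule (zero diagonal)
p = np.array([1.0, -1.0, 1.0, -1.0])
W = np.outer(p, p); np.fill_diagonal(W, 0.0)
b = np.zeros(4)

y = np.array([1.0, 1.0, 1.0, -1.0])  # noisy version of the pattern
for k in range(5):                   # iterate until the state settles
    y = hopfield_step(y, W, b)
print(y)                             # converges back to the stored pattern p
```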
[Page 14]
RNNs had problems with long-term dependencies – they couldn't remember long-term relations – but the Long Short-Term Memory (LSTM) architecture was designed to overcome this.
The output of the current neuron
The forget layer – what to discard from the cell state
What new information to store in the cell
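A sketch of one LSTM step following the standard gate equations (the stacked-weight layout and all shapes are illustrative choices):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(h_prev, c_prev, x, W, b):
    """One LSTM step: the forget gate f decides what to erase from the
    cell state, the input gate i and candidate g decide what new
    information to store, and the output gate o shapes the output."""
    z = W @ np.concatenate([h_prev, x]) + b
    f, i, o, g = np.split(z, 4)
    f, i, o, g = sigmoid(f), sigmoid(i), sigmoid(o), np.tanh(g)
    c = f * c_prev + i * g           # updated cell state (long-term memory)
    h = o * np.tanh(c)               # output of the current neuron
    return h, c

rng = np.random.default_rng(0)
H, X = 3, 2                          # hidden and input sizes (illustrative)
W = rng.normal(0, 0.5, (4 * H, H + X))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
h, c = lstm_step(h, c, rng.normal(0, 1, X), W, b)
print(h, c)
```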
[Page 15]
Neighboring neurons are connected. Examples: Kohonen maps (SOM), LVQ
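A sketch of one Kohonen (SOM) update step, assuming a 1-D grid of neurons and a Gaussian neighborhood (the learning rate and radius are illustrative):

```python
import numpy as np

def som_update(W, x, lr=0.1, radius=1.0):
    """One Kohonen update: find the winning neuron, then pull the winner
    and its grid neighbors toward the input, so that neighboring neurons
    are trained together."""
    winner = np.argmin(np.linalg.norm(W - x, axis=1))
    grid = np.arange(len(W))                          # 1-D grid of neurons
    h = np.exp(-((grid - winner) ** 2) / (2 * radius ** 2))
    return W + lr * h[:, None] * (x - W)

rng = np.random.default_rng(0)
W = rng.normal(0, 1, (5, 2))        # 5 neurons on a line, 2-D inputs
for x in rng.normal(0, 1, (100, 2)):
    W = som_update(W, x)
print(W)                            # prototypes ordered along the grid
```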