Associative Memory by Recurrent Neural Networks with Delay Elements
Seiji MIYOSHI (Kobe City College of Tech., Japan), Hiro-Fumi YANAI (Ibaraki Univ., Japan), Masato OKADA (RIKEN BSI, ERATO KDB, Japan)
www.kobe-kosen.ac.jp/~miyoshi/
Background
• Synapses of real neural systems seem to have delays.
• It is therefore very important to analyze associative memory models with delayed synapses.
• Computer simulation is a powerful method; however, there is a limit on the number of neurons.
→ A theoretical, analytical approach is indispensable for research on delayed networks.
• The Yanai-Kim theory, based on statistical neurodynamics, shows good agreement with computer simulation, but its computational complexity is O(L^4 t).
→ Analyzing a network with many delay steps is realistically impossible.
Objective
• To derive macroscopic steady state equations by using discrete Fourier transformation
• To discuss the storage capacity quantitatively even in the large-L limit (L: length of delay)
Model: Recurrent Neural Network with Delay Elements
[Figure: network architecture. Each neuron j feeds each neuron i through a chain of delay elements, so neuron i receives the output of neuron j at lags l = 0, 1, ..., L-1 through connection weights J_ij^l (and symmetrically J_ji^l), for neurons 1, ..., N.]
Model
• Overlap
• Discrete Synchronous Updating Rule
• Correlation Learning for Sequence Processing
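The slide's equations appear only as images, so here is a minimal sketch of the standard forms these three ingredients usually take in a sequence-processing correlation model (the learning rule, update rule, and overlap below are assumptions, not transcribed from the slide):

```python
import numpy as np

rng = np.random.default_rng(1)
N, L, p = 200, 3, 5        # neurons, length of delay, sequence length (toy sizes)
xi = rng.choice([-1, 1], size=(p, N))   # stored pattern sequence xi^0 ... xi^{p-1}

# Correlation learning for sequence processing (assumed standard form):
# the lag-l connection J^l maps the pattern seen l+1 steps ago to its successor.
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(p):
        J[l] += np.outer(xi[(mu + l + 1) % p], xi[mu]) / N

def step(history):
    """Discrete synchronous update: x_i(t+1) = sgn(sum_l sum_j J^l_ij x_j(t - l))."""
    h = sum(J[l] @ history[l] for l in range(L))
    return np.where(h >= 0, 1, -1)

def overlap(x, pattern):
    """Overlap m = (1/N) sum_i xi_i x_i between the state and a stored pattern."""
    return float(x @ pattern) / N

# Recall: start from the stored sequence itself and iterate the dynamics.
history = [xi[(L - 1 - l) % p] for l in range(L)]   # history[l] = state l steps ago
for t in range(5):
    x = step(history)
    history = [x] + history[:-1]
```

At this low loading rate every lag votes for the same successor pattern, so the network retrieves the sequence almost perfectly and the overlap with the expected pattern stays near 1.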
Macrodynamical Equations by Statistical Neurodynamics (Yanai & Kim, 1995; Miyoshi, Yanai & Okada, 2002)
Initial Condition of the Network
One Step Set Initial Condition
• Only the states of the neurons are set explicitly.
• The states of the delay elements are set to zero.
All Steps Set Initial Condition
• The states of all neurons and all delay elements are set to be close to the stored pattern sequences.
• If they are set to the stored pattern sequences themselves ≡ Optimum Initial Condition
[Figure: the two settings on the network diagram. One Step Set: the neuron states are set and every delay element (lags 1 to L-1) is set to zero. All Steps Set: the neuron states and every delay element are set.]
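As a concrete reading of the two settings, here is a small sketch (the variable names and the layout of the history buffer are illustrative assumptions, not the slide's notation):

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 8, 3                             # neurons and length of delay (toy sizes)
xi = rng.choice([-1, 1], size=(L, N))   # the last L patterns of a stored sequence

# history[l] holds the network state l steps in the past:
# row 0 = current neuron states, rows 1..L-1 = delay elements.

# One Step Set: only the neuron states are set; delay elements are zero.
one_step = np.zeros((L, N), dtype=int)
one_step[0] = xi[-1]

# All Steps Set (the optimum when the stored sequence itself is used):
# every slot holds the stored pattern from that many steps ago.
all_steps = xi[::-1].copy()             # all_steps[l] = xi[L - 1 - l]
```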
Dynamical Behaviors of Recall Process
[Figure: overlap m (0-1) vs. time step t (0-30) under the All Steps Set initial condition, loading rate α = 0.5, length of delay L = 3; left panel: simulation (N = 2000), right panel: theory.]
Dynamical Behaviors of Recall Process
[Figure: overlap m (0-1) vs. time step t (0-30) under the All Steps Set initial condition, loading rate α = 0.5, length of delay L = 2; left panel: simulation (N = 2000), right panel: theory.]
Loading Rates α - Steady State Overlaps m
[Figure: steady-state overlap m (0-1) vs. loading rate α (0-2) for L = 1, 3, 10, comparing the One Step Set and Optimum initial conditions; left panel: simulation (N = 500), right panel: theory.]
Length of Delay L - Critical Loading Rate αC
[Figure: critical loading rate αC (0.2-2.2) vs. length of delay L (1-10) for the Optimum and One Step Set initial conditions.]
Macrodynamical Equations by Statistical Neurodynamics (Yanai & Kim, 1995; Miyoshi, Yanai & Okada, 2002)
• Good Agreement with Computer Simulation
• Computational Complexity is O(L^4 t)
Macroscopic Steady State Equations
• Accounting for the steady state
• Parallel symmetry in terms of time steps
• Discrete Fourier transformation
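The computational point can be illustrated generically: in a steady state the dynamics are invariant under time shifts, so convolution-like sums over delay steps are diagonalized by the discrete Fourier transform into independent pointwise products. A minimal numpy check of this convolution theorem (an illustration of the underlying mathematical trick, not the paper's actual macroscopic equations):

```python
import numpy as np

rng = np.random.default_rng(2)
L = 8
a, b = rng.normal(size=L), rng.normal(size=L)

# Circular convolution summed out directly: O(L^2) work.
direct = np.array([sum(a[l] * b[(k - l) % L] for l in range(L)) for k in range(L)])

# The same result through the DFT: transform, multiply pointwise, invert.
via_dft = np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

print(np.allclose(direct, via_dft))   # True: the two agree to numerical precision
```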
Loading Rates α - Steady State Overlaps m
[Figure: theoretical steady-state overlap m (0-1) vs. loading rate α (0.1-10000, logarithmic scale), with curves added one by one for L = 1, 2, 3, 10, 100, 1000, 10000, 100000.]
Loading Rates α - Steady State Overlaps m
[Figure: the earlier comparison shown again — steady-state overlap m (0-1) vs. loading rate α (0-2) for L = 1, 3, 10 under the One Step Set and Optimum initial conditions; left panel: simulation (N = 500), right panel: theory.]
Loading Rate α - Steady State Overlap m
[Figure: theoretical steady-state overlap m (0-1) vs. loading rate α (0.1-10000, logarithmic scale) for L = 1, 2, 3, 10, 100, 1000, 10000, 100000.]
Storage Capacity of Delayed Network
[Figure: storage capacity αC vs. number of delays L on a log-log scale, L = 1 to 100000; the data follow Storage Capacity = 0.195 L.]
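In concrete numbers, the fitted line means the critical loading rate grows linearly with the number of delays, while the capacity per delayed connection stays constant (a trivial numeric reading of the slide's formula; the function name is ours):

```python
def critical_loading_rate(L):
    """Storage capacity of the delayed network in the large-L limit (slide's fit)."""
    return 0.195 * L

N = 1000                                   # illustrative network size
for L in (1, 10, 100, 1000):
    p_max = critical_loading_rate(L) * N   # number of storable patterns
    # p_max grows linearly with L; per delayed connection it is always 0.195.
    print(L, p_max, p_max / (L * N))
```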
Conclusions
• The Yanai-Kim theory (macrodynamical equations for delayed networks) is re-derived.
→ Its computational complexity is O(L^4 t).
→ It is intractable to discuss macroscopic properties in the large-L limit.
• Steady state equations are derived by using discrete Fourier transformation.
→ Their computational complexity does not formally depend on L.
→ The phase transition points agree with those under the optimum initial conditions, that is, the storage capacities!
• The storage capacity is 0.195 L in the large-L limit.