Neural coding: a perspective for new associative memories
Vincent Gripon
Télécom Bretagne
February 26, 2012
Vincent Gripon (Télécom Bretagne) Neural coding February 26, 2012 1 / 15
Plan
1 Introduction: Challenge, Approach
2 Our model: Principle, Performance, Application example
3 Direction: learning sequences
4 Openings
Challenge

Context: exponential growth of the amount of information.

Challenge:
- User: difficult to find the desired piece of information.
- Engineer: algorithm complexity is limiting; ad hoc solutions are often required.
Approach

Definition: an associative memory is a device that can retrieve previously learned messages from part of them.

Idea: combine error-correcting codes and neurobiology to obtain an associative memory that is both efficient and simple.
- From error-correcting codes: the erasure channel, distributed correction, message passing.
- From neurobiology: cognitive similarities, sparsity, clusters, simplicity.
Our model: learning

Example: c = 4 clusters of l = 16 neurons each. Learn the 16-ary message (8, 3, 2, 9): symbol 8 selects neuron j1 in cluster c1, 3 selects j2 in c2, 2 selects j3 in c3, and 9 selects j4 in c4.
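The learning step can be sketched in code. This is a minimal sketch under our reading of the slide: each symbol selects one neuron in its cluster, and the selected neurons are connected pairwise into a clique. The names `W` and `learn` are ours, not from the source.

```python
import numpy as np

c, l = 4, 16                              # clusters, neurons per cluster
W = np.zeros((c * l, c * l), dtype=bool)  # binary connection matrix

def learn(message):
    """Learn a message: symbol i selects one neuron in cluster i,
    and the c selected neurons are fully interconnected."""
    units = [i * l + s for i, s in enumerate(message)]
    for a in units:
        for b in units:
            if a != b:
                W[a, b] = True

learn((8, 3, 2, 9))  # 8 selects j1 in c1, 3 selects j2 in c2, ...
```

Storing only binary connections (rather than weighted ones) is what keeps the model simple and hardware-friendly.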
Our model: retrieving
8︸︷︷︸j1 in c1
3︸︷︷︸j2 in c2
2︸︷︷︸j3 in c3
?
Projection to thenetwork,
Global decoding: sum,
Local decoding:winner-take-all,
Possibly iterate thetwo decodings.
Vincent Gripon (Télécom Bretagne) Neural coding February 26, 2012 8 / 15
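The retrieval procedure (projection, global sum, local winner-take-all, possible iteration) can be sketched as follows. As before, this is our reading of the slide: `W` stores learned messages as cliques, and all names are ours.

```python
import numpy as np

c, l = 4, 16
W = np.zeros((c * l, c * l), dtype=bool)

def learn(message):
    units = [i * l + s for i, s in enumerate(message)]
    for a in units:
        for b in units:
            if a != b:
                W[a, b] = True

def retrieve(partial, iterations=4):
    """partial: one symbol per cluster, None for erased positions."""
    # Projection: activate the neurons of the known symbols.
    active = np.zeros(c * l, dtype=bool)
    for i, s in enumerate(partial):
        if s is not None:
            active[i * l + s] = True
    for _ in range(iterations):          # possibly iterate the two decodings
        scores = W[active].sum(axis=0)   # global decoding: sum of inputs
        new_active = np.zeros_like(active)
        for i in range(c):               # local decoding: winner-take-all
            seg = scores[i * l:(i + 1) * l]
            new_active[i * l + int(seg.argmax())] = True
        active = new_active
    return tuple(int(j) for j in np.flatnonzero(active) % l)

learn((8, 3, 2, 9))
retrieve((8, 3, 2, None))  # recovers (8, 3, 2, 9)
```

With a single learned message, the neuron of the erased symbol receives the maximal sum (here 3, one input per known symbol) and wins its cluster's winner-take-all on the first iteration.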
Performance

[Plot: error rate and network density versus the number of learned messages M (0 to 50,000). Curves: probability of correct retrieval (4 iterations, simulated), probability of correct retrieval (1 iteration, theoretical), and network density. Comparison between a state-of-the-art Hopfield network and our network.]

Setup: c = 8 clusters of l = 256 neurons each; input messages have just 4 known symbols.
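The network-density curve admits a simple closed-form estimate. Assuming uniformly drawn messages, each learned message uses one connection per cluster pair among the l × l possible ones, which yields the expression below. This derivation is ours, not stated on the slide.

```python
def density(M, l=256):
    """Expected fraction of used connections between two clusters
    after learning M uniformly drawn messages."""
    return 1 - (1 - 1 / l**2) ** M

density(30000, l=256)  # roughly 0.37: the network stays sparse
```

Sparsity is what the retrieval step relies on: the denser the network, the more spurious cliques appear and the higher the error rate.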
An example: cache design

Role: speed up the lookup of frequently accessed data.

Context:
- Higher associativity decreases the miss rate,
- Current implementations are limited to 8-way associativity,
- Speed and area limitations.

Our proposal, an adaptation of the network: given the address, find the associated data.
- Additional error probability: 10^-30,
- Area consumption reduced by 65%,
- Pipelined architecture using a few clock cycles.
Direction II: learning arbitrarily long sequences

Performance:
- c = 50 clusters,
- l = 256 neurons/cluster,
- L = 1000 symbols per sequence,
- m = 1823 learned sequences,
- Pe ≤ 0.01.
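The slide gives only parameters, but a natural way to read "arbitrarily long sequences" over a fixed set of c clusters is to reuse the clusters cyclically and store directed connections between consecutive symbols. The sketch below is speculative: the cyclic assignment and all names are our assumptions, and l is reduced from the slide's 256 to keep the example small.

```python
import numpy as np

c, l = 50, 16                             # l shrunk from 256 for the sketch
W = np.zeros((c * l, c * l), dtype=bool)  # directed connections

def learn_sequence(seq):
    """Symbol t is hosted by cluster t mod c; link each symbol's
    neuron to the neuron of the next symbol in the sequence."""
    for t in range(len(seq) - 1):
        src = (t % c) * l + seq[t]
        dst = ((t + 1) % c) * l + seq[t + 1]
        W[src, dst] = True

learn_sequence([5, 7, 9])
```

Because clusters are reused modulo c, the storage cost is independent of the sequence length L, which is what makes "arbitrarily long" feasible.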
Openings

- Life science: spiking neurons, time synchronization.
- Cognitive aspects: ontology, pertinence, attentiveness, meta messages.
- Computing aspects: programming paradigm, nanosciences, cogniters, computability.
Thank you for your attention. I am at your disposal if you have any questions.
[Figure: an error-correcting code (a graph of ⊕ and Σ nodes) shown alongside a neocortical network.]