[Figure: before surgery / after surgery; A. Roxin, S. Fusi, PLoS Comp. Biology 2013]
Computational principles of synaptic memory
Stefano Fusi, Columbia University
Episodic: memories of specific events + the time and situation in which they occurred (e.g. autobiographical memories)
Semantic: information about general knowledge (facts or concepts), e.g. the capital of Italy
Skill learning: e.g. riding a bicycle
Priming: previous exposure to a sensory stimulus affects our reaction times to a later stimulation
Conditioning: e.g. salivating when you see your favorite food
Memory consolidation
[Figure: short-term memory and long-term memory traces as a function of time.]
Formalizing the memory problem
x = wij (weight of the synapse connecting neuron j to neuron i)
[Figure: memories ξ1, ξ2, ξ3, ξ4, ξ5 arrive one after another in time; each memory adds a modification Δwij^1, Δwij^2, … to the weight wij, and the memory strength of each stored pattern is plotted as a function of time.]
Example: the Hopfield model
Starting from wij = 0, each memory ξ^μ (μ = 1, …, p) adds a Hebbian increment Δwij^μ = ξi^μ ξj^μ, so after p memories
wij = Σ_{μ=1}^{p} ξi^μ ξj^μ
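As a sketch (not from the slides), the Hopfield rule and the persistence of memory 1's contribution can be simulated directly; the network size, number of memories, and the per-synapse signal measure below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 200, 10                            # illustrative sizes: neurons, memories
xi = rng.choice([-1, 1], size=(p, N))     # random, uncorrelated +/-1 patterns

# Hopfield/Hebbian rule: start from w_ij = 0, add one increment per memory
W = np.zeros((N, N))
for mu in range(p):
    W += np.outer(xi[mu], xi[mu])         # Delta w_ij^mu = xi_i^mu * xi_j^mu
np.fill_diagonal(W, 0)                    # ignore self-connections

# signal for memory 1: mean overlap of the weights with the first increment;
# cross-terms from the other memories average out, so it stays near 1
off_diag = ~np.eye(N, dtype=bool)
signal = np.mean((W * np.outer(xi[0], xi[0]))[off_diag])
```

The key point the simulation illustrates: the mean contribution of memory 1 to the weights is not erased by storing further uncorrelated memories.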
Tracking memory 1
Starting from wij = 0, store memories ξ1, ξ2, ξ3, …, ξp in sequence and follow the contribution of the first increment Δwij^1 to the total weight wij.
Memories are random and uncorrelated: each ξi^μ = ±1 with probability 1/2, independently across neurons and memories. As ξ2, …, ξp are added on top of ξ1, the weight wij moves away from Δwij^1, but on average the contribution of memory 1 is preserved: the signal (the mean overlap between wij and Δwij^1) remains at its initial value of 1.
Memory noise
The contributions of memories ξ2, …, ξp to wij are uncorrelated with memory 1, so they act as noise on its recall. This noise increases with the number of stored memories, growing like √p (Hopfield 1982; Amit, Gutfreund, Sompolinsky 1985).

Blackout Catastrophe
[Plot: SNR vs p (# of memories or time). The signal stays constant while the noise grows, so beyond a critical p retrieval of all memories, old and recent alike, fails at once: the blackout catastrophe (Hopfield 1982; Amit, Gutfreund, Sompolinsky 1985).]
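The blackout mechanism can be checked numerically by comparing the per-synapse signal of memory 1 with the noise contributed by the other memories; the network size and the SNR estimator below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200                                    # illustrative network size

def snr_memory1(p):
    """Per-synapse SNR of memory 1 after storing p Hopfield patterns."""
    xi = rng.choice([-1, 1], size=(p, N))
    W = sum(np.outer(x, x) for x in xi)    # unbounded Hebbian weights
    overlap = (W * np.outer(xi[0], xi[0]))[~np.eye(N, dtype=bool)]
    return overlap.mean() / overlap.std()  # signal ~ 1, noise ~ sqrt(p - 1)

snr_few, snr_many = snr_memory1(5), snr_memory1(500)
# the signal is unchanged, but the accumulated noise crushes the SNR
```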
Why this complexity?
… toward more realistic synapses
- Unbounded
- Bounded (binary), offline learning (Sompolinsky 1986)
- Bounded (binary), online learning
Forgetfulness
With bounded binary synapses and online learning, new memories overwrite old ones. Memory 1 sets Δwij^1 = +1, taking the synapse from wij = 0 to wij = 1; a later memory can reset the same synapse, erasing the trace of memory 1. Old memories are therefore gradually forgotten as new ones are stored, instead of being destroyed all at once.
A learning rule for binary synapses

pre  post  Δw
+1   +1    +1 (LTP)
-1   +1    -1 (LTD)
+1   -1    -1 (LTD)
-1   -1    +1 (LTP)

Each candidate change is applied with probability q.
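A minimal sketch of this stochastic rule (population size and q are arbitrary choices): one update applied to synapses at equilibrium leaves the fraction agreeing with the Hebbian candidate at about 1/2 + q/2.

```python
import numpy as np

rng = np.random.default_rng(2)

def binary_update(w, pre, post, q, rng):
    """Stochastic rule for binary (+/-1) synapses: the Hebbian candidate is
    pre*post; a synapse that disagrees with it flips with probability q."""
    candidate = pre * post                 # +1 for (+1,+1) and (-1,-1), else -1
    flip = (candidate != w) & (rng.random(w.shape) < q)
    return np.where(flip, candidate, w)

# one update applied to a large population starting from equilibrium
n = 100_000
w = rng.choice([-1, 1], size=n)            # equilibrium: each state with prob 1/2
pre = rng.choice([-1, 1], size=n)
post = rng.choice([-1, 1], size=n)
w_new = binary_update(w, pre, post, q=0.1, rng=rng)
agree = np.mean(w_new == pre * post)       # expected ~ 1/2 + q/2 = 0.55
```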
Initial signal (starting from equilibrium)
A synapse agrees with memory 1 (wij = Δwij^1) either by chance (probability 1/2) or because it initially disagreed (probability 1/2) and was flipped by learning (probability q):
S(0) = 1/2 + (1/2) q
Signal after t memories
A synapse keeps the trace of memory 1 only as long as it is not modified by later memories. Each later memory reduces the excess agreement with memory 1 by a factor (1 − q), so
S(t) = 1/2 + (q/2)(1 − q)^t ≈ 1/2 + (q/2) e^{−qt}
Noise (at equilibrium)
At equilibrium the weights are essentially random, so the noise averaged over Nsyn synapses is of order 1/√Nsyn; crucially, it does not grow with the number of stored memories.

Signal after p memories
SNR(p) ∝ q √Nsyn (1 − q)^p ≈ q √Nsyn e^{−qp} (Tsodyks 1988; Amit, Fusi 1992; Amit, Fusi 1994)
Blackout Catastrophe
[Plot: SNR vs p (# of memories or time), bounded synapses vs unbounded, unrealistic synapses. Unbounded synapses keep a constant SNR until the blackout catastrophe; bounded synapses decay exponentially: fast (large q) synapses start with a high SNR but forget quickly, slow (small q) synapses start low but retain memories longer.]

[Plot: SNR(0) and number of memories as a function of the learning rate q, from 0 to ~1. FAST (large q) maximizes the initial SNR at the expense of capacity; SLOW (small q) maximizes the number of memories at the expense of the initial SNR.]
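The fast/slow tradeoff can be reproduced with a small simulation (population size, q values, and horizon are illustrative): fast synapses start with a large signal that decays quickly as ~(1 − q)^t, slow synapses the reverse.

```python
import numpy as np

rng = np.random.default_rng(3)

def track_signal(q, T, n_syn=500_000):
    """Store memory 1 in binary synapses, overwrite with T later random
    memories, and return the excess agreement with memory 1 at each step."""
    target = rng.choice([-1, 1], size=n_syn)   # Hebbian candidate of memory 1
    w = rng.choice([-1, 1], size=n_syn)        # equilibrium initial weights

    def store(w, cand):
        flip = (cand != w) & (rng.random(n_syn) < q)
        return np.where(flip, cand, w)

    w = store(w, target)
    sig = [np.mean(w == target) - 0.5]         # initial signal ~ q/2
    for _ in range(T):
        w = store(w, rng.choice([-1, 1], size=n_syn))
        sig.append(np.mean(w == target) - 0.5)
    return np.array(sig)

fast = track_signal(q=0.5, T=30)               # large initial signal, fast decay
slow = track_signal(q=0.05, T=30)              # small initial signal, slow decay
```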
Multi-state synapses
From 2 states to m states per synapse (Fusi, Abbott, Nature Neurosci. 2007).

[Plot: SNR(0) and number of memories vs learning rate (0 to ~1), for a MULTI-STATE SYNAPSE with BALANCED LTP/LTD and with IMBALANCED LTP/LTD. With balanced LTP/LTD the extra states extend memory lifetimes, since each synapse performs a bounded random walk across its m states; with imbalanced LTP/LTD the synapses are driven against one bound and most of the benefit is lost.]
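Why balance matters can be illustrated with a bounded random walk over m states (the state count and transition probabilities below are arbitrary choices): balanced potentiation/depression keeps synapses spread across their range, while even a modest imbalance piles most of them onto one bound.

```python
import numpy as np

rng = np.random.default_rng(4)

def equilibrium_states(m, p_up, p_down, steps=2000, n=10_000):
    """Hard-bounded m-state synapses driven by random LTP/LTD events."""
    s = rng.integers(0, m, size=n)
    for _ in range(steps):
        r = rng.random(n)
        up = r < p_up                              # potentiation event
        down = (r >= p_up) & (r < p_up + p_down)   # depression event
        s = np.where(up, np.minimum(s + 1, m - 1),
            np.where(down, np.maximum(s - 1, 0), s))
    return s

balanced = equilibrium_states(m=10, p_up=0.05, p_down=0.05)
imbalanced = equilibrium_states(m=10, p_up=0.08, p_down=0.02)
top_bal = np.mean(balanced == 9)     # roughly uniform: ~0.1 at the top state
top_imb = np.mean(imbalanced == 9)   # geometric pile-up at the upper bound
```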
Sparse memories
ξ = 1 with probability f, ξ = 0 with probability 1 − f

Learning rule (Tsodyks, Feigelman 1989; Amit, Fusi 1994): when pre and post are both active, Δw = +1 (LTP) with probability q+; when pre is active and post is not, Δw = −1 (LTD) with probability q−.

If q+ = q−, LTD events (probability ~ f(1 − f) per memory) vastly outnumber LTP events (probability ~ f²): strong imbalance. Balance is restored by scaling depression with the coding level:
q+ = q, q− = q f
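The imbalance and its fix can be quantified directly from event frequencies (the coding level f and rate q below are illustrative): with q+ = q−, depression outnumbers potentiation by roughly (1 − f)/f, while q− = q·f brings the two rates back into balance.

```python
import numpy as np

rng = np.random.default_rng(5)
f, q, n = 0.05, 0.2, 1_000_000       # illustrative coding level and learning rate

pre = rng.random(n) < f              # xi = 1 with probability f
post = rng.random(n) < f

ltp_frac = np.mean(pre & post)       # LTP candidates (both active): ~ f^2
ltd_frac = np.mean(pre & ~post)      # LTD candidates (pre on, post off): ~ f(1-f)

ratio_naive = (ltd_frac * q) / (ltp_frac * q)         # q+ = q-: ~ (1-f)/f >> 1
ratio_balanced = (ltd_frac * q * f) / (ltp_frac * q)  # q- = q*f: ~ (1-f) ~ 1
```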
[Plot: SNR(0) and number of memories vs learning rate, comparing SPARSE with SLOW (DENSE): sparse coding increases the number of storable memories well beyond what slow dense learning achieves (Amit, Fusi, Neural Computation, 1994).]
A significant improvement, but…
1) Not robust to noise (ξ = 0 must be exactly 0); in the presence of noise p ~ √Nsyn
2) The amount of information per memory is significantly smaller (it scales like f)
3) Not scalable (for large Nsyn it is very difficult to read out the relevant information)
Ben Dayan Rubin, Fusi, Frontiers in Comp. Neuroscience 2007
Synapses that are bounded and can be modified only with limited precision require special machinery to prevent catastrophic forgetting. Two important principles for improving performance:
1) Heterogeneity (multiple timescales)
2) Efficient memory transfer
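As a hedged illustration of the heterogeneity principle, a population that mixes fast and slow binary synapses (the 50/50 mixture and q values are arbitrary assumptions) combines a large initial signal with a longer-lived one.

```python
import numpy as np

rng = np.random.default_rng(6)

def signal_curve(q, T, n_syn=1_000_000):
    """Excess agreement with memory 1 for binary synapses with rate q,
    recorded after memory 1 and after each of T later random memories."""
    target = rng.choice([-1, 1], size=n_syn)
    w = rng.choice([-1, 1], size=n_syn)
    sig, cand = [], target
    for _ in range(T + 1):
        flip = (cand != w) & (rng.random(n_syn) < q)
        w = np.where(flip, cand, w)
        sig.append(np.mean(w == target) - 0.5)
        cand = rng.choice([-1, 1], size=n_syn)
    return np.array(sig)

fast, slow = signal_curve(0.5, 30), signal_curve(0.05, 30)
mixed = 0.5 * fast + 0.5 * slow   # half fast, half slow synapses
# mixed inherits most of the fast signal at t = 0 and the slow signal at large t
```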
People (theory of memory): Daniel Amit, Nicolas Brunel, Francesco Battaglia, Francesco Carusi, Walter Senn, Larry Abbott, Daniel Ben Dayan Rubin, Alex Roxin, Srdjan Ostojic, Marcus Benna