Computational principles of synaptic memory
Stefano Fusi, Columbia University (tutorial, ICMNS 2017)



Definition of memory (for a variable x)

[Timeline: event A occurs in the past; at the present, x = f(A). What is x?]

Long-term memory

Declarative (Explicit): consciously accessible memories
  - Episodic: memories of specific events + the time and situation in which they occurred (e.g. autobiographical memories)
  - Semantic: information about general knowledge (facts or concepts), e.g. the capital of Italy

Non-declarative (Implicit, Procedural)
  - Skill learning: e.g. riding a bicycle
  - Priming: previous exposure to a sensory stimulus affects our reaction times to a later stimulation
  - Conditioning: e.g. salivating when you see your favorite food

Memory consolidation

[Timeline: memories ξ1, ξ2, ξ3, ξ4, ξ5 arrive over time; each passes from short-term memory into long-term memory.]

Formalizing the memory problem

Take x = wij, the weight of the synapse connecting neuron j to neuron i. Each stored memory modifies the synapse; tracking the strength of memory 1 means tracking the contribution of Δwij1 to wij as the later memories ξ2, ξ3, ξ4, ξ5, … are stored on top of it.

Example: the Hopfield model

Starting from wij = 0, each memory ξµ (µ = 1, …, p) adds an increment Δwijµ = ξiµ ξjµ, so that after p memories wij = Σµ ξiµ ξjµ.
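A minimal sketch of this storage prescription (assuming the standard outer-product rule; N, p and the random patterns below are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 200, 50                            # neurons, stored memories (illustrative)
xi = rng.choice([-1, 1], size=(p, N))     # random, uncorrelated patterns

# Hopfield storage: starting from wij = 0, each memory adds its increment
W = np.zeros((N, N))
for mu in range(p):
    W += np.outer(xi[mu], xi[mu])         # Δwij^µ = ξi^µ ξj^µ
np.fill_diagonal(W, 0)                    # no self-connections

# Signal for memory 1: overlap of W with its own increment; the other
# p-1 memories contribute zero-mean crosstalk (the "memory noise")
signal = np.mean(W * np.outer(xi[0], xi[0]))
```

Here the overlap stays close to 1; as p grows, the crosstalk variance grows with it, which is the memory-noise problem of the following slides.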

Tracking memory 1

Starting from wij = 0, memory ξ1 writes Δwij1; the later memories ξ2, ξ3, …, ξp are then stored on top of it. The memories are random and uncorrelated, so their contributions to wij are uncorrelated with Δwij1.

Memory noise

The noise contributed by the other memories increases with p (the number of memories, or equivalently time). In the Hopfield model, once p exceeds the capacity the noise overwhelms the signal for every memory at once: the blackout catastrophe (Hopfield 1982; Amit, Gutfreund, Sompolinsky 1985).

Why this complexity?

… toward more realistic synapses

- Unbounded
- Bounded (binary), offline learning (Sompolinsky 1986)
- Bounded (binary), online learning

Forgetfulness

With bounded binary synapses modified online, memory ξ1 sets Δwij1 = +1, switching wij from 0 to 1. Because wij cannot grow beyond its bound, later memories can only preserve or overwrite this change: new memories are stored by erasing old ones.

A learning rule for binary synapses (each update is applied with probability q):

  pre   post   Δw
  +1    +1     +1
  -1    +1     -1
  +1    -1     -1
  -1    -1     +1

Initial signal (starting from equilibrium)

At equilibrium, Δwij1 = wij by chance with probability 1/2. With probability 1/2 the two initially differ, and learning then flips the weight with probability q. The initial signal is therefore proportional to q.

Signal after t memories

Each subsequent memory leaves the synapse unmodified with a fixed probability, so the signal shrinks by a constant factor per stored memory: it decays exponentially in t.

Noise (at equilibrium)

The noise at equilibrium does not grow with the number of stored memories, so the signal after p memories gives SNR ∝ q √Nsyn e^(-qp) (Tsodyks 1988; Amit, Fusi 1992; Amit, Fusi 1994). Bounded synapses avoid the blackout catastrophe of unbounded, unrealistic synapses, but face a trade-off: fast learning (large q) yields a large initial SNR and rapid forgetting; slow learning (small q) yields a small initial SNR and slow forgetting.

[Figure: SNR vs p (number of memories, or time); SNR(0) vs number of memories as the learning rate q varies from ~1 (FAST) to 0 (SLOW).]
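A simplified simulation of the fast/slow trade-off (an assumption for illustration: each new memory simply overwrites each binary synapse with probability q, which captures the q-dependence of both the initial signal and the exponential decay):

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 10_000, 120                  # synapses and later memories (illustrative)

def signal_trace(q):
    """Signal for memory 1 in N binary (+/-1) synapses, overwrite prob q."""
    target = rng.choice([-1, 1], size=N)           # Δwij1 for memory 1
    w = rng.choice([-1, 1], size=N)                # equilibrium initial state
    trace = []
    # store memory 1, then T more random, uncorrelated memories
    for u in np.vstack([target, rng.choice([-1, 1], size=(T, N))]):
        hit = rng.random(N) < q                    # plasticity with probability q
        w = np.where(hit, u, w)
        trace.append(np.mean(w * target))          # overlap with memory 1
    return np.array(trace)

fast, slow = signal_trace(0.5), signal_trace(0.05)
# fast: initial signal ~0.5, but gone within a handful of memories;
# slow: initial signal ~0.05, retained an order of magnitude longer
```

The initial signal equals q, and each later memory multiplies it by (1 - q): exactly the trade-off sketched in the figure.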

Multi-state synapses

From 2 states to m states (Fusi, Abbott, Nature Neurosci. 2007).

[Figure: SNR(0) vs number of memories, learning rate from ~1 (FAST) to 0 (SLOW), comparing a multi-state synapse with balanced LTP/LTD and one with imbalanced LTP/LTD.]

Sparse memories

ξ = 1 with probability f, ξ = 0 with probability 1 - f.

Learning rule: Δw = +1 (LTP) with probability q+ and Δw = -1 (LTD) with probability q-, depending on the pre/post activity pattern (Tsodyks, Feigelman 1989; Amit, Fusi 1994). If q+ = q- there is a strong imbalance: for sparse patterns, candidate LTD events (only one neuron active) are far more frequent than candidate LTP events (both active). Balance is restored by choosing q+ = q and q- = qf.

[Figure: SNR(0) vs number of memories vs learning rate: SPARSE coding beats SLOW (DENSE) coding. Amit, Fusi, Neural Computation, 1994.]
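The imbalance can be checked by counting candidate plasticity events for random sparse patterns (the assignment of LTP to pre = post = 1 and LTD to pre = 1, post = 0 is an assumption for illustration; f and the sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
f, n = 0.05, 1_000_000
pre  = rng.random(n) < f            # ξ = 1 with probability f
post = rng.random(n) < f

ltp = np.sum(pre & post)            # both active: probability ~ f**2
ltd = np.sum(pre & ~post)           # pre active, post silent: probability ~ f*(1-f)
ratio = ltd / ltp                   # ~ (1-f)/f = 19: LTD dominates if q+ = q-
# choosing q- = q*f equalizes the expected drift: q * f**2 ≈ (q*f) * f*(1-f)
```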

A significant improvement, but…

1) Not robust to noise (ξ = 0 must be exactly 0). In the presence of noise: p ~ Nsyn.
2) The amount of information per memory is significantly smaller (it scales like f).
3) Not scalable (for large Nsyn it is very difficult to read out the relevant information).

Ben Dayan Rubin, Fusi, Frontiers in Comp. Neuroscience 2007

Learning systems with multiple timescales

Divide the Nsyn synapses into m groups of Nsyn/m each, with learning rates geometrically spaced from q1 = 1 (FAST) to qm = qs (SLOW): qk = (qs)^((k-1)/(m-1)). Memories are stored at a constant rate, so t = p.

[Figure: SNR vs t (= p). Each group's SNR decays as ~e^(-qk t); the envelope over groups decays as ~1/t. Labeled quantities: qs√N, √N log(1/qs), qs√N e^(-qs t), qs√N √m e^(-qs t).]

[Figure: SNR(0) vs number of memories vs learning rate, and SNR vs t (= p): the HETEROGENEOUS system interpolates between FAST and SLOW, outperforming the homogeneous system with q = q1. Fusi, Abbott, Neuron 2005; Roxin, Fusi, PLoS Comp Biol. 2013.]
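The 1/t envelope can be sketched directly from the per-group scaling (assuming, as a theory sketch, that group k contributes SNR ≈ qk √(N/m) e^(-qk t); all numbers below are illustrative):

```python
import numpy as np

N, m, qs = 40_000, 8, 1e-3
qk = qs ** (np.arange(m) / (m - 1))        # qk = (qs)^((k-1)/(m-1)), fast -> slow
t = np.logspace(0, 4, 200)                 # memory age (= number of later memories)

# per-group SNR for a memory of age t, and the envelope over groups
snr = qk[:, None] * np.sqrt(N / m) * np.exp(-qk[:, None] * t[None, :])
envelope = snr.max(axis=0)
# between t ~ 1 and t ~ 1/qs the envelope tracks ~1/t: each memory age is
# covered by the group whose timescale 1/qk best matches it
```

Any single homogeneous system forgets exponentially; the heterogeneous envelope trades a modest loss in SNR(0) for a power-law tail.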

Heterogeneous systems with memory transfer

A. Roxin, S. Fusi, PLoS Comp. Biology 2013

[Figure: memory performance BEFORE and AFTER SURGERY (lesion).]

As before, there are m groups of N/m synapses with learning rates qk = (qs)^(k/m), from q1 (FAST) to qm = qs (SLOW). Now the INPUT drives only the fast group, and memories are actively transferred from faster to slower groups (cf. hippocampal replay: Foster, Wilson 2006).

Each group k provides a signal Sk; readout: S = max {S1, …, Sm}.

[Figure: SNR vs t (= p) with memory transfer. Labeled quantities: √N log(1/qs), qs√N, √N √m; timescales ~1/qs and ~m/qs; factor ~m^(1/4).]
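A toy sketch of the transfer idea (the dynamics below are entirely hypothetical, not the paper's model: each step, every group re-stores, at its own learning rate, the normalized trace held by the next faster group, and every group forgets exponentially; only the qualitative point is intended, namely that slow groups acquire a signal they could never get from the input alone):

```python
import numpy as np

m, qs, T = 4, 1e-2, 600
qk = qs ** ((np.arange(m) + 1) / m)     # qk = (qs)^(k/m), fast -> slow
S = np.zeros(m)
S[0] = qk[0]                            # the input writes only into the fast group
readout = []
for t in range(T):
    for k in range(m - 1, 0, -1):       # transfer toward the slower groups
        S[k] = max(S[k], qk[k] * S[k - 1] / qk[k - 1])
    S *= np.exp(-qk)                    # each group forgets at its own rate
    readout.append(S.max())             # readout S = max{S1, ..., Sm}
# without transfer the signal would be qk[0]*exp(-qk[0]*t), gone by t ~ 1/q1;
# with transfer the slowest group carries a trace out to t ~ 1/qs
```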

[Figure: SNR(0) vs number of memories vs learning rate, comparing FAST, SLOW, HETEROGENEOUS, and HETEROGENEOUS with MEMORY TRANSFER. Architecture: m groups of N/m synapses, learning rates q1 (FAST) … qm = qs (SLOW), with qk = (qs)^(k/m).]

The cascade model

Each synapse moves between strong and weak states through PLASTICITY transitions, and between progressively less plastic states through METAPLASTICITY transitions. The result is power-law forgetting:

Signal/Noise ~ N^(1/2) t^(-1)

Fusi, Drew, Abbott, Neuron (2005)
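A simplified cascade simulation (an illustrative variant, not the paper's exact transition structure: plasticity and metaplasticity here share the same depth-dependent probability x^k, and depth, x, N are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
N, depth, x, T = 50_000, 5, 0.5, 1000
s = rng.choice([-1, 1], size=N)           # synaptic strength (strong/weak)
k = np.zeros(N, dtype=int)                # metaplastic depth: deeper = less plastic

def store(event):
    """event: desired +/-1 update per synapse."""
    global s, k
    act = rng.random(N) < x ** k          # transition probability shrinks with depth
    flip = act & (s != event)             # PLASTICITY: switch strength, reset depth
    deep = act & (s == event)             # METAPLASTICITY: move to a deeper state
    k = np.where(flip, 0, np.where(deep, np.minimum(k + 1, depth - 1), k))
    s = np.where(flip, event, s)

for _ in range(200):                      # let the depth distribution equilibrate
    store(rng.choice([-1, 1], size=N))

target = rng.choice([-1, 1], size=N)      # the tracked memory
store(target)
sig = []
for _ in range(T):                        # bury it under T random memories
    store(rng.choice([-1, 1], size=N))
    sig.append(np.mean(s * target))
# the decay is far slower than any single fixed-q binary synapse: shallow
# states forget fast, deep states hold a residue for a long time
```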

[Figure: SNR(0) vs number of memories vs learning rate, comparing HETEROGENEOUS, HETEROGENEOUS with MEMORY TRANSFER, and VARIABLE (metaplasticity).]

Tomorrow… p ~ Nsyn

Conclusions

Synapses that are bounded and can be modified only with limited precision require special machinery to prevent catastrophic forgetting. Two important principles improve performance:
1) Heterogeneity (multiple timescales)
2) Efficient memory transfer

People (theory of memory): Daniel Amit, Nicolas Brunel, Francesco Battaglia, Francesco Carusi, Walter Senn, Larry Abbott, Daniel Ben Dayan Rubin, Alex Roxin, Srdjan Ostojic, Marcus Benna