Levels in Computational Neuroscience

Posted on 15-Jan-2016
TRANSCRIPT

Page 1: Levels in Computational Neuroscience

Reasonably good understanding (for our purposes!)
Poor understanding
Poorer understanding
Very poor understanding

Page 2:

From neuron to network

Page 3:

The layered structure of the first visual area, and connections to other areas (Fig. 27.10 in Kandel and Schwartz, Principles of Neural Science)

Page 4:

The columnar organization of the monkey visual cortex (Fig. 12.6 in Shepherd, The Synaptic Organization of the Brain)

Page 5:

Definition of the firing rate in terms of a temporal average. (Fig. 1.9, Spiking Neuron Models)
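The temporal-average definition can be sketched in a few lines; the spike times below are made-up illustrative data, not from the figure:

```python
import numpy as np

# Hypothetical spike times (in seconds) of one neuron over a 2 s trial.
spike_times = np.array([0.05, 0.21, 0.30, 0.52, 0.81, 1.10, 1.33, 1.71, 1.95])
T = 2.0  # duration of the observation window, in seconds

# Temporal average: spike count divided by the window length.
rate = len(spike_times) / T
print(rate)  # 4.5 spikes/s
```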

Page 6:

Definition of the firing rate in terms of the peri-stimulus-time-histogram (PSTH) as an average over several runs of an experiment. (Fig. 1.10, Spiking Neuron Models)
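A minimal PSTH sketch, using simulated rather than experimental spike trains (homogeneous Poisson spiking at an assumed 20 Hz):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, T, n_bins = 100, 1.0, 20   # 100 runs, 1 s trials, 50 ms bins

# Hypothetical data: homogeneous Poisson spiking at 20 Hz on every trial.
trials = [rng.uniform(0, T, rng.poisson(20 * T)) for _ in range(n_trials)]

edges = np.linspace(0, T, n_bins + 1)
counts = sum(np.histogram(spikes, bins=edges)[0] for spikes in trials)

# PSTH: spike count per bin, averaged over trials, divided by the bin
# width to give a rate in Hz.
dt = T / n_bins
psth = counts / (n_trials * dt)
```

Since the simulated rate is constant, the PSTH hovers around 20 Hz in every bin; for real data it traces the stimulus-locked rate modulation.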

Page 7:

Definition of the firing rate as a population density.

Gerstner & Kistler Fig. 1.11

Page 8:

Feedforward inputs to a single neuron.

Dayan and Abbott Fig. 7.1

Page 9:

Feedforward and recurrent networks

Dayan and Abbott Fig. 7.3

Page 10:

Dayan and Abbott Fig. 7.4

Coordinate transformations during a reaching task (target and fixation point)

Gaze angle, retinal angle, body coordinates

Objective: transform from retinal coordinates to body coordinates
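For angles in a single plane the transform reduces to a sum, which can be sketched as follows (the function name and example angles are illustrative, not from the slide):

```python
# Minimal sketch of the coordinate transform: the target's body-centered
# direction is the gaze angle (direction of fixation relative to the body)
# plus the retinal angle (target relative to the fixation point).
def retinal_to_body(retinal_angle_deg, gaze_angle_deg):
    return retinal_angle_deg + gaze_angle_deg

# Fixating 10 deg right of straight ahead, target 20 deg left on the retina:
print(retinal_to_body(-20.0, 10.0))  # -10.0 deg in body coordinates
```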

Page 11:

Tuning curves of a visually responsive neuron in premotor cortex

Dayan and Abbott Fig. 7.5

Head fixed, fixating on different points:

• Body coordinates: response curve fixed!
• Retinal coordinates: curve shifts to compensate!

Head rotates, fixation fixed.

Model tuning curves for gaze angles g = 0°, 10°, −20°
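A minimal sketch of such a gaze-compensating tuning curve (the Gaussian form and its parameters are assumptions, not the slide's exact model): the response depends on the body-centered direction s = retinal angle + gaze angle, so plotted against retinal angle the curve shifts by −g.

```python
import numpy as np

def tuning_curve(retinal_deg, gaze_deg, r_max=50.0, sigma=20.0):
    """Response tuned to the body-centered direction s = retinal + gaze,
    with an assumed Gaussian shape and preferred direction s = 0."""
    s = retinal_deg + gaze_deg
    return r_max * np.exp(-s**2 / (2 * sigma**2))

# In retinal coordinates the peak sits at -g for each gaze direction:
x = np.linspace(-90, 90, 181)
peaks = [float(x[np.argmax(tuning_curve(x, g))]) for g in (0.0, 10.0, -20.0)]
print(peaks)  # [0.0, -10.0, 20.0]: the retinal peak shifts to compensate
```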

Page 12:

Dayan and Abbott Fig. 7.6

The gaze-dependent gain modulation of visual responses of neurons in area 7a

Tuning curve measured at 2 gaze directions: the tuning shape is gaze-independent (it depends on the retinal location s), while the gain varies with gaze, giving a 2D tuning function.

Page 13:

Burst and integrator neurons involved in horizontal eye positioning

Dayan and Abbott Fig. 7.7

Page 14:

Eigenvector expansion

Page 15:

Steady state rates – linear network

A real-valued matrix M can still have complex eigenvectors: use their real and imaginary parts
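For a symmetric M the expansion stays real and can be sketched directly (the matrix and input below are random illustrative values):

```python
import numpy as np

# Sketch: steady state of the linear recurrent rate network
#   tau dv/dt = -v + h + M v,
# expanded in the eigenvectors e_mu of a symmetric weight matrix M.
rng = np.random.default_rng(1)
N = 5
A = rng.standard_normal((N, N))
M = 0.1 * (A + A.T)             # symmetric, scaled small for stability
h = rng.standard_normal(N)      # steady feedforward input

lam, E = np.linalg.eigh(M)      # real eigenvalues, orthonormal eigenvectors

# Eigenvector expansion: v_ss = sum_mu (h . e_mu) / (1 - lambda_mu) e_mu
v_ss = sum((h @ E[:, mu]) / (1 - lam[mu]) * E[:, mu] for mu in range(N))

# Cross-check against solving (I - M) v = h for the fixed point directly:
v_direct = np.linalg.solve(np.eye(N) - M, h)
```

Both routes give the same fixed point; the expansion makes visible that modes with eigenvalues near 1 are strongly amplified.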

Page 16:

Selective amplification by a linear network

Dayan and Abbott Fig. 7.8

Input: cosine with peak at θ = 0°, plus added noise

Fourier amplitude of inputs

Output: steady state

Fourier amplitude of output

All Fourier components present in the input; the θ = 0 component is enhanced in the output
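This selective amplification can be sketched with a rank-1 cosine coupling (the parameters lam1, N, and the noise level are illustrative choices, not the figure's): the cosine mode has eigenvalue lam1 = 0.9 and so is amplified by 1/(1 − lam1) = 10, while the orthogonal noise components pass through unamplified.

```python
import numpy as np

N = 64
theta = np.linspace(-np.pi, np.pi, N, endpoint=False)

lam1 = 0.9
M = (2 * lam1 / N) * np.outer(np.cos(theta), np.cos(theta))  # rank-1 coupling

rng = np.random.default_rng(2)
h = np.cos(theta) + 0.3 * rng.standard_normal(N)  # peak at theta = 0 + noise

# Steady state of tau dv/dt = -v + h + M v:
v_ss = np.linalg.solve(np.eye(N) - M, h)

# The cosine Fourier amplitude is boosted tenfold, the rest unchanged:
amp_in = 2 / N * (h @ np.cos(theta))
amp_out = 2 / N * (v_ss @ np.cos(theta))
```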

Page 17:

Effect of nonlinearity on amplification

Dayan and Abbott Fig. 7.8

Smoother response

Several Fourier components appear

Page 18:

Visual information flow

Dayan and Abbott Fig.2.5

Center-surround response → oriented response

Page 19:

Visual receptive fields

Dayan and Abbott Fig. 2.25

LGN neuron: center-surround. V1 neuron (simple): orientation selective. Actual responses shown with mathematical fits.

Page 20:

Hubel-Wiesel model

Simple summation of aligned LGN inputs: a vertical (preferred) stimulus gives a high response; an undirected stimulus gives a low response.
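A toy sketch of this scheme (hand-built 7×7 images and an assumed ON-center filter, not the slide's figure): a simple cell sums three center-surround units whose centers are stacked vertically, so a vertical bar drives it much more strongly than a horizontal one.

```python
import numpy as np

def center_surround(img, cy, cx):
    # ON-center unit: center pixel minus the mean of its 8 surround pixels,
    # rectified at zero.
    center = img[cy, cx]
    surround = img[cy - 1:cy + 2, cx - 1:cx + 2].sum() - center
    return max(center - surround / 8.0, 0.0)

def simple_cell(img):
    # LGN-like receptive-field centers aligned along column 3
    return sum(center_surround(img, cy, 3) for cy in (2, 3, 4))

vertical = np.zeros((7, 7)); vertical[:, 3] = 1.0     # vertical bar
horizontal = np.zeros((7, 7)); horizontal[3, :] = 1.0 # horizontal bar
print(simple_cell(vertical), simple_cell(horizontal))  # 2.25 0.75
```

The orientation preference comes purely from the geometry of the summed subunits, which is the core of the Hubel-Wiesel proposal.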

Page 21:

Effect of contrast

Dayan and Abbott Fig. 7.10

4 input contrast levels; real responses vs. network amplification

Note: the response is amplified but not broadened

Page 22:

Nonlinear winner-takes-all selection

Dayan and Abbott Fig. 7.12

Input: centered at ±90°. Output: the higher peak is selected

Page 23:

Associative recall

Dayan and Abbott Fig. 7.16

Nv = 50 units, 4 stored patterns; 2 representative memories: units 18–31 high (others low), and every 4th unit high

Partial inputs converge to the stored outputs

Page 24:

Pattern recall – Hopfield model

Input and output over time
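Hopfield-style recall can be sketched in a few lines; this toy uses two orthogonal ±1 patterns of 8 units rather than the Nv = 50, 4-pattern network on the slide. Hebbian weights plus asynchronous sign updates drive a corrupted input back to the nearest stored memory.

```python
import numpy as np

p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
N = len(p1)

# Hebbian storage of the two patterns, no self-coupling.
W = (np.outer(p1, p1) + np.outer(p2, p2)) / N
np.fill_diagonal(W, 0.0)

def recall(v, sweeps=5):
    v = v.copy()
    for _ in range(sweeps):
        for i in range(N):              # asynchronous threshold updates
            v[i] = 1 if W[i] @ v >= 0 else -1
    return v

probe = p1.copy()
probe[0] = -probe[0]                    # corrupt one unit of the memory
print(recall(probe))                    # recovers p1
```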

Page 25:

Dayan and Abbott Fig. 7.17

Excitatory-inhibitory network: nullclines and eigenvalues determine whether the fixed point is stable or unstable
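The stability analysis can be sketched for a two-population rate model, tau_E dv_E/dt = −v_E + M_EE v_E + M_EI v_I and tau_I dv_I/dt = −v_I + M_IE v_E + M_II v_I; the coupling and time-constant values below are illustrative choices in the spirit of the Dayan & Abbott example, not taken from the slides:

```python
import numpy as np

M_EE, M_EI = 1.25, -1.0   # E self-excitation, inhibition onto E
M_IE, M_II = 1.0, 0.0     # excitation onto I, no I self-coupling
tau_E = 10.0              # ms

def eigenvalues(tau_I):
    # Jacobian of the linearized two-population dynamics.
    J = np.array([[(M_EE - 1) / tau_E, M_EI / tau_E],
                  [M_IE / tau_I, (M_II - 1) / tau_I]])
    return np.linalg.eigvals(J)

# Fast inhibition: eigenvalues have negative real part (stable fixed point);
# slow inhibition: positive real part (instability, leading to a limit cycle).
print(eigenvalues(tau_I=5.0).real)    # both negative: stable
print(eigenvalues(tau_I=50.0).real)   # both positive: unstable
```

The sign of the real part flips as the inhibitory time constant grows, which is exactly the stable-vs-limit-cycle distinction shown in the temporal-behavior figures.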

Page 26:

Dayan and Abbott Fig. 7.18

Excitatory-inhibitory network, temporal behavior: stable fixed point (30 ms time scale)

Page 27:

Dayan and Abbott Fig. 7.19

Excitatory-inhibitory network, temporal behavior: unstable fixed point leading to a limit cycle (50 ms time scale)

Page 28:

Dayan and Abbott Fig. 7.20

Olfactory model I: extracellular field potential in the olfactory bulb

Excitatory mitral cells (projecting to cortex) and inhibitory interneurons; during sniffs the neural activity oscillates, with no fast oscillations between sniffs

Page 29:

Dayan and Abbott Fig. 7.16

Olfactory model II: activation functions and eigenvalues; region of instability

Page 30:

Dayan and Abbott Fig. 7.22

Olfactory model III

Behavior during a sniff cycle

Identity of odor determined by:

• Amplitudes and phases of oscillations

• Identity of participating mitral cells