PyNN Documentation
Release 0.9.3
the PyNN community
Dec 04, 2018


Contents

1 Introduction
2 Installation
3 Quickstart
4 Building networks
5 Injecting current
6 Recording spikes and state variables
7 Data handling
8 Simulation control
9 Model parameters and initial values
10 Random numbers
11 Backends
12 Running parallel simulations
13 Units
14 Examples
15 Publications about, relating to or using PyNN
16 Contributors and licence
17 Release notes
18 Developers’ Guide
19 API reference
20 Old documents
21 Indices and tables
Python Module Index


CHAPTER 1

Introduction

PyNN (pronounced ‘pine’) is a simulator-independent language for building neuronal network models.

In other words, you can write the code for a model once, using the PyNN API and the Python programming language, and then run it without modification on any simulator that PyNN supports (currently NEURON, NEST and Brian) as well as on certain neuromorphic hardware systems.

The PyNN API aims to support modelling at a high level of abstraction (populations of neurons, layers, columns and the connections between them) while still allowing access to the details of individual neurons and synapses when required. PyNN provides a library of standard neuron, synapse and synaptic plasticity models, which have been verified to work the same on the different supported simulators. PyNN also provides a set of commonly-used connectivity algorithms (e.g. all-to-all, random, distance-dependent, small-world) but makes it easy to provide your own connectivity in a simulator-independent way, either using the Connection Set Algebra (Djurfeldt, 2012) or by writing your own Python code.

Even if you don’t wish to run simulations on multiple simulators, you may benefit from writing your simulation code using PyNN’s powerful, high-level interface. In this case, you can use any neuron or synapse model supported by your simulator, and are not restricted to the standard models. PyNN transparently supports distributed simulations (using MPI) where the underlying simulator does.

It is straightforward to port an existing model from a Python-supporting simulator to PyNN, since this can be done incrementally, replacing one piece of simulator-specific code at a time with the PyNN equivalent, and testing that the model behaviour is unchanged at each step.

Download the current stable release of the library (0.9.3) or get the development version from the Git repository.

1.1 Licence

The code is released under the CeCILL licence (equivalent to and compatible with the GPL).


1.2 Citing PyNN

If you publish work using or mentioning PyNN, we would appreciate it if you would cite the following paper:

Davison AP, Brüderle D, Eppler JM, Kremkow J, Muller E, Pecevski DA, Perrinet L and Yger P (2009) PyNN: a common interface for neuronal network simulators. Front. Neuroinform. 2:11 doi:10.3389/neuro.11.011.2008.

1.3 Questions/Bugs/Enhancements

If you find a bug in PyNN, or wish to request a new feature, please go to the PyNN issue tracker, click on “New Issue”, and fill in the form.

If you have questions or comments about PyNN, please post a message on the NeuralEnsemble Google group.


CHAPTER 2

Installation

The following instructions are for Linux and Mac OS X. It should be possible to install and run PyNN on Windows, but this has not been tested.

Installing PyNN requires:

• Python (version 2.7, 3.3-3.6)

• a recent version of the NumPy package

• the lazyarray package

• the Neo package (>= 0.5.0)

• at least one of the supported simulators: e.g. NEURON, NEST, or Brian.

Optional dependencies are:

• mpi4py (if you wish to run distributed simulations using MPI)

• either Jinja2 or Cheetah (templating engines)

• the CSA library

2.1 Installing PyNN

Note: if using NEURON or NEST, it is easiest if you install NEURON/NEST before you install PyNN (see below).

The easiest way to get PyNN is to use pip:

$ pip install pyNN

If you would prefer to install manually, download the latest source distribution, then run the setup script, e.g.:


$ tar xzf PyNN-0.9.3.tar.gz
$ cd PyNN-0.9.3
$ python setup.py install

This will install it to your Python site-packages directory, and may require root privileges. We strongly recommend, however, that you use a virtualenv or a Conda environment. We assume you have already installed the simulator(s) you wish to use it with. If this is not the case, see below for installation instructions.

Test it using something like the following:

>>> import pyNN.nest as sim
>>> sim.setup()
>>> sim.end()

(This assumes you have NEST installed).

Warning: If you get a warning “Unable to install NEST extensions. Certain models may not be available”, then ensure the program nest-config is on your system PATH. If you still get this message even after adding the directory containing nest-config to the PATH, try pip uninstall PyNN, then re-install with pip install --no-binary :all: PyNN

With NEURON as the simulator, make sure you install NEURON before you install PyNN. The PyNN installation will then compile PyNN-specific membrane mechanisms, which are loaded when importing the neuron module:

>>> import pyNN.neuron as sim
NEURON -- Release 7.4 (1370:16a7055d4a86) 2015-11-09
Duke, Yale, and the BlueBrain Project -- Copyright 1984-2015
See http://www.neuron.yale.edu/neuron/credits

loading membrane mechanisms from /home/docker/dev/PyNN/pyNN/neuron/nmodl/x86_64/.libs/libnrnmech.so
Additional mechanisms from files
adexp.mod alphaisyn.mod alphasyn.mod expisyn.mod gap.mod gsfa_grr.mod hh_traub.mod
izhikevich.mod netstim2.mod refrac.mod reset.mod stdwa_guetig.mod stdwa_softlimits.mod
stdwa_songabbott.mod stdwa_symm.mod stdwa_vogels2011.mod tmgsyn.mod tmisyn.mod
tsodyksmarkram.mod vecstim.mod

If you installed PyNN before installing NEURON, or if you update your PyNN installation, you will need to manually run nrnivmodl in the pyNN/neuron/nmodl directory.

2.2 Installing NEURON

Download the sources for NEURON 7.4 or 7.5, in .tar.gz format, from http://www.neuron.yale.edu/neuron/download/getstd. Also download Interviews from the same location.

Compile Interviews and NEURON according to the instructions given at http://www.neuron.yale.edu/neuron/static/download/compilestd_unix.html, except that when you run configure, add the options --with-nrnpython and, optionally, --with-paranrn, i.e.:

$ ./configure --prefix=`pwd` --with-nrnpython --with-paranrn
$ make
$ make install


Make sure that you add the Interviews and NEURON bin directories to your path. Test that the Python support has been enabled by running:

$ nrniv -python
NEURON -- Release 7.4 (1370:16a7055d4a86) 2015-11-09
Duke, Yale, and the BlueBrain Project -- Copyright 1984-2015
See http://www.neuron.yale.edu/neuron/credits

>>> import hoc
>>> import nrn

Now you can compile and install NEURON as a Python package:

$ cd src/nrnpython
$ python setup.py install

Now test everything worked:

$ python
>>> import neuron
NEURON -- Release 7.4 (1370:16a7055d4a86) 2015-11-09
Duke, Yale, and the BlueBrain Project -- Copyright 1984-2015
See http://www.neuron.yale.edu/neuron/credits

If you run into problems, check out the NEURON Forum.

2.3 Installing NEST and PyNEST

NEST 2.14 can be downloaded from http://www.nest-simulator.org/download/. Earlier versions of NEST may not work with this version of PyNN. The full installation instructions are available in the file INSTALL, which you can find in the NEST source package, or at http://www.nest-simulator.org/installation/.

Now try it out:

$ cd ~
$ python
>>> import nest

-- N E S T --

Copyright (C) 2004 The NEST Initiative
Version 2.14.0 Nov 21 2017 11:05:28

...
>>> nest.Models()
(u'ac_generator', u'aeif_cond_alpha', u'aeif_cond_alpha_RK5', u'aeif_cond_alpha_multisynapse',
...

Check that 'aeif_cond_alpha' is in the list of models. If it is not, you may need to install a newer version of the GNU Scientific Library and then recompile NEST.

2.4 Installing Brian

Instructions for downloading and installing Brian are available from http://briansimulator.org/download/. Note that this version of PyNN works with Brian 1.4, but not with Brian 2.


CHAPTER 3

Quickstart

to write...


CHAPTER 4

Building networks

4.1 Building networks: neurons

4.1.1 Cell types

In PyNN, the system of equations that defines a neuronal model is encapsulated in a CellType class. PyNN provides a library of “standard” cell types (see Standard models) which work the same across all backend simulators - an example is the IF_cond_exp model - an integrate-and-fire (I&F) neuron with conductance-based, exponential-decay synapses. For any given simulator, it is also possible to wrap a native model - a NEST or NEURON model, for example - in a CellType class so that it works in PyNN (see documentation for individual Backends).

It should be noted that these “cell types” are mathematical cell types. Two or more different biological cell types may be represented by the same mathematical cell type, but with different parameterizations. For example, in the thalamocortical model of Destexhe (2009), thalamocortical relay neurons and cortical neurons are both modelled with the adaptive exponential I&F neuron model (AdExp):

>>> refractory_period = RandomDistribution('uniform', [2.0, 3.0], rng=NumpyRNG(seed=4242))
>>> ctx_parameters = {
...     'cm': 0.25, 'tau_m': 20.0, 'v_rest': -60, 'v_thresh': -50, 'tau_refrac': refractory_period,
...     'v_reset': -60, 'v_spike': -50.0, 'a': 1.0, 'b': 0.005, 'tau_w': 600, 'delta_T': 2.5,
...     'tau_syn_E': 5.0, 'e_rev_E': 0.0, 'tau_syn_I': 10.0, 'e_rev_I': -80}
>>> tc_parameters = ctx_parameters.copy()
>>> tc_parameters.update({'a': 20.0, 'b': 0.0})

>>> thalamocortical_type = EIF_cond_exp_isfa_ista(**tc_parameters)
>>> cortical_type = EIF_cond_exp_isfa_ista(**ctx_parameters)

(see Model parameters and initial values for more on specifying parameter values). To see the list of parameter names for a given cell type, use the get_parameter_names() method:


>>> IF_cond_exp.get_parameter_names()
['tau_refrac', 'cm', 'tau_syn_E', 'v_rest', 'tau_syn_I', 'tau_m', 'e_rev_E', 'i_offset', 'e_rev_I', 'v_thresh', 'v_reset']

while the default values for the parameters are in the default_parameters attribute:

>>> print(IF_cond_exp.default_parameters)
{'tau_refrac': 0.1, 'cm': 1.0, 'tau_syn_E': 5.0, 'v_rest': -65.0, 'tau_syn_I': 5.0, 'tau_m': 20.0, 'e_rev_E': 0.0, 'i_offset': 0.0, 'e_rev_I': -70.0, 'v_thresh': -50.0, 'v_reset': -65.0}

Note that what we have created here are neuron type objects. These can be regarded as templates, from which we will construct the actual neurons in our network.

4.1.2 Populations

Since PyNN is designed for modelling networks containing many neurons, the default level of abstraction in PyNN is not the single neuron but a population of neurons of a given type, represented by the Population class:

>>> tc_cells = Population(100, thalamocortical_type)
>>> ctx_cells = Population(500, cortical_type)

To create a Population, we need to specify at minimum the number of neurons and the cell type. Three additional arguments may optionally be specified:

• the spatial structure of the population;

• initial values for the neuron state variables;

• a label.

>>> from pyNN.space import Grid2D, RandomStructure, Sphere
>>> tc_cells = Population(100, thalamocortical_type,
...                       structure=RandomStructure(boundary=Sphere(radius=200.0)),
...                       initial_values={'v': -70.0},
...                       label="Thalamocortical neurons")
>>> from pyNN.random import RandomDistribution
>>> v_init = RandomDistribution('uniform', (-70.0, -60.0))
>>> ctx_cells = Population(500, cortical_type,
...                        structure=Grid2D(dx=10.0, dy=10.0),
...                        initial_values={'v': v_init},
...                        label="Cortical neurons")

(see Representing spatial structure and calculating distances for more detail on spatial structure and Model parameters and initial values for more on specifying initial values.)

For backwards compatibility and for ease of transitioning from other simulator languages, the create() function isavailable as an alias for Population. The following two lines are equivalent:

>>> cells = create(my_cell_type, n=100)
>>> cells = Population(100, my_cell_type)

(Note the different argument order).


4.1.3 Views

It is common to work with only a subset of the neurons in a Population - to modify their parameters, make connections or record from them. Any subset of neurons in a population may be addressed using the usual Python indexing and slicing notation, for example:

>>> id = ctx_cells[47]           # the 48th neuron in a Population
>>> view = ctx_cells[:80]        # the first eighty neurons
>>> view = ctx_cells[::2]        # every second neuron
>>> view = ctx_cells[45, 91, 7]  # a specific set of neurons

It is also possible to address a random sample of neurons within a population using the sample() method:

>>> view = ctx_cells.sample(50, rng=NumpyRNG(seed=6538))  # select 50 neurons at random

In the first of these examples, the object that is returned is an ID object, representing a single neuron. ID objects are discussed below.

In all of these examples except the first, the object that is returned is a PopulationView. A PopulationView holds references to a subset of neurons in a Population, which means that any changes in the view are also reflected in the real population (and vice versa).

PopulationView objects behave in most ways as real Population objects; notably, they can be used in a Projection (see Building networks: connections) and combined with other Population or PopulationView objects to create an Assembly.

The parent attribute of a PopulationView has a reference to the Population that is being viewed, and the mask attribute contains the indices of the neurons that are in the view.

>>> view.parent.label
'Cortical neurons'
>>> view.mask
array([150, 181,  53, 149, 496, 499, 240, 444,  13, 100,  28,  19, 101,
       122, 143, 486, 467, 492, 406,  90, 136, 173,   8, 341,   5, 348,
       188,  63, 129, 416, 307, 298,  60, 180, 382,  47, 484, 370, 223,
       147,  72,  32, 261, 193, 249, 212,  58,  87,  86, 456])

4.1.4 Assemblies

As discussed above, a Population is a homogeneous collection of neurons, in the sense that all neurons have the same cell type. An Assembly is an aggregate of Population and PopulationView objects, and as such can represent a heterogeneous collection of neurons, of multiple cell types.

An Assembly can be created by adding together Population and PopulationView objects:

>>> all_cells = tc_cells + ctx_cells
>>> cells_for_plotting = tc_cells[:10] + ctx_cells[:50]

or by using the Assembly constructor:

>>> all_cells = Assembly(tc_cells, ctx_cells)

An assembly behaves in most ways like a Population, e.g. for setting and retrieving parameters, specifying which neurons to record from, etc. It can also be specified as the source or target of a Projection. In this case, all the neurons in the component populations are treated as identical for the purposes of the connection algorithm (note that if the post-synaptic receptor type is specified (with the receptor_type argument), an Exception will be raised if not all component neuron types possess that receptor type).

Individual populations within an Assembly may be accessed via their labels, e.g.:

>>> all_cells.get_population("Thalamocortical neurons")
Population(100, EIF_cond_exp_isfa_ista(<parameters>), structure=RandomStructure(origin=(0.0, 0.0, 0.0), boundary=Sphere(radius=200.0), rng=NumpyRNG(seed=None)), label='Thalamocortical neurons')

Iterating over an assembly returns individual IDs, ordered by population. Similarly, the size attribute of an Assembly gives the total number of neurons it contains. To iterate over or count populations, use the populations attribute:

>>> for p in all_cells.populations:
...     print("%-23s %4d %s" % (p.label, p.size, p.celltype.__class__.__name__))
Thalamocortical neurons  100 EIF_cond_exp_isfa_ista
Cortical neurons         500 EIF_cond_exp_isfa_ista

4.1.5 Inspecting and modifying parameter values and initial conditions

Although both parameter values and initial conditions may be specified when creating a Population (and this is generally the most efficient place to do it), it is also possible to modify them later.

The get() method of Population, PopulationView and Assembly returns the current value(s) of one or more parameters:

>>> ctx_cells.get('tau_m')
20.0
>>> all_cells[0:10].get('v_reset')
-60.0

If the parameter is homogeneous across the group, a single number will be returned, otherwise get() will return a NumPy array containing the parameter values for all neurons:

>>> ctx_cells.get('tau_refrac')
array([ 2.64655001,  2.15914942,  2.53500179, ...

It is also possible to ask for multiple parameters at once, in which case a list of values in the same order as the list of parameter names will be returned.

>>> ctx_cells.get(['tau_m', 'cm'])
[20.0, 0.25]

When running a distributed simulation using MPI, get() will by default return values for only those neurons that exist on the current MPI node. To get the values for all neurons, use get(parameter_name, gather=True).

To modify parameter values, use the set() method. To set the same value for all neurons, pass a single number as the parameter value:

>>> ctx_cells.set(a=2.0, b=0.2)

To set different values for different neurons there are several options - see Model parameters and initial values for more details.

To modify the initial values of model variables, use the initialize() method:


>>> ctx_cells.initialize(v=RandomDistribution('normal', (-65.0, 2.0)),
...                      w=0.0)

The default initial values may be inspected as follows:

>>> ctx_cells.celltype.default_initial_values
{'gsyn_exc': 0.0, 'gsyn_inh': 0.0, 'w': 0.0, 'v': -70.6}

4.1.6 Injecting current into neurons

Static or time-varying currents may be injected into neurons using either the inject_into() method of the CurrentSource:

>>> pulse = DCSource(amplitude=0.5, start=20.0, stop=80.0)
>>> pulse.inject_into(tc_cells)

or the inject() method of the Population, PopulationView or Assembly:

>>> import numpy
>>> times = numpy.arange(0.0, 100.0, 1.0)
>>> amplitudes = 0.1*numpy.sin(times*numpy.pi/100.0)
>>> sine_wave = StepCurrentSource(times=times, amplitudes=amplitudes)
>>> ctx_cells[80:90].inject(sine_wave)

See Injecting current for more about injecting currents.

4.1.7 Recording variables and retrieving recorded data

Just as each cell type has a well-defined set of parameters (whose values are constant over time), so it has a well-defined set of state variables, such as the membrane potential, whose values change over the course of a simulation. The recordable attribute of a CellType class contains a list of these variables, as well as the ‘spikes’ variable, which is used to record the times of action potentials:

>>> ctx_cells.celltype.recordable
['spikes', 'v', 'w', 'gsyn_exc', 'gsyn_inh']

The record() method specifies which variables should be recorded:

>>> all_cells.record('spikes')
>>> ctx_cells.sample(10).record(('v', 'w'))  #, sampling_interval=0.2)

Note that the sampling interval must be an integer multiple of the simulation time step (except for simulators which allow use of variable time-step integration methods).
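The integer-multiple constraint can be checked before calling record(). The helper below is a sketch, not part of the PyNN API; it simply tests whether the sampling interval divides evenly by the time step, within a floating-point tolerance:

```python
def is_valid_sampling_interval(sampling_interval, dt, tol=1e-9):
    """Return True if sampling_interval is an integer multiple of the
    simulation time step dt (within a floating-point tolerance)."""
    ratio = sampling_interval / dt
    return abs(ratio - round(ratio)) < tol
```

For example, with dt = 0.1 ms, a sampling interval of 0.2 ms is valid but 0.25 ms is not.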

Todo: discuss specifying filename in record()

At the end of a simulation, the recorded data can be retrieved using the get_data() method:

>>> t = run(0.2)
>>> data_block = all_cells.get_data()

or written to file using write_data():


>>> from neo.io import NeoHdf5IO
>>> h5file = NeoHdf5IO("my_data.h5")
>>> ctx_cells.write_data(h5file)
>>> h5file.close()

get_data() returns a Neo Block object. For more information on Neo see the documentation at http://packages.python.org/neo. Here, it will suffice to note that a Block is the top-level container, and contains one or more Segments. Each Segment is a container for data that share a common time basis, and can contain lists of AnalogSignal and SpikeTrain objects. These data objects inherit from NumPy arrays, and so can be treated in further processing (analysis, visualization, etc.) in exactly the same way as plain arrays, but in addition they carry metadata about units, sampling interval, etc.

write_data() also makes use of Neo, and allows writing to any of the several output file formats supported by Neo. Note that as a short-cut, you can just give a filename to write_data(); the output format will then be determined based on the filename extension (‘.h5’ for HDF5, ‘.txt’ for ASCII, etc.) if possible, otherwise the default file format (determined by the value of pyNN.recording.DEFAULT_FILE_FORMAT) will be used.
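The extension-based format selection can be sketched in plain Python. The mapping below is illustrative only (the real lookup lives inside PyNN's recording machinery, and the '.pkl' entry and the default value are assumptions for this sketch):

```python
import os

# Illustrative extension-to-format table; the actual table inside PyNN may differ.
EXTENSION_MAP = {'.h5': 'hdf5', '.txt': 'ascii', '.pkl': 'pickle'}
DEFAULT_FILE_FORMAT = 'pickle'  # stand-in for pyNN.recording.DEFAULT_FILE_FORMAT

def guess_format(filename):
    """Choose an output format from the filename extension,
    falling back to the default format when the extension is unknown."""
    ext = os.path.splitext(filename)[1]
    return EXTENSION_MAP.get(ext, DEFAULT_FILE_FORMAT)
```

So "my_data.h5" would select HDF5, while an unrecognised extension falls back to the default.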

For more details, see Data handling.

4.1.8 Working with individual neurons

Although it is usually more convenient and more efficient to work with populations of neurons, it is occasionally convenient to work with individual neurons, represented as an ID object:

>>> tc_cells[47]
709

For the simulator backends shipped with PyNN, the ID class is a subclass of int, and in the case of NEURON and NEST matches the global ID (gid) used internally by the simulator. There is no requirement that IDs be integers, however, nor that they have the same value across different simulators.

The parent attribute contains a reference to the parent population:

>>> a_cell = tc_cells[47]
>>> a_cell.parent.label
'Thalamocortical neurons'

To recover the index of a neuron within its parent given the ID, use Population.id_to_index(), e.g.:

>>> tc_cells.id_to_index(a_cell)
47

The ID object allows direct access to the parameters of individual neurons, e.g.:

>>> a_cell.tau_m
20.0

To change several parameters at once for a single neuron, use the set_parameters() method:

>>> a_cell.set_parameters(tau_m=10.0, cm=0.5)
>>> a_cell.tau_m
10.0
>>> a_cell.cm
0.5


4.2 Building networks: connections

Conceptually, a synapse consists of a pre-synaptic structure, the synaptic cleft, and a post-synaptic structure. In PyNN, the temporal dynamics of the post-synaptic response are handled by the post-synaptic neuron model (see Cell types). The size of the post-synaptic response (the “synaptic weight”), the temporal dynamics of the weight (synaptic plasticity) and the connection delay are handled by synapse models.

At the time of writing, most neuronal network models do not explicitly model the axon. Rather, the time for propagation of the action potential from soma/initial segment to axon terminal is added to the synaptic transmission time to give a composite delay, referred to as “synaptic delay” in this documentation. For point neuron models, which do not include an explicit model of the dendrite, the time for transmission of the post-synaptic potential to the soma may also be considered as being included in the composite synaptic delay.
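As a concrete illustration of this composite delay (the numbers below are made up for the example, not taken from any model):

```python
# All times in milliseconds; values are illustrative only.
axonal_propagation = 0.8      # soma/initial segment to axon terminal
synaptic_transmission = 0.5   # across the synaptic cleft
dendritic_transmission = 0.2  # PSP to soma, lumped in for point neuron models

composite_delay = axonal_propagation + synaptic_transmission + dendritic_transmission
```

Here the single "delay" attribute of the connection would carry the total, 1.5 ms.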

At a minimum, therefore, a synaptic connection in PyNN has two attributes: “weight” and “delay”, which are interpreted as described above. Where the weight has its own dynamics, a connection may have more attributes: the plasticity model and its parameters.

Note: Currently, PyNN supports only chemical synapses, not electrical synapses. If the underlying simulator supports electrical synapses, it is still possible to use them in a PyNN model, but this will not be simulator-independent.

Note: Currently, PyNN does not support stochastic synapses. If you would like to have support for this, or any other feature, please make a feature request.

4.2.1 Synapse types

Analogously to neuron models, the system of equations that defines a synapse model is encapsulated in a SynapseType class. PyNN provides a library of “standard” synapse types (see Standard models) which work the same across all backend simulators.

Fixed synaptic weight

The simplest, and default synapse type in PyNN has constant synaptic weight:

syn = StaticSynapse(weight=0.04, delay=0.5)

Note: weights are in microsiemens or nanoamps, depending on whether the post-synaptic mechanism implements a change in conductance or current, and delays are in milliseconds (see Units).

It is also possible to add variability to synaptic weights and delays by specifying a RandomDistribution object as the parameter value:

w = RandomDistribution('gamma', [10, 0.004], rng=NumpyRNG(seed=4242))
syn = StaticSynapse(weight=w, delay=0.5)

It is also possible to specify parameters as a function of the distance (typically in microns, but different scales are possible - see Representing spatial structure and calculating distances) between pre- and post-synaptic neurons:

syn = StaticSynapse(weight=w, delay="0.2 + 0.01*d")
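To make the semantics of such distance expressions concrete, here is a sketch of how a string like "0.2 + 0.01*d" can be evaluated for a given pre/post distance d. This illustrates the meaning only; it is not PyNN's actual implementation:

```python
import math

def evaluate_distance_expression(expression, d):
    """Evaluate a distance expression for a distance d.
    Boolean results (e.g. from "d<3") are converted to 1.0/0.0,
    mirroring the behaviour described in the text."""
    value = eval(expression, {"d": d, "exp": math.exp, "__builtins__": {}})
    return float(value)
```

For example, evaluate_distance_expression("0.2 + 0.01*d", 30.0) gives a delay of 0.5 ms for neurons 30 units apart.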


Short-term synaptic plasticity

PyNN currently provides one standard model for short-term synaptic plasticity (facilitation and depression):

depressing_synapse = TsodyksMarkramSynapse(weight=w, delay=0.2, U=0.5,
                                           tau_rec=800.0, tau_facil=0.0)

tau_rec = RandomDistribution('normal', [100.0, 10.0])
facilitating_synapse = TsodyksMarkramSynapse(weight=w, delay=0.5, U=0.04,
                                             tau_rec=tau_rec)

Spike-timing-dependent plasticity

STDP models are specified in a slightly different way than other standard models: an STDP synapse type is constructed from separate weight-dependence and timing-dependence components, e.g.:

stdp = STDPMechanism(weight=0.02,  # this is the initial value of the weight
                     delay="0.2 + 0.01*d",
                     timing_dependence=SpikePairRule(tau_plus=20.0, tau_minus=20.0,
                                                     A_plus=0.01, A_minus=0.012),
                     weight_dependence=AdditiveWeightDependence(w_min=0, w_max=0.04))

Note that not all simulators will support all possible combinations of synaptic plasticity components.

4.2.2 Connection algorithms

In PyNN, each different algorithm that can be used to determine which pre-synaptic neurons are connected to which post-synaptic neurons (also called a “connection method” or “wiring method”) is encapsulated in a separate class.

Note: for those interested in design patterns, this is an example of the Strategy Pattern

Each such class inherits from a base class, Connector, and must implement a connect() method which takes a Projection object (see below) as its single argument.

PyNN’s library of connection algorithms currently contains the following classes:

All-to-all connections

Each neuron in the pre-synaptic population is connected to every neuron in the post-synaptic population. (In this section, the term “population” should be understood as referring to any of the following: a Population, a PopulationView, or an Assembly object.)

The AllToAllConnector constructor has one optional argument, allow_self_connections, for use when connecting a population to itself. By default it is True, but if a neuron should not connect to itself, set it to False, e.g.:

connector = AllToAllConnector(allow_self_connections=False) # no autapses


One-to-one connections

Use of the OneToOneConnector requires that the pre- and post-synaptic populations have the same size. The neuron with index i in the pre-synaptic population is then connected to the neuron with index i in the post-synaptic population.

connector = OneToOneConnector()

Trying to connect two populations with different sizes will raise an Exception.

Connecting neurons with a fixed probability

With the FixedProbabilityConnector method, each possible connection between all pre-synaptic neurons and all post-synaptic neurons is created with probability p_connect:

connector = FixedProbabilityConnector(p_connect=0.2)
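Conceptually, this is one independent Bernoulli trial per candidate pre/post pair. A minimal NumPy sketch of that idea (illustrative only, not PyNN code):

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n_pre, n_post, p_connect = 100, 80, 0.2

# one Bernoulli(p_connect) trial per candidate (pre, post) pair
mask = rng.random((n_pre, n_post)) < p_connect

# the realised connection fraction should be close to p_connect
fraction = mask.mean()
```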

Connecting neurons with a position-dependent probability

The connection probability can also depend on the positions of the pre- and post-synaptic neurons.

With the DistanceDependentProbabilityConnector, the connection probability depends on the distance between the two neurons.

The constructor requires a string d_expression, which should be a distance expression, as described above for delays, but returning a probability (a value between 0 and 1):

DDPC = DistanceDependentProbabilityConnector
connector = DDPC("exp(-d)")
connector = DDPC("d<3")

The first example connects neurons with an exponentially-decaying probability. The second example connects each neuron to all its neighbours within a range of 3 units (typically interpreted as µm, but this is up to the individual user). Note that boolean values True and False are automatically converted to numerical values 1.0 and 0.0.

Calculation of distance may be controlled by specifying a Space object, passed to the Projection constructor (see below).

For a more general dependence of connection probability on position, use the IndexBasedProbabilityConnector, which expects a function of the indices, i and j, of the pre- and post-synaptic neurons. The function should return the probability of creating that connection.
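As an illustration of the idea, here is a plain-NumPy sketch with a hypothetical probability function (the function name and the decay rule are invented for this example; this is not the PyNN implementation):

```python
import numpy as np

def connection_prob(i, j):
    # hypothetical rule: probability decays with the difference of indices
    return np.exp(-np.abs(i - j) / 10.0)

rng = np.random.default_rng(seed=1)
n_pre = n_post = 50
# index grids for every (pre, post) pair
i, j = np.meshgrid(np.arange(n_pre), np.arange(n_post), indexing='ij')
mask = rng.random((n_pre, n_post)) < connection_prob(i, j)
```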

Divergent/fan-out connections

The FixedNumberPostConnector connects each pre-synaptic neuron to exactly n post-synaptic neurons chosen at random:

connector = FixedNumberPostConnector(n=30)

If n is less than the size of the post-synaptic population, there are no multiple connections, i.e., no instances of the same pair of neurons being multiply connected. If n is greater than the size of the post-synaptic population, all possible single connections are made before starting to add duplicate connections.
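The sampling rule described above can be sketched as follows (`choose_post` is a hypothetical helper written for illustration, not part of the PyNN API):

```python
import numpy as np

def choose_post(n, n_post, rng):
    """Pick n post-synaptic targets for one pre-synaptic neuron:
    sample without replacement until the population is exhausted,
    then start again, so duplicates only appear once all single
    connections have been made."""
    full_rounds, remainder = divmod(n, n_post)
    targets = []
    for _ in range(full_rounds):
        targets.extend(rng.permutation(n_post))
    targets.extend(rng.choice(n_post, size=remainder, replace=False))
    return np.array(targets)

rng = np.random.default_rng(seed=0)
few = choose_post(30, 100, rng)    # n < population size: all targets unique
many = choose_post(150, 100, rng)  # n > population size: duplicates allowed
```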

The number of post-synaptic neurons n can be fixed, or can be chosen at random from a RandomDistribution object, e.g.:


distr_npost = RandomDistribution(distribution='binomial', n=100, p=0.3)
connector = FixedNumberPostConnector(n=distr_npost)

Convergent/fan-in connections

The FixedNumberPreConnector has the same arguments as FixedNumberPostConnector, but of course it connects each post-synaptic neuron to n pre-synaptic neurons, e.g.:

connector = FixedNumberPreConnector(5)
distr_npre = RandomDistribution(distribution='poisson', lambda_=5)
connector = FixedNumberPreConnector(distr_npre)

Creating a small-world network

Todo: Pierre to write this bit?

Using the Connection Set Algebra

The Connection Set Algebra (Djurfeldt, 2012) is a sophisticated system that allows elaborate connectivity patterns to be constructed using a concise syntax. Using the CSA requires the Python csa module to be installed (see Installation).

The details of constructing a connection set are beyond the scope of this manual. We give here a simple example.

import csa
cset = csa.full - csa.oneToOne
connector = CSAConnector(cset)

csa.full represents all-to-all connections, while csa.oneToOne represents the connection of pre-synaptic neuron i to post-synaptic neuron i. By subtracting the second from the first, the connection rule is “all-to-all, except where the neurons have the same index”. If the pre- and post-synaptic populations are the same population, this is equivalent to AllToAllConnector(allow_self_connections=False).
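The set-algebra semantics can be mimicked in a few lines of plain Python (an illustration of the idea only, not using the csa module):

```python
# "full" and "oneToOne" as explicit sets of (pre, post) index pairs
n = 4
full = {(i, j) for i in range(n) for j in range(n)}
one_to_one = {(i, i) for i in range(n)}

# all-to-all, except where the neurons have the same index
cset = full - one_to_one
```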

Todo: explain that weights and delays can either be specified within the connection set or within the synapse type.

Specifying a list of connections

Specific connection patterns not covered by the methods above can be obtained by specifying an explicit list of pre-synaptic and post-synaptic neuron indices. Optionally, the list can contain synaptic properties such as weights, delays, or the parameters for plasticity rules. Example:

connections = [(0, 0, 0.0, 0.1),
               (0, 1, 0.0, 0.1),
               (0, 2, 0.0, 0.1),
               (1, 5, 0.0, 0.1)]
connector = FromListConnector(connections, column_names=["weight", "delay"])


Any synaptic parameters not given in the list are determined from the synapse type. Parameters given in the list always override the values from the synapse type.

Reading connection patterns to/from a file

Connection patterns can be read in from a text file. The file should contain a header specifying which parameter is in which column, e.g.:

# columns = ["i", "j", "weight", "delay", "U", "tau_rec"]

and then the connection data should be in columns separated by spaces. The connections are read using:

connector = FromFileConnector("connections.txt")
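For illustration, a header line in this format can be parsed with the standard library (`parse_header` is a hypothetical helper written for this example, not part of PyNN):

```python
import ast

def parse_header(line):
    """Extract the column names from a FromFileConnector-style header line."""
    prefix = "# columns ="
    assert line.startswith(prefix)
    # the remainder is a Python list literal
    return ast.literal_eval(line[len(prefix):].strip())

columns = parse_header('# columns = ["i", "j", "weight", "delay", "U", "tau_rec"]')
```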

Specifying an explicit connection matrix

The connectivity can be specified as a boolean array, where each row represents the existence of connections from a given pre-synaptic neuron to the post-synaptic neurons. For example:

connections = numpy.array([[0, 1, 1, 0],
                           [1, 1, 0, 1],
                           [0, 0, 1, 0]],
                          dtype=bool)
connector = ArrayConnector(connections)

User-defined connection algorithms

If you wish to use a specific connection/wiring algorithm not covered by the PyNN built-in ones, the options include:

• constructing a list or array of connections and using the FromListConnector or ArrayConnector class;

• using the Connection Set Algebra and the CSAConnector class;

• writing your own Connector class - see the Developers’ guide for guidance on this.

4.2.3 Projections

A Projection is a container for a set of connections between two populations of neurons, where by population we mean one of:

• a Population object - a group of neurons all of the same type;

• a PopulationView object - part of a Population;

• an Assembly - a heterogeneous group of neurons, which may be of different types.

Creating a Projection in PyNN also creates the connections at the level of the simulator. To create a Projection we must specify:

• the pre-synaptic population;

• the post-synaptic population;

• a connection/wiring method;

• a synapse type.


Optionally, we can also specify:

• the name of the post-synaptic mechanism (e.g. ‘excitatory’, ‘NMDA’) (by default, this is ‘excitatory’);

• a label (autogenerated if not specified);

• a Space object, which determines how distances should be calculated for distance-dependent wiring schemes or parameter values.

Here is a minimal example:

excitatory_connections = Projection(pre, post, AllToAllConnector(),
                                    StaticSynapse(weight=0.123))

and here is a full example:

rng = NumpyRNG(seed=64754)
sparse_connectivity = FixedProbabilityConnector(0.1, rng=rng)
weight_distr = RandomDistribution('normal', [0.01, 1e-3], rng=rng)
facilitating = TsodyksMarkramSynapse(U=0.04, tau_rec=100.0, tau_facil=1000.0,
                                     weight=weight_distr,
                                     delay=lambda d: 0.1 + d/100.0)
space = Space(axes='xy')
inhibitory_connections = Projection(pre, post,
                                    connector=sparse_connectivity,
                                    synapse_type=facilitating,
                                    receptor_type='inhibitory',
                                    space=space,
                                    label="inhibitory connections")

Note that the attribute receptor_types of all cell type classes contains a list of the possible values of receptor_type for that cell type:

>>> post
Population(10, IF_cond_exp(<parameters>), structure=Line(y=0.0, x0=0.0, z=0.0, dx=1.0), label='population1')
>>> post.celltype
IF_cond_exp(<parameters>)
>>> post.celltype.receptor_types
('excitatory', 'inhibitory')

The space argument is used to specify how to calculate distances, since we have used a distance expression to specify the connection delay, modelling a constant axonal propagation speed.

By default, the 3D distance between cell positions is used, but the axes argument may be used to change this, i.e.:

space = Space(axes='xy')

will ignore the z-coordinate when calculating distance. Similarly, the origins of the coordinate systems of the two populations and the relative scale of the two coordinate systems may be controlled using the offset and scale_factor arguments to the Space constructor. This is useful when connecting brain regions that have very different sizes but that have a topographic mapping between them, e.g. retina to LGN to V1.

In more abstract models, it is often useful to be able to avoid edge effects by specifying periodic boundary conditions, e.g.:

space = Space(periodic_boundaries=((0,500), (0,500), None))

calculates distance on the surface of a torus of circumference 500 µm (wrap-around in the x- and y-dimensions but not z). For more information, see Representing spatial structure and calculating distances.
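The wrap-around distance can be sketched in plain NumPy (an illustration of what the periodic boundaries imply; the actual calculation is done by the Space class):

```python
import numpy as np

def periodic_distance(p1, p2, boundaries=((0, 500), (0, 500), None)):
    """Euclidean distance with wrap-around in the dimensions that
    have (min, max) boundaries; None means no wrap-around."""
    d = np.abs(np.asarray(p1, dtype=float) - np.asarray(p2, dtype=float))
    for k, b in enumerate(boundaries):
        if b is not None:
            period = b[1] - b[0]
            d[k] = min(d[k], period - d[k])  # take the shorter way round
    return float(np.sqrt(np.sum(d ** 2)))

d1 = periodic_distance((10.0, 0.0, 0.0), (490.0, 0.0, 0.0))  # wraps in x
d2 = periodic_distance((0.0, 0.0, 0.0), (0.0, 0.0, 100.0))   # z does not wrap
```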


Accessing weights and delays

The Projection.get() method allows the retrieval of connection attributes, such as weights and delays. Two formats are available: 'list' returns a list of length equal to the number of connections in the projection, while 'array' returns a 2D weight array (with NaN for non-existent connections):

>>> excitatory_connections.get('weight', format='list')[3:7]
[(3, 0, 0.123), (4, 0, 0.123), (5, 0, 0.123), (6, 0, 0.123)]
>>> inhibitory_connections.get('delay', format='array')[:3,:5]
array([[  nan,   nan,   nan,   nan,  0.14],
       [  nan,   nan,   nan,  0.12,  0.13],
       [ 0.12,   nan,   nan,   nan,   nan]])

To suppress the coordinates of the connection in the 'list' view, set the with_address option to False:

>>> excitatory_connections.get('weight', format='list', with_address=False)[3:7]
[0.123, 0.123, 0.123, 0.123]

As well as weight and delay, Projection.get() can also retrieve any other parameters of synapse models:

>>> inhibitory_connections.get('U', format='list')[0:4]
[(2, 0, 0.04), (6, 1, 0.04), (8, 1, 0.04), (9, 2, 0.04)]

It is also possible to retrieve the values of multiple attributes at once, as either a list of tuples or a tuple of arrays:

>>> connection_data = inhibitory_connections.get(['weight', 'delay'], format='list')
>>> for connection in connection_data[:5]:
...     src, tgt, w, d = connection
...     print("weight = %.4f delay = %4.2f" % (w, d))
weight = 0.0094 delay = 0.12
weight = 0.0113 delay = 0.15
weight = 0.0102 delay = 0.17
weight = 0.0097 delay = 0.17
weight = 0.0127 delay = 0.12
>>> weights, delays = inhibitory_connections.get(['weight', 'delay'], format='array')
>>> exists = ~numpy.isnan(weights)
>>> for w, d in zip(weights[exists].flat, delays[exists].flat)[:5]:
...     print("weight = %.4f delay = %4.2f" % (w, d))
weight = 0.0097 delay = 0.14
weight = 0.0127 delay = 0.12
weight = 0.0097 delay = 0.13
weight = 0.0094 delay = 0.18
weight = 0.0094 delay = 0.12

Note that in this last example we have filtered out the non-existent connections using numpy.isnan().
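To illustrate the relationship between the two formats, a 'list'-format result can be turned into an 'array'-format matrix (with NaN marking absent connections) in plain NumPy; the data values here are invented for the example:

```python
import numpy as np

# a few connections in 'list' format: (pre_index, post_index, weight)
conn_list = [(3, 0, 0.123), (4, 0, 0.123), (5, 1, 0.5)]
n_pre, n_post = 6, 2

# start from an all-NaN matrix and fill in the existing connections
weights = np.full((n_pre, n_post), np.nan)
for src, tgt, w in conn_list:
    weights[src, tgt] = w

exists = ~np.isnan(weights)  # mask of existing connections
```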

The Projection.save() method saves connection attributes to disk.

Todo: finish documenting the save() method (also decide whether it should be write() or save()); need to think about formats: text, HDF5, …

Access to the weights and delays of individual connections is via the connections attribute, e.g.:

>>> list(inhibitory_connections.connections)[0].weight
0.0094460775218037779
>>> list(inhibitory_connections.connections)[10].weight
0.0086313719119562281


Modifying weights and delays

As noted above, weights, delays and other connection attributes can be specified on creation of a Projection, and this is generally the most efficient time to specify them. It is also possible, however, to modify these attributes after creation, using the set() method.

set() accepts any number of keyword arguments, where the key is the attribute name, and the value is either:

• a numeric value (all connections will be set to the same value);

• a RandomDistribution object (each connection will be set to a different value, drawn from the distribution);

• a list or NumPy array of the same length as the number of connections in the Projection;

• a generator;

• a string expressing a function of the distance between pre- and post-synaptic neurons.

Todo: clarify whether this is the number of local connections or the total number of connections.

Some examples:

>>> excitatory_connections.set(weight=0.02)
>>> excitatory_connections.set(weight=RandomDistribution('gamma', [1, 0.1]),
...                            delay=0.3)
>>> inhibitory_connections.set(U=numpy.linspace(0.4, 0.6, len(inhibitory_connections)),
...                            tau_rec=500.0,
...                            tau_facil=0.1)

It is also possible to access the attributes of individual connections using the connections attribute of a Projection:

>>> for c in list(inhibitory_connections.connections)[:5]:
...     c.weight *= 2

although this is almost always less efficient than using list- or array-based access.

4.3 Representing spatial structure and calculating distances

The space module contains classes for specifying the locations of neurons in space and for calculating the distances between them.

Neuron positions can be defined either manually, using the positions attribute of a Population, or using a Structure instance which is passed to the Population constructor.

A number of different structures are available in space. It is simple to define your own Structure subclass if you need something that is not already provided.

The simplest structure is a grid, whether 1D, 2D or 3D, e.g.:

>>> from pyNN.space import *
>>> line = Line(dx=100.0, x0=0.0, y=200.0, z=500.0)
>>> line.generate_positions(7)
array([[   0.,  100.,  200.,  300.,  400.,  500.,  600.],
       [ 200.,  200.,  200.,  200.,  200.,  200.,  200.],
       [ 500.,  500.,  500.,  500.,  500.,  500.,  500.]])
>>> grid = Grid2D(aspect_ratio=3, dx=10.0, dy=25.0, z=-3.0)
>>> grid.generate_positions(3)
array([[  0.,  10.,  20.],
       [  0.,   0.,   0.],
       [ -3.,  -3.,  -3.]])
>>> grid.generate_positions(12)
array([[  0.,   0.,  10.,  10.,  20.,  20.,  30.,  30.,  40.,  40.,  50.,  50.],
       [  0.,  25.,   0.,  25.,   0.,  25.,   0.,  25.,   0.,  25.,   0.,  25.],
       [ -3.,  -3.,  -3.,  -3.,  -3.,  -3.,  -3.,  -3.,  -3.,  -3.,  -3.,  -3.]])

Here we have specified an x:y ratio of 3, so if we ask the grid to generate positions for 3 neurons, we get a 3x1 grid; for 12 neurons, a 6x2 grid; for 27 neurons, a 9x3 grid; etc.
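The layout rule can be sketched as follows (`grid_shape` is a hypothetical helper illustrating the arithmetic, not the Grid2D implementation):

```python
import math

def grid_shape(n, aspect_ratio):
    """Choose nx, ny such that nx * ny == n and nx / ny == aspect_ratio."""
    nx = int(round(math.sqrt(n * aspect_ratio)))
    ny = int(round(n / nx))
    return nx, ny
```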

By default, grid positions are filled sequentially, iterating first over the z dimension, then y, then x, but we can also fill the grid randomly:

>>> rgrid = Grid2D(aspect_ratio=1, dx=10.0, dy=10.0, fill_order='random',
...                rng=NumpyRNG(seed=13886))
>>> rgrid.generate_positions(9)
array([[ 10.,  10.,  10.,   0.,   0.,   0.,  20.,  20.,  20.],
       [  0.,  10.,  20.,  10.,  20.,   0.,   0.,  10.,  20.],
       [  0.,   0.,   0.,   0.,   0.,   0.,   0.,   0.,   0.]])

The space module also provides the RandomStructure class, which distributes neurons randomly and uniformly within a given volume:

>>> glomerulus = RandomStructure(boundary=Sphere(radius=200.0), rng=NumpyRNG(seed=34534))
>>> glomerulus.generate_positions(5)
array([[ -19.78455022,   33.21412264,  -79.4314059 ,  143.39033263,  -63.18242977],
       [  56.17281502,  -23.15159309,  131.89071845,  -73.73583484,   -8.86422999],
       [ -78.88348228,   -3.97408513,  -95.03056844,   45.13969087, -111.67070498]])

The volume classes currently available are Sphere and Cuboid.

Defining your own Structure classes is straightforward: just inherit from BaseStructure and implement a generate_positions() method:

class MyStructure(BaseStructure):
    parameter_names = ("spam", "eggs")

    def __init__(self, spam=3, eggs=1):
        ...

    def generate_positions(self, n):
        ...
        # must return a 3xn numpy array

To define your own Shape class for use with RandomStructure, subclass Shape and implement a sample()method:

class Tetrahedron(Shape):

    def __init__(self, side_length):
        ...

    def sample(self, n, rng):
        ...
        # must return a nx3 numpy array

Note: rotation of structures is currently missing, but is planned for a future release.


CHAPTER 5

Injecting current

Current waveforms are represented in PyNN by CurrentSource classes. There are four built-in source types, and it is straightforward to implement your own.

There are two ways to inject a current waveform into the cells of a Population, PopulationView or Assembly: either the inject_into() method of the CurrentSource or the inject() method of the Population, Assembly, etc.

>>> pulse = DCSource(amplitude=0.5, start=20.0, stop=80.0)
>>> pulse.inject_into(population[3:7])

>>> sine = ACSource(start=50.0, stop=450.0, amplitude=1.0, offset=1.0,
...                 frequency=10.0, phase=180.0)
>>> population.inject(sine)


>>> steps = StepCurrentSource(times=[50.0, 110.0, 150.0, 210.0],
...                           amplitudes=[0.4, 0.6, -0.2, 0.2])
>>> steps.inject_into(population[(6, 11, 27)])

>>> noise = NoisyCurrentSource(mean=1.5, stdev=1.0, start=50.0, stop=450.0, dt=1.0)
>>> population.inject(noise)

For a full description of all the built-in current source classes, see the API reference.


Todo: write “implementing-your-own-current-source” (e.g., implement “chirp”)


CHAPTER 6

Recording spikes and state variables

It is possible to record the times of action potentials, and the values of state variables, of any neuron in the network. Recording state variables of dynamic synapse models is not yet supported.

The classes Population, PopulationView and Assembly all have a record() method, which takes either a single variable name or a list/tuple of such names, and which sets up recording of the requested variables for all neurons in the population:

>>> population.record(['v', 'spikes'])  # record membrane potential and spikes from all neurons in the population
>>> assembly.record('spikes')           # record spikes from all neurons in multiple populations

To record from only a subset of the neurons in a Population, we create a temporary PopulationView using indexing or the sample() method and call record() on this view:

>>> population.sample(10).record('v')                            # record membrane potential from 10 neurons chosen at random
>>> population[[0, 1, 2]].record(['v', 'gsyn_exc', 'gsyn_inh'])  # record several variables from specific neurons

To find out what variable names are available for a given neuron model, inspect the recordable attribute of the population’s celltype attribute:

>>> population.celltype
EIF_cond_alpha_isfa_ista(<parameters>)
>>> population.celltype.recordable
['spikes', 'v', 'w', 'gsyn_exc', 'gsyn_inh']

By default, variables are recorded at every time step. It is possible to record at a lower frequency using the sampling_interval argument, e.g.:

>>> population.record(None)  # reset recording for this population
>>> population.record('v', sampling_interval=1.0)


You should ensure that the sampling interval is an integer multiple of the simulation time step. Other values may work, but have not been tested.
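A defensive check for this constraint might look like the following (a sketch for your own scripts; PyNN does not require it):

```python
def is_integer_multiple(sampling_interval, timestep, tol=1e-9):
    """True if sampling_interval is (within tolerance) an integer
    multiple of timestep, allowing for floating-point rounding."""
    ratio = sampling_interval / timestep
    return abs(ratio - round(ratio)) < tol
```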

An alternative syntax is available, using the top-level record() function:

>>> record(['v', 'spikes'], population, filename="output_data.pkl", sampling_interval=1.0)

This avoids having to call population.write_data() at the end of the simulation; the data will be automatically written to the specified filename.


CHAPTER 7

Data handling

Todo: add a note that data handling has changed considerably since 0.7, and give link to detailed changelog.

Recorded data in PyNN is always associated with the Population or Assembly from which it was recorded. Data may either be written to file, using the write_data() method, or retrieved as objects in memory, using get_data().

7.1 Retrieving recorded data

Handling of recorded data in PyNN makes use of the Neo package, which provides a common Python data model for neurophysiology data (whether real or simulated).

The get_data() method returns a Neo Block object. This is the top-level data container, which contains one or more Segments. Each Segment is a container for data sharing a common time basis - a new Segment is added every time the reset() function is called.

A Segment can contain lists of AnalogSignal and SpikeTrain objects. These data objects inherit from NumPy’s array class, and so can be treated in further processing (analysis, visualization, etc.) in exactly the same way as NumPy arrays, but in addition they carry metadata about units, sampling interval, etc.

Here is a complete example of recording and plotting data from a simulation:

import pyNN.neuron as sim  # can of course replace `neuron` with `nest`, `brian`, etc.
import matplotlib.pyplot as plt
import numpy as np

sim.setup(timestep=0.01)
p_in = sim.Population(10, sim.SpikeSourcePoisson(rate=10.0), label="input")
p_out = sim.Population(10, sim.EIF_cond_exp_isfa_ista(), label="AdExp neurons")

syn = sim.StaticSynapse(weight=0.05)
random = sim.FixedProbabilityConnector(p_connect=0.5)
connections = sim.Projection(p_in, p_out, random, syn, receptor_type='excitatory')

p_in.record('spikes')
p_out.record('spikes')                     # record spikes from all neurons
p_out[0:2].record(['v', 'w', 'gsyn_exc'])  # record other variables from first two neurons

sim.run(500.0)

spikes_in = p_in.get_data()
data_out = p_out.get_data()

fig_settings = {
    'lines.linewidth': 0.5,
    'axes.linewidth': 0.5,
    'axes.labelsize': 'small',
    'legend.fontsize': 'small',
    'font.size': 8
}
plt.rcParams.update(fig_settings)
plt.figure(1, figsize=(6, 8))

def plot_spiketrains(segment):
    for spiketrain in segment.spiketrains:
        y = np.ones_like(spiketrain) * spiketrain.annotations['source_id']
        plt.plot(spiketrain, y, '.')
        plt.ylabel(segment.name)
        plt.setp(plt.gca().get_xticklabels(), visible=False)

def plot_signal(signal, index, colour='b'):
    label = "Neuron %d" % signal.annotations['source_ids'][index]
    plt.plot(signal.times, signal[:, index], colour, label=label)
    plt.ylabel("%s (%s)" % (signal.name, signal.units._dimensionality.string))
    plt.setp(plt.gca().get_xticklabels(), visible=False)
    plt.legend()

n_panels = sum(a.shape[1] for a in data_out.segments[0].analogsignals) + 2
plt.subplot(n_panels, 1, 1)
plot_spiketrains(spikes_in.segments[0])
plt.subplot(n_panels, 1, 2)
plot_spiketrains(data_out.segments[0])
panel = 3
for array in data_out.segments[0].analogsignals:
    for i in range(array.shape[1]):
        plt.subplot(n_panels, 1, panel)
        plot_signal(array, i, colour='bg'[panel % 2])
        panel += 1
plt.xlabel("time (%s)" % array.times.units._dimensionality.string)
plt.setp(plt.gca().get_xticklabels(), visible=True)

plt.show()


The adoption of Neo as an output representation also makes it easier to handle data when running multiple simulations with the same network, calling reset() between each run. In previous versions of PyNN it was necessary to retrieve the data before every reset(), and take care of storing the resulting data. Now, each run just creates a new Neo Segment, and PyNN takes care of storing the data until it is needed. This is illustrated in the example below.

import pyNN.neuron as sim  # can of course replace `neuron` with `nest`, `brian`, etc.
import matplotlib.pyplot as plt
from quantities import nA

sim.setup()

cell = sim.Population(1, sim.HH_cond_exp())
step_current = sim.DCSource(start=20.0, stop=80.0)
step_current.inject_into(cell)

cell.record('v')

for amp in (-0.2, -0.1, 0.0, 0.1, 0.2):
    step_current.amplitude = amp
    sim.run(100.0)
    sim.reset(annotations={"amplitude": amp * nA})

data = cell.get_data()

sim.end()

for segment in data.segments:
    vm = segment.analogsignals[0]
    plt.plot(vm.times, vm,
             label=str(segment.annotations["amplitude"]))
plt.legend(loc="upper left")
plt.xlabel("Time (%s)" % vm.times.units._dimensionality)
plt.ylabel("Membrane potential (%s)" % vm.units._dimensionality)

plt.show()


Note: if you still want to retrieve the data after every run, you can do so: just call get_data(clear=True).

7.2 Writing data to file

Neo provides support for writing to a variety of different file formats, notably an assortment of text-based formats, NumPy binary format, Matlab .mat files, and HDF5. To write to a given format, we create a Neo IO object and pass it to the write_data() method:

>>> from neo.io import NixIO
>>> io = NixIO(filename="my_data.h5")
>>> population.write_data(io)

As a shortcut, for file formats with a well-defined file extension, it is possible to pass just the filename, and PyNN will create the appropriate IO object for you:

>>> population.write_data("my_data.mat") # writes to a Matlab file

By default, all the variables that were specified in the record() call will be saved to file, but it is also possible to save only a subset of the recorded data:

>>> population.write_data(io, variables=('v', 'gsyn_exc'))

When running distributed simulations using MPI (see Running parallel simulations), by default the data is gathered from all MPI nodes to the master node, and only saved to file on the master node. If you would prefer that each node saves its own local subset of the data to disk separately, use gather=False:

>>> population.write_data(io, gather=False)

Saving data to a file does not delete the data from the Population object. If you wish to do so (for example to release memory), use clear=True:

>>> population.write_data(io, clear=True)

7.3 Simple plotting

Plotting Neo data with Matplotlib, as shown above, can be rather verbose, with a lot of repetitive boilerplate code. PyNN therefore provides a couple of classes, Figure and Panel, to make quick-and-easy plots of recorded data. It is possible to customize the plots to some extent, but for publication-quality or highly-customized plots you should probably use Matplotlib or some other plotting package directly.

A simple example:

from pyNN.utility.plotting import Figure, Panel

...

population.record('spikes')
population[0:2].record(('v', 'gsyn_exc'))

...

data = population.get_data().segments[0]

vm = data.filter(name="v")[0]
gsyn = data.filter(name="gsyn_exc")[0]

Figure(
    Panel(vm, ylabel="Membrane potential (mV)"),
    Panel(gsyn, ylabel="Synaptic conductance (uS)"),
    Panel(data.spiketrains, xlabel="Time (ms)", xticks=True)
).save("simulation_results.png")


7.4 Other packages for working with Neo data

A variety of software tools are available for working with Neo-format data, for example SpykeViewer and OpenElectrophy.


CHAPTER 8

Simulation control

8.1 Initialising the simulator

Before using any other functions or classes from PyNN, the user must call the setup() function:

>>> setup()

setup() takes various optional arguments: setting the simulation timestep (there is currently no support in the API for variable-timestep methods, although native simulator code can be used to select this option where the simulator supports it) and setting the minimum and maximum synaptic delays, e.g.:

>>> setup(timestep=0.1, min_delay=0.1, max_delay=10.0)

Calling setup() a second time resets the simulator entirely, destroying any network that may have been created in the meantime.

Todo: add links to documentation on simulator-specific options to setup()

8.2 Getting information about the simulation state

Several functions are available for obtaining information about the simulation state:

• get_current_time() - the time within the simulation

• get_time_step() - the integration time step

• get_min_delay() - the minimum allowed synaptic delay

• get_max_delay() - the maximum allowed synaptic delay

• num_processes() - the number of MPI processes

• rank() - the MPI rank of the current node

8.3 Running a simulation

The run() function advances the simulation for a given number of milliseconds, e.g.:

>>> run(1000.0)

You can also use run_for(), which is an alias for run(). The run_until() function advances the simulation until a given future time point, e.g.:

>>> run_until(1001.0)
>>> get_current_time()
1001.0

8.3.1 Performing operations during a run

You may wish to perform some calculation, or show some information, during a run. One way to do this is to break the simulation into steps, and perform the operation at the end of each step, e.g.:

>>> for i in range(4):
...     run_until(100.0*i)
...     print("The time is %g" % (100*i,))
The time is 0
The time is 100
The time is 200
The time is 300

Alternatively, PyNN can take care of breaking the simulation into steps for you. run() and run_until() each accept an optional list of callback functions. Each callback should accept the current time as an argument, and return the next time it wishes to be called.

>>> def report_time(t):
...     print("The time is %g" % t)
...     return t + 100.0
>>> run_until(300.0, callbacks=[report_time])
The time is 0
The time is 100
The time is 200
The time is 300
300.0

For simple cases, this requires a bit more code, but it is potentially much more powerful, especially if you have complex or multiple callbacks.
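The callback contract is easy to emulate outside of any simulator. The following sketch, in plain Python with no PyNN, illustrates it: each callback receives the current time and returns the next time at which it wants to be invoked. The name run_with_callbacks is invented for illustration; it is not part of the PyNN API.

```python
# Minimal sketch of the callback scheme: each callback takes the current
# time and returns the next time it wishes to be called.

def run_with_callbacks(t_stop, callbacks):
    """Advance a dummy 'simulation' to t_stop, firing callbacks on schedule."""
    t = 0.0
    next_calls = [cb(t) for cb in callbacks]   # every callback fires at t = 0
    while t < t_stop:
        t = min(next_calls + [t_stop])         # jump to the next pending callback
        for i, cb in enumerate(callbacks):
            if next_calls[i] <= t:
                next_calls[i] = cb(t)
    return t

times = []

def report_time(t):
    times.append(t)
    return t + 100.0

run_with_callbacks(300.0, [report_time])
print(times)   # [0.0, 100.0, 200.0, 300.0]
```

This reproduces the behaviour of the doctest above: the callback fires at t = 0 and then every 100 ms until the stop time is reached.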

8.4 Repeating a simulation

If you wish to reset network time to zero to run a new simulation with the same network (with different parameter values, perhaps), use the reset() function. Note that this does not change the network structure, nor the choice of which neurons to record (from previous record() calls).

8.5 Finishing up

Just as a simulation must be begun with a call to setup(), it should be ended with a call to end(). This is not always necessary, but it is safest to always use it.

CHAPTER 9

Model parameters and initial values

As was discussed in Building networks, PyNN deals with neurons, and with the synaptic connections between them, principally at the level of groups: with Population and Assembly for neurons and Projection for connections.

Setting the parameters of neurons and connections is also done principally at the group level, either when creating the group, or after creation using the set() method. Sometimes, all the neurons in a Population or all the connections in a Projection should have the same value. Other times, different individual cells or connections should have different parameter values. To handle both of these situations, parameter values may be of four different types:

• a single number - sets the same value for all cells in the Population or connections in the Projection

• a RandomDistribution object (see Random numbers) - each item in the group will have the parameter set to a value drawn from the distribution

• a list or 1D NumPy array - of the same size as the Population or the number of connections in the Projection

• a function - for a Population or Assembly the function should take a single integer argument, and will be called with the index of every neuron in the Population to return the parameter value for that neuron. For a Projection, the function should take two integer arguments, and for every connection will be called with the indices of the pre- and post-synaptic neurons.

9.1 Examples

9.1.1 Setting the same value for all neurons in a population

>>> p = Population(5, IF_cond_exp(tau_m=15.0))

or, equivalently:

>>> p = Population(5, IF_cond_exp())
>>> p.set(tau_m=15.0)

To set values for a subset of the population, use a view:

>>> p[0,2,4].set(tau_m=10.0)
>>> p.get('tau_m')
array([ 10.,  15.,  10.,  15.,  10.])

9.1.2 Setting parameters to random values

>>> from pyNN.random import RandomDistribution, NumpyRNG
>>> gbar_na_distr = RandomDistribution('normal', (20.0, 2.0), rng=NumpyRNG(seed=85524))
>>> p = Population(7, HH_cond_exp(gbar_Na=gbar_na_distr))
>>> p.get('gbar_Na')
array([ 20.03132455,  20.09777627,  16.97079318,  17.44786923,
        19.4928947 ,  20.80321881,  19.97246906])
>>> p[0].gbar_Na
20.031324546935146

9.1.3 Setting parameters from an array

>>> import numpy as np
>>> p = Population(6, SpikeSourcePoisson(rate=np.linspace(10.0, 20.0, num=6)))
>>> p.get('rate')
array([ 10.,  12.,  14.,  16.,  18.,  20.])

The array of course has to have the same size as the population:

>>> p = Population(6, SpikeSourcePoisson(rate=np.linspace(10.0, 20.0, num=7)))
ValueError

9.1.4 Using a function to calculate parameter values

>>> from numpy import sin, pi
>>> p = Population(8, IF_cond_exp(i_offset=lambda i: sin(i*pi/8)))
>>> p.get('i_offset')
array([ 0.        ,  0.38268343,  0.70710678,  0.92387953,  1.        ,
        0.92387953,  0.70710678,  0.38268343])

9.1.5 Setting parameters as a function of spatial position

>>> from pyNN.space import Grid2D
>>> grid = Grid2D(dx=10.0, dy=10.0)
>>> p = Population(16, IF_cond_alpha(), structure=grid)
>>> def f_v_thresh(pos):
...     x, y, z = pos.T
...     return -50 + 0.5*x - 0.2*y
>>> p.set(v_thresh=lambda i: f_v_thresh(p.position_generator(i)))
>>> p.get('v_thresh').reshape((4,4))
array([[-50., -52., -54., -56.],
       [-45., -47., -49., -51.],
       [-40., -42., -44., -46.],
       [-35., -37., -39., -41.]])

For more on spatial structure, see Representing spatial structure and calculating distances.

9.1.6 Using multiple parameter types

It is perfectly possible to use multiple different types of parameter value at the same time:

>>> n = 1000
>>> parameters = {
...     'tau_m': RandomDistribution('uniform', (10.0, 15.0)),
...     'cm': 0.85,
...     'v_rest': lambda i: np.cos(i*pi*10/n),
...     'v_reset': np.linspace(-75.0, -65.0, num=n)}
>>> p = Population(n, IF_cond_alpha(**parameters))
>>> p.set(v_thresh=lambda i: -65 + i/n, tau_refrac=5.0)

Todo: in the above, give current source examples, and Projection examples

9.2 Time series and array-valued parameters

For certain neuron models (SpikeSourceArray, GIF_cond_exp) and current sources, the individual parameter values are not single numbers (with physical units), but arrays, e.g.:

celltype = SpikeSourceArray(np.array([5.0, 15.0, 45.0, 99.0]))

to set the same spike times for the entire population. To set different spike times for each cell in the population requires an array of arrays. To avoid ambiguities in this situation, the inner arrays should be wrapped by the Sequence class, e.g.:

celltype = SpikeSourceArray([Sequence([5.0, 15.0, 45.0, 99.0]),
                             Sequence([2.0, 5.3, 18.9]),
                             Sequence([17.8, 88.2, 100.1])])

Such an array-of-Sequences can also be provided by a generator function, e.g.:

number = int(2 * simtime * input_rate / 1000.0)

def generate_spike_times(i):
    gen = lambda: Sequence(numpy.add.accumulate(
        numpy.random.exponential(1000.0 / input_rate, size=number)))
    if hasattr(i, "__len__"):
        return [gen() for j in i]
    else:
        return gen()

celltype = SpikeSourceArray(spike_times=generate_spike_times)
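The generator above relies on a standard construction: cumulatively summing exponentially distributed inter-spike intervals yields the spike times of a Poisson process. The same idea can be sketched with the standard library alone; poisson_spike_times is a hypothetical helper written for illustration, not part of PyNN.

```python
import itertools
import random

def poisson_spike_times(rate_hz, n, rng):
    """Return n spike times (in ms): the running sum of inter-spike
    intervals drawn from an exponential with mean 1000/rate_hz ms."""
    isis = (rng.expovariate(rate_hz / 1000.0) for _ in range(n))
    return list(itertools.accumulate(isis))

rng = random.Random(42)
train = poisson_spike_times(100.0, 20, rng)   # ~100 spikes/s, 20 spikes
assert train == sorted(train)                 # times increase monotonically
assert len(train) == 20
```

Because the intervals are all positive, the accumulated times are automatically sorted, which is what SpikeSourceArray expects.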

As a generalization of Sequence, some models require array-valued parameters, expressed as tuples or ArrayParameter instances, e.g.:

cell_type = GIF_cond_exp(
    ...
    # this parameter has the same value in all neurons in the population
    tau_gamma=(1.0, 10.0, 100.0),   # time constants for spike-frequency adaptation in ms
    # the following parameter has different values for each neuron
    a_eta=[(0.1, 0.1, 0.1),         # post-spike increments for spike-triggered current in nA
           (0.0, 0.0, 0.0),
           (0.0, 0.0, 0.0),
           (0.0, 0.0, 0.0)],
    ...
)

Note: The reason for defining Sequence and ArrayParameter rather than just using a plain NumPy array is to avoid the ambiguity of “is a given array a single parameter value (e.g. a spike train for one cell) or an array of parameter values (e.g. one number per cell)?”.

9.3 Setting initial values

Todo: complete

Note: For most neuron types, the default initial value for the membrane potential is the same as the default value for the resting membrane potential parameter. However, be aware that changing the value of the resting membrane potential will not automatically change the initial value.

CHAPTER 10

Random numbers

There are four considerations for random number generation and consumption in PyNN:

Reproducibility: When comparing simulations with different backends, we may wish to ensure that all backends use the same sequence of random numbers, so that the only differences between simulations arise from the numerics of the simulators.

Performance: All simulators have their own built-in facilities for random number generation, and it may be faster to use these than to use random numbers generated by PyNN.

Distributed simulations: When distributing simulations across multiple processors using MPI, we may wish to ensure that the sequence of random numbers is independent of the number of computation nodes.

Quality: Different models have different requirements for the quality of the (pseudo-)random number generator used. For models that are not strongly dependent on this, we may wish to use a generator that is faster but of lower quality. For models that are highly sensitive, a slower but higher-quality generator may be desired.

Because of these considerations, PyNN aims to provide a great deal of flexibility in specifying random number generation for those who need it, while hiding the details entirely for those who do not.

10.1 RNG classes

All functions and methods in the PyNN API that can make use of random numbers have an optional rng argument, which should be an instance of a subclass of pyNN.random.AbstractRNG. PyNN provides three such subclasses:

NumpyRNG: Uses the numpy.random.RandomState class (Mersenne Twister).

GSLRNG: Uses the GNU Scientific Library random number generators.

NativeRNG: Signals that the simulator’s own built-in RNG should be used.

If you wish to use your own random number generator, it is reasonably straightforward to do so: see Random numbers in the API reference.

Note: If the rng argument is not supplied (or is None), then the method or function creates a new NumpyRNG for its own use.

All RNG classes accept a seed argument and a parallel_safe argument. The latter is True by default, and ensures that the simulation results will not depend on the number of MPI nodes in a distributed simulation. This independence can be computationally costly, however, so it is possible to set parallel_safe=False, accepting that the results will be dependent on the number of nodes, in order to get better performance.

Note: parallel_safe may or may not have any effect when using a NativeRNG, depending on the simulator.

10.1.1 The next() method

Apart from the constructor, RNG classes have only one important method: next(), which returns a NumPy array containing random numbers from the requested distribution:

>>> rng = NumpyRNG(seed=824756)
>>> rng.next(5, 'normal', {'mu': 1.0, 'sigma': 0.2})
array([ 0.65866423,  0.87500017,  0.90755753,  0.93793779,  0.94839735])
>>> rng = GSLRNG(seed=824756, type='ranlxd2')  # RANLUX algorithm of Luescher
>>> rng.next(5, 'normal', {'mu': 1.0, 'sigma': 0.2})
array([ 0.61104097,  0.83086026,  0.87072741,  0.7513628 ,  1.12875371])

In versions of PyNN prior to 0.8, distribution names and parameterisations were not standardized: e.g. GSLRNG needed ‘gaussian’ rather than ‘normal’. As of PyNN 0.8, the following standardized names are used:

Name                         Parameters            Comments
binomial                     n, p
gamma                        k, theta
exponential                  beta
lognormal                    mu, sigma
normal                       mu, sigma
normal_clipped               mu, sigma, low, high  Values outside (low, high) are redrawn
normal_clipped_to_boundary   mu, sigma, low, high  Values below/above low/high are set to low/high
poisson                      lambda
uniform                      low, high
uniform_int                  low, high
vonmises                     mu, kappa
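To make the table concrete, here is how the redraw rule of normal_clipped might be realised in plain Python. This is a sketch of the semantics only, written with the standard library; PyNN's actual implementation operates on NumPy arrays.

```python
import random

def normal_clipped(mu, sigma, low, high, n, rng):
    """Draw n values from N(mu, sigma**2), redrawing any value that
    falls outside the open interval (low, high)."""
    out = []
    while len(out) < n:
        x = rng.gauss(mu, sigma)
        if low < x < high:      # values outside (low, high) are redrawn
            out.append(x)
    return out

rng = random.Random(1)
samples = normal_clipped(mu=0.0, sigma=1.0, low=-0.5, high=0.5, n=100, rng=rng)
assert all(-0.5 < x < 0.5 for x in samples)
assert len(samples) == 100
```

The contrasting normal_clipped_to_boundary variant would instead replace an out-of-range value with the nearer of low or high, rather than redrawing it.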

10.2 The RandomDistribution class

The RandomDistribution class encapsulates a choice of random number generator and a choice of distribution, so that its next() method requires only the number of values required as argument:

>>> gamma = RandomDistribution('gamma', (2.0, 0.3), rng=NumpyRNG(seed=72386))
>>> gamma.next(5)
array([ 0.4325809 ,  0.12952503,  1.58510406,  0.81182457,  0.07577787])

You can alternatively provide parameter names as keyword arguments, e.g.:

>>> gamma = RandomDistribution('gamma', k=2.0, theta=0.3, rng=NumpyRNG(seed=72386))

Note that next() called without any arguments returns a single number, not an array:

>>> gamma.next()
0.52020946027308368
>>> gamma.next(1)
array([ 0.4863944])

Note: the apparent difference in precision between the single number and the array is not real: NumPy only displays a limited number of digits, but the numbers in the array have full precision.

CHAPTER 11

Backends

The PyNN API provides a uniform interface to different simulators, but nevertheless each simulator has features that are not available in other simulators, and we aim to make these features accessible, as much as possible, from PyNN.

For each simulator backend, this section presents the configuration options specific to that backend and explains how to use “native” neuron and synapse models within the PyNN framework.

11.1 NEURON

11.1.1 Configuration options

Adaptive time step integration

The default integration method used by the pyNN.neuron backend uses a fixed time step, specified by the timestep argument to the setup() function.

NEURON also supports use of variable time step methods, which can improve simulation speed:

setup(use_cvode=True)

If using cvode, there are two more optional parameters:

setup(use_cvode=True,
      rtol=0.001,  # specify relative error tolerance
      atol=1e-4)   # specify absolute error tolerance

If not specified, the default values are rtol = 0 and atol = 0.001. For full details, see the CVode documentation.

Todo: native_rng_baseseed is added to MPI.rank to form the seed for SpikeSourcePoisson, etc., but I think it would be better to add a seed parameter to SpikeSourcePoisson

Todo: Population.get_data() does not yet handle cvode properly.

11.1.2 Using native cell models

A native NEURON cell model is described using a Python class (which may wrap a Hoc template). For this class to work with PyNN, there are a small number of requirements:

• the __init__() method should take just **parameters as its argument.

• instances should have attributes:

– source: a reference to the membrane potential which will be monitored for spike emission, e.g. self.soma(0.5)._ref_v

– source_section: the Hoc Section in which source is located.

– parameter_names: a tuple of the names of attributes/properties of the class that correspond to parameters of the model.

– traces: an empty dict, used for recording.

– recording_time: should be False initially.

• there must be a memb_init() method, taking no arguments.

Here is an example, which uses the nrnutils package for conciseness:

from nrnutils import Mechanism, Section

class SimpleNeuron(object):

    def __init__(self, **parameters):
        hh = Mechanism('hh', gl=parameters['g_leak'], el=-65,
                       gnabar=parameters['gnabar'], gkbar=parameters['gkbar'])
        self.soma = Section(L=30, diam=30, mechanisms=[hh])
        self.soma.add_synapse('ampa', 'Exp2Syn', e=0.0, tau1=0.1, tau2=5.0)
        # needed for PyNN
        self.source_section = self.soma
        self.source = self.soma(0.5)._ref_v
        self.parameter_names = ('g_leak', 'gnabar', 'gkbar')
        self.traces = {}
        self.recording_time = False

    def _set_gnabar(self, value):
        for seg in self.soma:
            seg.hh.gnabar = value

    def _get_gnabar(self):
        return self.soma(0.5).hh.gnabar

    gnabar = property(fget=_get_gnabar, fset=_set_gnabar)

    # ... gkbar and g_leak properties defined similarly

    def memb_init(self):
        for seg in self.soma:
            seg.v = self.v_init

For each cell model, you must also define a cell type:

from pyNN.neuron import NativeCellType

class SimpleNeuronType(NativeCellType):
    default_parameters = {'g_leak': 0.0002, 'gkbar': 0.036, 'gnabar': 0.12}
    default_initial_values = {'v': -65.0}
    recordable = ['soma(0.5).v', 'soma(0.5).ina']
    units = {'soma(0.5).v': 'mV', 'soma(0.5).ina': 'nA'}
    receptor_types = ['soma.ampa']
    model = SimpleNeuron

The requirement to explicitly list all variables you might wish to record in the recordable attribute is a temporary inconvenience, which will be removed in a future version.

It is now straightforward to use this cell type in PyNN:

from pyNN.neuron import setup, run, Population, Projection, AllToAllConnector, StaticSynapse

setup()
p1 = Population(10, SimpleNeuronType(g_leak=0.0003))
p1.record('soma(0.5).ina')
syn = StaticSynapse(weight=0.01, delay=0.5)
prj = Projection(p1, p1, AllToAllConnector(), syn, receptor_type='soma.ampa')
run(100.0)
output = p1.get_data()

If your model relies on other NMODL mechanisms, call the load_mechanisms() function with the path to the directory containing the .mod files.

It is also possible to use NEURON “ARTIFICIAL_CELL” models, such as IntFire1, IntFire2 and IntFire4:

from pyNN.neuron import setup, Population, IntFire1

setup()
p1 = Population(10, IntFire1(tau=10.0, refrac=2.5))
p1.record('m')

11.2 NEST

11.2.1 Configuration options

Continuous time spiking

In traditional simulation schemes spikes are constrained to an equidistant time grid. However, for some neuron models, NEST has the capability to represent spikes in continuous time.

At setup the user can choose the continuous time scheme

setup(spike_precision='off_grid')

or the conventional grid-constrained scheme

setup(spike_precision='on_grid')

where ‘off_grid’ is the default.

The following PyNN standard models have an off-grid implementation: IF_curr_exp, SpikeSourcePoisson and EIF_cond_alpha_isfa_ista.

Todo: add a list of native NEST models with off-grid capability

Here is an example showing how to specify the option in a PyNN script and an illustration of the different outcomes:

import numpy
from pyNN.nest import *
import matplotlib.pyplot as plt

def test_sim(on_or_off_grid, sim_time):
    setup(timestep=1.0, min_delay=1.0, max_delay=1.0, spike_precision=on_or_off_grid)
    src = Population(1, SpikeSourceArray(spike_times=[0.5]))
    cm = 250.0
    tau_m = 10.0
    tau_syn_E = 1.0
    weight = cm / tau_m * numpy.power(tau_syn_E / tau_m, -tau_m / (tau_m - tau_syn_E)) * 20.5
    nrn = Population(1, IF_curr_exp(cm=cm, tau_m=tau_m, tau_syn_E=tau_syn_E,
                                    tau_refrac=2.0, v_thresh=20.0, v_rest=0.0,
                                    v_reset=0.0, i_offset=0.0))
    nrn.initialize(v=0.0)
    prj = Projection(src, nrn, OneToOneConnector(), StaticSynapse(weight=weight))
    nrn.record('v')
    run(sim_time)
    return nrn.get_data().segments[0].analogsignals[0]

sim_time = 10.0
off = test_sim('off_grid', sim_time)
on = test_sim('on_grid', sim_time)

def plot_data(pos, on, off, ylim, with_legend=False):
    ax = plt.subplot(1, 2, pos)
    ax.plot(off.times, off, color='0.7', linewidth=7, label='off-grid')
    ax.plot(on.times, on, 'k', label='on-grid')
    ax.set_ylim(*ylim)
    ax.set_xlim(0, 9)
    ax.set_xlabel('time [ms]')
    ax.set_ylabel('V [mV]')
    if with_legend:
        plt.legend()

plot_data(1, on, off, (-0.5, 21), with_legend=True)
plot_data(2, on, off, (-0.05, 2.1))
plt.show()

The gray curve shows the membrane potential excursion in response to an input spike arriving at the neuron at t = 1.5 ms (left panel; the right panel shows an enlargement at low voltages). The amplitude of the post-synaptic current has an unrealistically high value, such that the threshold voltage for spike generation is crossed. The membrane potential is recorded in intervals of 1 ms. Therefore the first non-zero value is measured at t = 2 ms. The threshold is crossed somewhere in the interval (3 ms, 4 ms], resulting in a voltage of 0 at t = 4 ms. The membrane potential is clamped to 0 for 2 ms, the refractory period. Therefore, the neuron recovers from refractoriness somewhere in the interval (5 ms, 6 ms] and the next non-zero voltage is observed at t = 6 ms. The black curve shows the results of the same model now integrated with a grid-constrained simulation scheme with a computation step size of 1 ms. The input spike is mapped to the next grid position and therefore arrives at t = 2 ms. The first non-zero voltage is observed at t = 3 ms. The output spike is emitted at t = 4 ms and this is the time at which the membrane potential is reset. Consequently, the model neuron returns from refractoriness at exactly t = 6 ms. The next non-zero membrane potential value is observed at t = 7 ms.

The following publication describes how the continuous time mode is implemented in NEST and compares the performance of different approaches:

Hanuschkin A, Kunkel S, Helias M, Morrison A and Diesmann M (2010) A general and efficient method for incorporating precise spike times in globally time-driven simulations. Front. Neuroinform. 4:113. doi:10.3389/fninf.2010.00113

11.2.2 Using native cell models

To use a NEST neuron model with PyNN, we wrap the NEST model with a PyNN NativeCellType class, e.g.:

>>> from pyNN.nest import native_cell_type, Population, run, setup
>>> setup()
0
>>> ht_neuron = native_cell_type('ht_neuron')
>>> poisson = native_cell_type('poisson_generator')
>>> p1 = Population(10, ht_neuron(Tau_m=20.0))
>>> p2 = Population(1, poisson(rate=200.0))

We can now initialize state variables, set/get parameter values, and record from these neurons as from standard cells:

>>> p1.get('Tau_m')
20.0
>>> p1.get('Tau_theta')
2.0
>>> p1.get('C_m')
Traceback (most recent call last):
...
NonExistentParameterError: C_m (valid parameters for ht_neuron are:
    AMPA_E_rev, AMPA_Tau_1, AMPA_Tau_2, AMPA_g_peak, E_K, E_Na, GABA_A_E_rev,
    GABA_A_Tau_1, GABA_A_Tau_2, GABA_A_g_peak, GABA_B_E_rev, GABA_B_Tau_1,
    GABA_B_Tau_2, GABA_B_g_peak, KNa_E_rev, KNa_g_peak, NMDA_E_rev, NMDA_Sact,
    NMDA_Tau_1, NMDA_Tau_2, NMDA_Vact, NMDA_g_peak, NaP_E_rev, NaP_g_peak,
    T_E_rev, T_g_peak, Tau_m, Tau_spike, Tau_theta, Theta_eq, g_KL, g_NaL,
    h_E_rev, h_g_peak, spike_duration)

>>> p1.initialize(V_m=-70.0, Theta=-50.0)
>>> p1.record('V_m')
>>> run(250.0)
250.0
>>> output = p1.get_data()

To connect populations of native cells, you need to know the available synaptic receptor types:

>>> ht_neuron.receptor_types
['NMDA', 'AMPA', 'GABA_A', 'GABA_B']
>>> from pyNN.nest import Projection, AllToAllConnector
>>> connector = AllToAllConnector()
>>> prj_ampa = Projection(p2, p1, connector, receptor_type='AMPA')
>>> prj_nmda = Projection(p2, p1, connector, receptor_type='NMDA')

11.2.3 Using native synaptic plasticity models

To use a NEST STDP model with PyNN, we use the native_synapse_type() function:

>>> from pyNN.nest import native_synapse_type
>>> stdp = native_synapse_type("stdp_synapse")(**{"Wmax": 50.0, "lambda": 0.015})
>>> prj_plastic = Projection(p1, p1, connector, receptor_type='AMPA', synapse_type=stdp)

Common synapse properties

Some NEST synapse models (e.g. stdp_facetshw_synapse_hom) make use of common synapse properties to conserve memory. This has the following implications for their usage in PyNN:

• Common properties can only have one homogeneous value per projection. Trying to assign heterogeneous values will result in a ValueError.

• Common properties can currently not be retrieved using Projection.get. However, they will only deviate from the default when changed manually.

11.3 Brian

11.4 NeMo

11.5 MOOSE

11.6 NeuroML

A subset of models specified in PyNN can be exported into NeuroML 2 format.

See https://github.com/NeuroML/NeuroML2/issues/73 for latest status.

11.7 NineML

The NineML backend is described in ../nineml

11.8 Neuromorphic hardware

CHAPTER 12

Running parallel simulations

Where the underlying simulator supports distributed simulations, in which the computations are spread over multiple processors using MPI (this is the case for NEURON and NEST), PyNN also supports this. To run a distributed simulation on eight nodes, the command will be something like:

$ mpirun -np 8 -machinefile ~/mpi_hosts python myscript.py

Depending on the implementation of MPI you have, mpirun could be replaced by mpiexec or another command, and the options may also be somewhat different.

For NEURON only, you can also run distributed simulations using nrniv instead of the python executable:

$ mpirun -np 8 -machinefile ~/mpi_hosts nrniv -python -mpi myscript.py

12.1 Additional requirements

First, make sure you have compiled the simulators you wish to use with MPI enabled. There is usually a configure flag called something like “--with-mpi” to do this, but see the installation documentation for each simulator for details.

If you wish to use the default “gather” feature (see below), which automatically gathers output data from all the nodes to the master node (the one on which you launched the simulation), you will need to install the mpi4py module (see http://mpi4py.scipy.org/ for downloads and documentation). Installation is usually very straightforward, although, if you have more than one MPI implementation installed on your system (e.g. OpenMPI and MPICH2), you must be sure to build mpi4py with the same MPI implementation that you used to build the simulator.

12.2 Code modifications

In most cases, no modifications to your code should be necessary to run in parallel. PyNN, and the simulator, take care of distributing the computations between nodes. Furthermore, the default settings should give results that are independent of the number of processors used, even when using random numbers.

12.3 Gathering data to the master node

The various methods of the Population and Assembly classes that deal with accessing recorded data or writing it to disk, such as get_data(), write_data(), etc., have an optional argument gather, which is True by default.

If gather is True, then data generated on other nodes is sent to the master node. This means that, for example, write_data() will create only a single file, on the filesystem of the master node. If gather is False, each node will write a file on its local filesystem. This option is often desirable if you wish to do distributed post-processing of the data. (Don’t worry, by the way, if you are using a shared filesystem such as NFS. If gather is False then the MPI rank is appended to the filename, so there is no chance of conflict between the different nodes.)

12.4 Random number generators

In general, we expect that our results should not depend on the number of processors used to produce them. If our simulations use random numbers in setting up or running the network, this means that each object that uses random numbers should receive the same sequence independent of which node it is on or how many nodes there are. PyNN achieves this by ensuring the generator seed is the same on all nodes, and then generating as many random numbers as would be used in the single-processor case and throwing away those that are not needed.
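The strategy described above can be illustrated in plain Python. In this sketch each “rank” draws the complete sequence from the shared seed and keeps only the values for the cells it owns; the round-robin ownership scheme and the draws_for_rank helper are assumptions made purely for illustration, not a description of PyNN's internals.

```python
import random

def draws_for_rank(seed, n_cells, rank, num_processes):
    """Draw one value per cell from a shared seed, keeping only the
    values for cells owned by this rank (round-robin distribution)."""
    rng = random.Random(seed)
    full_sequence = [rng.random() for _ in range(n_cells)]  # identical on every rank
    return full_sequence[rank::num_processes]               # discard the rest

# However the cells are distributed, each cell sees the same random value:
serial = draws_for_rank(123, 8, 0, 1)
parallel = draws_for_rank(123, 8, 0, 2) + draws_for_rank(123, 8, 1, 2)
assert sorted(parallel) == sorted(serial)
```

The cost is visible in the sketch: every rank generates the full sequence, so the work of random number generation is not reduced by adding processes.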

This obviously has a potential impact on performance, and so it is possible to turn it off by passing parallel_safe=False as an argument when creating the random number generator, e.g.:

>>> from pyNN.random import NumpyRNG
>>> rng = NumpyRNG(seed=249856, parallel_safe=False)

Now, PyNN will ensure the seed is different on each node, and will generate only as many numbers as are actually needed on each node.

Note that the above applies only to the random number generators provided by the pyNN.random module, not to the native RNGs used internally by each simulator. This means that, for example, you should prefer SpikeSourceArray (for which you can generate Poisson spike times using a parallel-safe RNG) to SpikeSourcePoisson, which uses the simulator’s internal RNG, if you care about being independent of the number of processors.


CHAPTER 13

Units

PyNN does not at present support explicit specification of the units of physical quantities (parameters and initial values). Instead, the following convention is used:

Physical quantity    Unit
time                 ms
voltage              mV
current              nA
conductance          µS
capacitance          nF
firing rate          /s
phase/angle          deg
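Since units are implicit, values expressed in SI units must be scaled by hand before being passed to PyNN. A small helper (hypothetical, not part of PyNN) makes the convention explicit:

```python
# scale factors from SI base units to PyNN's implicit units (table above)
SI_TO_PYNN = {
    "time": 1e3,          # s  -> ms
    "voltage": 1e3,       # V  -> mV
    "current": 1e9,       # A  -> nA
    "conductance": 1e6,   # S  -> µS
    "capacitance": 1e9,   # F  -> nF
}

def to_pynn(quantity, si_value):
    """Convert an SI value to the unit PyNN expects for `quantity`."""
    return si_value * SI_TO_PYNN[quantity]

# a membrane time constant of 0.02 s, expressed in the ms PyNN expects
tau_m = to_pynn("time", 0.02)
```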


CHAPTER 14

Examples

14.1 A selection of Izhikevich neurons

"""A selection of Izhikevich neurons.

Run as:

$ python Izhikevich.py <simulator>

where <simulator> is 'neuron', 'nest', etc.

(continues on next page)

63

Page 68: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

(continued from previous page)

"""

from numpy import arangefrom pyNN.utility import get_simulator, init_logging, normalized_filename

# === Configure the simulator ================================================

sim, options = get_simulator(("--plot-figure", "Plot the simulation results to a file.→˓", {"action": "store_true"}),

("--debug", "Print debugging information"))

if options.debug:init_logging(None, debug=True)

sim.setup(timestep=0.01, min_delay=1.0)

# === Build and instrument the network =======================================

neurons = sim.Population(3, sim.Izhikevich(a=0.02, b=0.2, c=-65, d=6, i_offset=[0.014,→˓ 0.0, 0.0]))spike_source = sim.Population(1, sim.SpikeSourceArray(spike_times=arange(10.0, 51,→˓1)))

connection = sim.Projection(spike_source, neurons[1:2], sim.OneToOneConnector(),sim.StaticSynapse(weight=3.0, delay=1.0),receptor_type='excitatory'),

electrode = sim.DCSource(start=2.0, stop=92.0, amplitude=0.014)electrode.inject_into(neurons[2:3])

neurons.record(['v']) # , 'u'])neurons.initialize(v=-70.0, u=-14.0)

# === Run the simulation =====================================================

sim.run(100.0)

# === Save the results, optionally plot a figure =============================

filename = normalized_filename("Results", "Izhikevich", "pkl",options.simulator, sim.num_processes())

neurons.write_data(filename, annotations={'script_name': __file__})

if options.plot_figure:from pyNN.utility.plotting import Figure, Panelfigure_filename = filename.replace("pkl", "png")data = neurons.get_data().segments[0]v = data.filter(name="v")[0]#u = data.filter(name="u")[0]Figure(

Panel(v, ylabel="Membrane potential (mV)", xticks=True,xlabel="Time (ms)", yticks=True),

#Panel(u, ylabel="u variable (units?)"),(continues on next page)

64 Chapter 14. Examples

Page 69: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

(continued from previous page)

annotations="Simulated with %s" % options.simulator.upper()).save(figure_filename)print(figure_filename)

# === Clean up and quit ========================================================

sim.end()

14.2 Injecting time-varying current into a cell

"""Injecting time-varying current into a cell.

There are four "standard" current sources in PyNN:

- DCSource- ACSource- StepCurrentSource- NoisyCurrentSource

Any other current waveforms can be implemented using StepCurrentSource.

Usage: current_injection.py [-h] [--plot-figure] simulator

positional arguments:simulator neuron, nest, brian or another backend simulator

optional arguments:-h, --help show this help message and exit--plot-figure Plot the simulation results to a file

(continues on next page)

14.2. Injecting time-varying current into a cell 65

Page 70: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

(continued from previous page)

"""

from pyNN.utility import get_simulator, normalized_filename

# === Configure the simulator ================================================

sim, options = get_simulator(("--plot-figure", "Plot the simulation results to a file→˓",

{"action": "store_true"}))sim.setup()

# === Create four cells and inject current into each one =====================

cells = sim.Population(4, sim.IF_curr_exp(v_thresh=-55.0, tau_refrac=5.0, tau_m=10.0))

current_sources = [sim.DCSource(amplitude=0.5, start=50.0, stop=400.0),sim.StepCurrentSource(times=[50.0, 210.0, 250.0, 410.0],

amplitudes=[0.4, 0.6, -0.2, 0.2]),sim.ACSource(start=50.0, stop=450.0, amplitude=0.2,

offset=0.1, frequency=10.0, phase=180.0),sim.NoisyCurrentSource(mean=0.5, stdev=0.2, start=50.0,

stop=450.0, dt=1.0)]

for cell, current_source in zip(cells, current_sources):cell.inject(current_source)

filename = normalized_filename("Results", "current_injection", "pkl", options.→˓simulator)sim.record('v', cells, filename, annotations={'script_name': __file__})

# === Run the simulation =====================================================

sim.run(500.0)

# === Save the results, optionally plot a figure =============================

vm = cells.get_data().segments[0].filter(name="v")[0]sim.end()

if options.plot_figure:from pyNN.utility.plotting import Figure, Panelfrom quantities import mVfigure_filename = filename.replace("pkl", "png")Figure(

Panel(vm, y_offset=-10 * mV, xticks=True, yticks=True,xlabel="Time (ms)", ylabel="Membrane potential (mV)",ylim=(-96, -59)),

title="Current injection example",annotations="Simulated with %s" % options.simulator.upper()

).save(figure_filename)
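The example's docstring notes that any other current waveform can be implemented with StepCurrentSource. One way to do that (an illustrative helper, not part of PyNN) is to sample an arbitrary waveform into the times/amplitudes arrays that StepCurrentSource expects:

```python
import numpy as np

def sample_waveform(func, t_start, t_stop, dt):
    """Sample an arbitrary waveform func(t) into the (times, amplitudes)
    pair expected by StepCurrentSource (times in ms, amplitudes in nA)."""
    times = np.arange(t_start, t_stop, dt)
    amplitudes = np.array([func(t) for t in times])
    return times, amplitudes

# e.g. a linear ramp from 0 to ~0.5 nA between 50 and 150 ms
times, amps = sample_waveform(lambda t: 0.005 * (t - 50.0), 50.0, 150.0, 1.0)
```

The resulting arrays could then be passed as sim.StepCurrentSource(times=times, amplitudes=amps).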


14.3 A demonstration of the responses of different standard neuron models to current injection


"""A demonstration of the responses of different standard neuron models to current→˓injection.

Usage: python cell_type_demonstration.py [-h] [--plot-figure] [--debug] simulator

positional arguments:simulator neuron, nest, brian or another backend simulator

optional arguments:-h, --help show this help message and exit--plot-figure Plot the simulation results to a file.--debug Print debugging information

"""

from pyNN.utility import get_simulator, init_logging, normalized_filename

# === Configure the simulator ================================================

sim, options = get_simulator(("--plot-figure", "Plot the simulation results to a file.→˓", {"action": "store_true"}),

("--debug", "Print debugging information"))

if options.debug:init_logging(None, debug=True)

sim.setup(timestep=0.01, min_delay=1.0)

# === Build and instrument the network =======================================

cuba_exp = sim.Population(1, sim.IF_curr_exp(i_offset=1.0), label="IF_curr_exp")hh = sim.Population(1, sim.HH_cond_exp(i_offset=0.2), label="HH_cond_exp")adexp = sim.Population(1, sim.EIF_cond_exp_isfa_ista(i_offset=1.0), label="EIF_cond_→˓exp_isfa_ista")adapt = sim.Population(1, sim.IF_cond_exp_gsfa_grr(i_offset=2.0), label="IF_cond_exp_→˓gsfa_grr")izh = sim.Population(1, sim.Izhikevich(i_offset=0.01), label="Izhikevich")

all_neurons = cuba_exp + hh + adexp + adapt + izh

all_neurons.record('v')adexp.record('w')izh.record('u')

# === Run the simulation =====================================================

sim.run(100.0)

# === Save the results, optionally plot a figure =============================

filename = normalized_filename("Results", "cell_type_demonstration", "pkl", options.→˓simulator)

(continues on next page)

14.3. A demonstration of the responses of different standard neuron models to current injection69

Page 74: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

(continued from previous page)

all_neurons.write_data(filename, annotations={'script_name': __file__})

if options.plot_figure:from pyNN.utility.plotting import Figure, Panelfigure_filename = filename.replace("pkl", "png")Figure(

Panel(cuba_exp.get_data().segments[0].filter(name='v')[0],ylabel="Membrane potential (mV)",data_labels=[cuba_exp.label], yticks=True, ylim=(-66, -48)),

Panel(hh.get_data().segments[0].filter(name='v')[0],ylabel="Membrane potential (mV)",data_labels=[hh.label], yticks=True, ylim=(-100, 60)),

Panel(adexp.get_data().segments[0].filter(name='v')[0],ylabel="Membrane potential (mV)",data_labels=[adexp.label], yticks=True, ylim=(-75, -40)),

Panel(adexp.get_data().segments[0].filter(name='w')[0],ylabel="w (nA)",data_labels=[adexp.label], yticks=True, ylim=(0, 0.4)),

Panel(adapt.get_data().segments[0].filter(name='v')[0],ylabel="Membrane potential (mV)",data_labels=[adapt.label], yticks=True, ylim=(-75, -45)),

Panel(izh.get_data().segments[0].filter(name='v')[0],ylabel="Membrane potential (mV)",data_labels=[izh.label], yticks=True, ylim=(-80, 40)),

Panel(izh.get_data().segments[0].filter(name='u')[0],xticks=True, xlabel="Time (ms)",ylabel="u (mV/ms)",data_labels=[izh.label], yticks=True, ylim=(-14, 0)),

title="Responses of standard neuron models to current injection",annotations="Simulated with %s" % options.simulator.upper()

).save(figure_filename)print(figure_filename)

# === Clean up and quit ========================================================

sim.end()


14.4 An example to illustrate random number handling in PyNN

"""An example to illustrate random number handling in PyNN

In particular, this shows the difference between "native" and Python random number→˓generators.If you run this script with two different simulators, e.g. NEST and NEURON,the weight matrix created with the Python RNG will be the same for both simulations,the weights created with the native RNG will be different in the two cases.

The potential advantage of using a native RNG is speed: for large networks, usingthe `NativeRNG` class can reduce network construction time, but at the expense ofcross-simulator repeatability.

Usage: random_numbers.py [-h] [--plot-figure] [--debug DEBUG] simulator

positional arguments:simulator neuron, nest, brian or another backend simulator

optional arguments:(continues on next page)

14.4. An example to illustrate random number handling in PyNN 71

Page 76: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

(continued from previous page)

-h, --help show this help message and exit--plot-figure plot the simulation results to a file--debug DEBUG print debugging information

"""

import numpyfrom pyNN.random import NumpyRNG, RandomDistributionfrom pyNN.utility import get_simulator

# === Configure the simulator ================================================

sim, options = get_simulator(("--plot-figure", "plot the simulation results to a file", {

→˓"action": "store_true"}),("--debug", "print debugging information"))

sim.setup()

# === Create random number generators ========================================

python_rng = NumpyRNG(seed=98497627)native_rng = sim.NativeRNG(seed=87354762)

# === Define the neuron model and initial conditions =========================

cell_type = sim.IF_cond_exp(tau_m=RandomDistribution('normal', (15.0, 2.0),→˓rng=python_rng)) # not possible with NEST to use NativeRNG herev_init = RandomDistribution('uniform',

(cell_type.default_parameters['v_rest'], cell_type.→˓default_parameters['v_thresh']),

rng=python_rng) # not possible with NEST to use→˓NativeRNG here

# === Create populations of neurons, and record from them ====================

p1 = sim.Population(10, sim.SpikeSourcePoisson(rate=100.0)) # in the current version,→˓ can't specify the RNG - it is always nativep2 = sim.Population(10, cell_type, initial_values={'v': v_init})

p1.record("spikes")p2.record("spikes")p2.sample(3, rng=python_rng).record("v") # can't use native RNG here

# === Create two sets of synaptic connections, one for each RNG ==============

connector_native = sim.FixedProbabilityConnector(p_connect=0.7, rng=native_rng)connector_python = sim.FixedProbabilityConnector(p_connect=0.7, rng=python_rng)

synapse_type_native = sim.StaticSynapse(weight=RandomDistribution('gamma', k=2.0,→˓theta=0.5, rng=native_rng),

delay=0.5)synapse_type_python = sim.StaticSynapse(weight=RandomDistribution('gamma', k=2.0,→˓theta=0.5, rng=python_rng),

delay=0.5)(continues on next page)

72 Chapter 14. Examples

Page 77: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

(continued from previous page)

projection_native = sim.Projection(p1, p2, connector_native, synapse_type_native)projection_python = sim.Projection(p1, p2, connector_python, synapse_type_python)

# === Print the synaptic weight matrices =====================================

weights_python = projection_python.get("weight", format="array")weights_native = projection_native.get("weight", format="array")print(weights_python)print(weights_native)

# === Run the simulation =====================================================

sim.run(100.0)

sim.end()

# === Optionally, plot the synaptic weight matrices ==========================

if options.plot_figure:from pyNN.utility import normalized_filenamefrom pyNN.utility.plotting import Figure, Panelfilename = normalized_filename("Results", "random_numbers", "png", options.

→˓simulator)# where there is no connection, the weight matrix contains NaN# for plotting purposes, we replace NaN with zero.weights_python[numpy.isnan(weights_python)] = 0weights_native[numpy.isnan(weights_native)] = 0Figure(

Panel(weights_python, cmap='gray_r', xlabel="Python RNG"),Panel(weights_native, cmap='gray_r', xlabel="Native RNG"),annotations="Simulated with %s" % options.simulator.upper()

).save(filename)print(filename)
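The repeatability point can be demonstrated without any simulator. In this pure-NumPy sketch (not PyNN code), a fixed seed makes the connection mask and gamma-distributed weights come out identical on every run, just as the Python RNG gives identical weights across simulators:

```python
import numpy as np

def build_weights(seed, n_pre, n_post, p_connect=0.7):
    """Sketch: seeded random connectivity plus gamma-distributed weights,
    with NaN where there is no connection (mirroring the array format
    returned by Projection.get(..., format="array"))."""
    rng = np.random.default_rng(seed)
    connected = rng.random((n_pre, n_post)) < p_connect
    weights = rng.gamma(shape=2.0, scale=0.5, size=(n_pre, n_post))
    return np.where(connected, weights, np.nan)

w1 = build_weights(98497627, 10, 10)
w2 = build_weights(98497627, 10, 10)  # same seed -> identical matrix
```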


14.5 Illustration of the different standard random distributions and different random number generators

"""Illustration of the different standard random distributions and different random→˓number generators

"""

import numpyimport matplotlib.pyplot as pltimport matplotlib.gridspec as gridspecimport scipy.statsimport pyNN.random as random

try:from neuron import h

except ImportError:have_nrn = False

else:have_nrn = Truefrom pyNN.neuron.random import NativeRNG

(continues on next page)

74 Chapter 14. Examples

Page 79: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

(continued from previous page)

n = 100000nbins = 100

rnglist = [random.NumpyRNG(seed=984527)]if random.have_gsl:

rnglist.append(random.GSLRNG(seed=668454))if have_nrn:

rnglist.append(NativeRNG(seed=321245))

cases = (("uniform", {"low": -65, "high": -55}, (-65, -55), scipy.stats.uniform(loc=-65,

→˓scale=10)),("gamma", {"k": 2.0, "theta": 0.5}, (0, 5), scipy.stats.gamma(2.0, loc=0.0,

→˓scale=0.5)),("normal", {"mu": -1.0, "sigma": 0.5}, (-3, 1), scipy.stats.norm(loc=-1, scale=0.

→˓5)),("exponential", {'beta': 10.0}, (0, 50), scipy.stats.expon(loc=0, scale=10)),("normal_clipped", {"mu": 0.5, "sigma": 0.5, "low": 0, "high": 10}, (-0.5, 3.0),

→˓None),)

fig = plt.figure(1)rows = len(cases)cols = len(rnglist)

settings = {'lines.linewidth': 0.5,'axes.linewidth': 0.5,'axes.labelsize': 'small','axes.titlesize': 'small','legend.fontsize': 'small','font.size': 8,'savefig.dpi': 150,

}plt.rcParams.update(settings)width, height = (2 * cols, 2 * rows)fig = plt.figure(1, figsize=(width, height))gs = gridspec.GridSpec(rows, cols)gs.update(hspace=0.4)

for i, case in enumerate(cases):distribution, parameters, xlim, rv = casebins = numpy.linspace(*xlim, num=nbins)for j, rng in enumerate(rnglist):

rd = random.RandomDistribution(distribution, rng=rng, **parameters)values = rd.next(n)assert values.size == nplt.subplot(gs[i, j])counts, bins, _ = plt.hist(values, bins, range=xlim)plt.title("%s.%s%s" % (rng, distribution, parameters.values()))if rv is not None:

pdf = rv.pdf(bins)scaled_pdf = n * pdf / pdf.sum()plt.plot(bins, scaled_pdf, 'r-')plt.ylim(0, 1.2 * scaled_pdf.max())

(continues on next page)

14.5. Illustration of the different standard random distributions and different random numbergenerators

75

Page 80: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

(continued from previous page)

plt.xlim(xlim)

plt.savefig("Results/random_distributions.png")


14.6 A very simple example of using STDP


# encoding: utf8
"""A very simple example of using STDP.

A single postsynaptic neuron fires at a constant rate. We connect several
presynaptic neurons to it, each of which fires spikes with a fixed time
lag or time advance with respect to the postsynaptic neuron.
The weights of these connections are very small, so they will not
significantly affect the firing times of the post-synaptic neuron.
We plot the amount of potentiation or depression of each synapse as a
function of the time difference.

Usage: python simple_STDP.py [-h] [--plot-figure] [--debug DEBUG] simulator

positional arguments:
  simulator      neuron, nest, brian or another backend simulator

optional arguments:
  -h, --help     show this help message and exit
  --plot-figure  Plot the simulation results to a file
  --fit-curve    Calculate the best-fit curve to the weight-delta_t measurements
  --debug DEBUG  Print debugging information

"""

from __future__ import division
from math import exp
import numpy
import neo
from quantities import ms
from pyNN.utility import get_simulator, init_logging, normalized_filename
from pyNN.utility.plotting import DataTable
from pyNN.parameters import Sequence

# === Parameters ============================================================

firing_period = 100.0    # (ms) interval between spikes
cell_parameters = {
    "tau_m": 10.0,       # (ms)
    "v_thresh": -50.0,   # (mV)
    "v_reset": -60.0,    # (mV)
    "v_rest": -60.0,     # (mV)
    "cm": 1.0,           # (nF)
    "tau_refrac": firing_period / 2,  # (ms) long refractory period to prevent bursting
}
n = 60                   # number of synapses / number of presynaptic neurons
delta_t = 1.0            # (ms) time difference between the firing times of neighbouring neurons
t_stop = 10 * firing_period + n * delta_t
delay = 3.0              # (ms) synaptic time delay

# === Configure the simulator ===============================================

sim, options = get_simulator(("--plot-figure", "Plot the simulation results to a file",
                              {"action": "store_true"}),
                             ("--fit-curve", "Calculate the best-fit curve to the weight-delta_t measurements",
                              {"action": "store_true"}),
                             ("--dendritic-delay-fraction", "What fraction of the total transmission delay is due to dendritic propagation",
                              {"default": 1}),
                             ("--debug", "Print debugging information"))

if options.debug:
    init_logging(None, debug=True)

sim.setup(timestep=0.01, min_delay=delay, max_delay=delay)

# === Build the network =====================================================

def build_spike_sequences(period, duration, n, delta_t):
    """
    Return a spike time generator for `n` neurons (spike sources), where
    all neurons fire with the same period, but neighbouring neurons have a
    relative firing time difference of `delta_t`.
    """
    def spike_time_gen(i):
        """Spike time generator. `i` should be an array of indices."""
        return [Sequence(numpy.arange(period + j * delta_t, duration, period))
                for j in (i - n // 2)]
    return spike_time_gen

spike_sequence_generator = build_spike_sequences(firing_period, t_stop, n, delta_t)
# presynaptic population
p1 = sim.Population(n, sim.SpikeSourceArray(spike_times=spike_sequence_generator),
                    label="presynaptic")
# single postsynaptic neuron
p2 = sim.Population(1, sim.IF_cond_exp(**cell_parameters),
                    initial_values={"v": cell_parameters["v_reset"]}, label="postsynaptic")
# drive to the postsynaptic neuron, ensuring it fires at exact multiples of the firing period
p3 = sim.Population(1, sim.SpikeSourceArray(spike_times=numpy.arange(firing_period - delay, t_stop, firing_period)),
                    label="driver")

# we set the initial weights to be very small, to avoid perturbing the firing times of the
# postsynaptic neurons
stdp_model = sim.STDPMechanism(
    timing_dependence=sim.SpikePairRule(tau_plus=20.0, tau_minus=20.0,
                                        A_plus=0.01, A_minus=0.012),
    weight_dependence=sim.AdditiveWeightDependence(w_min=0, w_max=0.0000001),
    weight=0.00000005,
    delay=delay,
    dendritic_delay_fraction=float(options.dendritic_delay_fraction))
connections = sim.Projection(p1, p2, sim.AllToAllConnector(), stdp_model)

# the connection weight from the driver neuron is very strong, to ensure the
# postsynaptic neuron fires at the correct times
driver_connection = sim.Projection(p3, p2, sim.OneToOneConnector(),
                                   sim.StaticSynapse(weight=10.0, delay=delay))

# == Instrument the network =================================================

p1.record('spikes')
p2.record(['spikes', 'v'])

class WeightRecorder(object):
    """
    Recording of weights is not yet built in to PyNN, so therefore we need
    to construct a callback object, which reads the current weights from
    the projection at regular intervals.
    """

    def __init__(self, sampling_interval, projection):
        self.interval = sampling_interval
        self.projection = projection
        self._weights = []

    def __call__(self, t):
        self._weights.append(self.projection.get('weight', format='list', with_address=False))
        return t + self.interval

    def get_weights(self):
        signal = neo.AnalogSignal(self._weights, units='nA',
                                  sampling_period=self.interval * ms,
                                  name="weight")
        signal.channel_index = neo.ChannelIndex(numpy.arange(len(self._weights[0])))
        return signal

weight_recorder = WeightRecorder(sampling_interval=1.0, projection=connections)

# === Run the simulation =====================================================

sim.run(t_stop, callbacks=[weight_recorder])

# === Save the results, optionally plot a figure =============================

filename = normalized_filename("Results", "simple_stdp", "pkl", options.simulator)
p2.write_data(filename, annotations={'script_name': __file__})

presynaptic_data = p1.get_data().segments[0]
postsynaptic_data = p2.get_data().segments[0]
print("Post-synaptic spike times: %s" % postsynaptic_data.spiketrains[0])

weights = weight_recorder.get_weights()
final_weights = numpy.array(weights[-1])
deltas = delta_t * numpy.arange(n // 2, -n // 2, -1)
print("Final weights: %s" % final_weights)
plasticity_data = DataTable(deltas, final_weights)

if options.fit_curve:
    def double_exponential(t, t0, w0, wp, wn, tau):
        return w0 + numpy.where(t >= t0, wp * numpy.exp(-(t - t0) / tau),
                                wn * numpy.exp((t - t0) / tau))
    p0 = (-1.0, 5e-8, 1e-8, -1.2e-8, 20.0)
    popt, pcov = plasticity_data.fit_curve(double_exponential, p0, ftol=1e-10)
    print("Best fit parameters: t0={0}, w0={1}, wp={2}, wn={3}, tau={4}".format(*popt))

if options.plot_figure:
    from pyNN.utility.plotting import Figure, Panel, DataTable
    figure_filename = filename.replace("pkl", "png")
    Figure(
        # raster plot of the presynaptic neuron spike times
        Panel(presynaptic_data.spiketrains,
              yticks=True, markersize=0.2, xlim=(0, t_stop)),
        # membrane potential of the postsynaptic neuron
        Panel(postsynaptic_data.filter(name='v')[0],
              ylabel="Membrane potential (mV)",
              data_labels=[p2.label], yticks=True, xlim=(0, t_stop)),
        # evolution of the synaptic weights with time
        Panel(weights, xticks=True, yticks=True, xlabel="Time (ms)",
              legend=False, xlim=(0, t_stop)),
        # scatterplot of the final weight of each synapse against the relative
        # timing of pre- and postsynaptic spikes for that synapse
        Panel(plasticity_data,
              xticks=True, yticks=True, xlim=(-n / 2 * delta_t, n / 2 * delta_t),
              ylim=(0.9 * final_weights.min(), 1.1 * final_weights.max()),
              xlabel="t_post - t_pre (ms)", ylabel="Final weight (nA)",
              show_fit=options.fit_curve),
        title="Simple STDP example",
        annotations="Simulated with %s" % options.simulator.upper()
    ).save(figure_filename)
    print(figure_filename)

# === Clean up and quit ========================================================

sim.end()
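The weight change produced by the SpikePairRule in this example follows the classic double-exponential STDP curve, which can be sketched directly. This is an illustration using the example's parameter values, not PyNN's implementation:

```python
import math

def stdp_delta_w(delta_t, A_plus=0.01, A_minus=0.012,
                 tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair under an additive
    pair-based STDP rule; delta_t = t_post - t_pre in ms.
    Pre-before-post (delta_t > 0) potentiates; post-before-pre depresses."""
    if delta_t > 0:
        return A_plus * math.exp(-delta_t / tau_plus)
    return -A_minus * math.exp(delta_t / tau_minus)

# potentiation decays as the pair separation grows
dw_close, dw_far = stdp_delta_w(5.0), stdp_delta_w(40.0)
```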


14.7 Small network created with the Population and Projection classes

# encoding: utf-8
"""
Small network created with the Population and Projection classes

Usage: small_network.py [-h] [--plot-figure] [--debug DEBUG] simulator

positional arguments:
  simulator      neuron, nest, brian or another backend simulator

optional arguments:
  -h, --help     show this help message and exit
  --plot-figure  plot the simulation results to a file
  --debug DEBUG  print debugging information

"""

import numpy
from pyNN.utility import get_simulator, init_logging, normalized_filename
from pyNN.parameters import Sequence
from pyNN.random import RandomDistribution as rnd

sim, options = get_simulator(("--plot-figure", "Plot the simulation results to a file.",
                              {"action": "store_true"}),
                             ("--debug", "Print debugging information"))

if options.debug:
    init_logging(None, debug=True)

# === Define parameters ========================================================

n = 20      # Number of cells
w = 0.002   # synaptic weight (µS)
cell_params = {
    'tau_m':      20.0,   # (ms)
    'tau_syn_E':   2.0,   # (ms)
    'tau_syn_I':   4.0,   # (ms)
    'e_rev_E':     0.0,   # (mV)
    'e_rev_I':   -70.0,   # (mV)
    'tau_refrac':  2.0,   # (ms)
    'v_rest':    -60.0,   # (mV)
    'v_reset':   -70.0,   # (mV)
    'v_thresh':  -50.0,   # (mV)
    'cm':          0.5}   # (nF)
dt = 0.1            # (ms)
syn_delay = 1.0     # (ms)
input_rate = 50.0   # (Hz)
simtime = 1000.0    # (ms)

# === Build the network ========================================================

sim.setup(timestep=dt, max_delay=syn_delay)

cells = sim.Population(n, sim.IF_cond_alpha(**cell_params),
                       initial_values={'v': rnd('uniform', (-60.0, -50.0))},
                       label="cells")

number = int(2 * simtime * input_rate / 1000.0)
numpy.random.seed(26278342)

def generate_spike_times(i):
    gen = lambda: Sequence(numpy.add.accumulate(numpy.random.exponential(1000.0 / input_rate, size=number)))
    if hasattr(i, "__len__"):
        return [gen() for j in i]
    else:
        return gen()
assert generate_spike_times(0).max() > simtime

spike_source = sim.Population(n, sim.SpikeSourceArray(spike_times=generate_spike_times))

spike_source.record('spikes')
cells.record('spikes')
cells[0:2].record(('v', 'gsyn_exc'))

syn = sim.StaticSynapse(weight=w, delay=syn_delay)
input_conns = sim.Projection(spike_source, cells, sim.FixedProbabilityConnector(0.5), syn)

# === Run simulation ===========================================================

sim.run(simtime)

filename = normalized_filename("Results", "small_network", "pkl",
                               options.simulator, sim.num_processes())
cells.write_data(filename, annotations={'script_name': __file__})

print("Mean firing rate: ", cells.mean_spike_count() * 1000.0 / simtime, "Hz")

if options.plot_figure:
    from pyNN.utility.plotting import Figure, Panel
    figure_filename = filename.replace("pkl", "png")
    data = cells.get_data().segments[0]
    vm = data.filter(name="v")[0]
    gsyn = data.filter(name="gsyn_exc")[0]
    Figure(
        Panel(vm, ylabel="Membrane potential (mV)"),
        Panel(gsyn, ylabel="Synaptic conductance (uS)"),
        Panel(data.spiketrains, xlabel="Time (ms)", xticks=True),
        annotations="Simulated with %s" % options.simulator.upper()
    ).save(figure_filename)
    print(figure_filename)

# === Clean up and quit ========================================================

sim.end()


14.8 A demonstration of the responses of different standard neuron models to synaptic input


"""
A demonstration of the responses of different standard neuron models to synaptic input.

This should show that for the current-based synapses, the size of the excitatory
post-synaptic potential (EPSP) is constant, whereas for the conductance-based
synapses it depends on the value of the membrane potential.

Usage: python synaptic_input.py [-h] [--plot-figure] [--debug] simulator

positional arguments:
  simulator      neuron, nest, brian or another backend simulator

optional arguments:
  -h, --help     show this help message and exit
  --plot-figure  Plot the simulation results to a file.
  --debug        Print debugging information

"""

from quantities import ms
from pyNN.utility import get_simulator, init_logging, normalized_filename

# === Configure the simulator ================================================

sim, options = get_simulator(("--plot-figure", "Plot the simulation results to a file.",
                              {"action": "store_true"}),
                             ("--debug", "Print debugging information"))

if options.debug:
    init_logging(None, debug=True)

sim.setup(timestep=0.01, min_delay=1.0)

# === Build and instrument the network =======================================

# for each cell type we create two neurons, one of which we depolarize with
# injected current

cuba_exp = sim.Population(2, sim.IF_curr_exp(tau_m=10.0, i_offset=[0.0, 1.0]),
                          initial_values={"v": [-65, -55]}, label="Exponential, current-based")
cuba_alpha = sim.Population(2, sim.IF_curr_alpha(tau_m=10.0, i_offset=[0.0, 1.0]),
                            initial_values={"v": [-65, -55]}, label="Alpha, current-based")
coba_exp = sim.Population(2, sim.IF_cond_exp(tau_m=10.0, i_offset=[0.0, 1.0]),
                          initial_values={"v": [-65, -55]}, label="Exponential, conductance-based")
coba_alpha = sim.Population(2, sim.IF_cond_alpha(tau_m=10.0, i_offset=[0.0, 1.0]),
                            initial_values={"v": [-65, -55]}, label="Alpha, conductance-based")
v_step = sim.Population(2, sim.Izhikevich(i_offset=[0.0, 0.002]),
                        initial_values={"v": [-70, -67], "u": [-14, -13.4]}, label="Izhikevich")

# we next create a spike source, which will emit spikes at the specified times

spike_times = [25, 50, 80, 90]
stimulus = sim.Population(1, sim.SpikeSourceArray(spike_times=spike_times), label="Input spikes")

# now we connect the spike source to each of the neuron populations, with differing synaptic weights

all_neurons = cuba_exp + cuba_alpha + coba_exp + coba_alpha + v_step

connections = [sim.Projection(stimulus, population,
                              connector=sim.AllToAllConnector(),
                              synapse_type=sim.StaticSynapse(weight=w, delay=2.0),
                              receptor_type="excitatory")
               for population, w in zip(all_neurons.populations, [1.6, 4.0, 0.03, 0.12, 1.0])]

# finally, we set up recording of the membrane potential

all_neurons.record('v')

# === Run the simulation =====================================================

sim.run(100.0)

# === Calculate the height of the first EPSP =================================

print("Height of first EPSP:")
for population in all_neurons.populations:
    # retrieve the recorded data
    vm = population.get_data().segments[0].filter(name='v')[0]
    # take the data between the first and second incoming spikes
    vm12 = vm.time_slice(spike_times[0] * ms, spike_times[1] * ms)
    # calculate and print the EPSP height
    for channel in (0, 1):
        v_init = vm12[:, channel][0]
        height = vm12[:, channel].max() - v_init
        print("  {:<30} at {}: {}".format(population.label, v_init, height))

# === Save the results, optionally plot a figure =============================

filename = normalized_filename("Results", "synaptic_input", "pkl", options.simulator)
all_neurons.write_data(filename, annotations={'script_name': __file__})

if options.plot_figure:
    from pyNN.utility.plotting import Figure, Panel
    figure_filename = filename.replace("pkl", "png")
    Figure(
        Panel(cuba_exp.get_data().segments[0].filter(name='v')[0],
              ylabel="Membrane potential (mV)",
              data_labels=[cuba_exp.label], yticks=True, ylim=(-66, -50)),
        Panel(cuba_alpha.get_data().segments[0].filter(name='v')[0],
              data_labels=[cuba_alpha.label], yticks=True, ylim=(-66, -50)),
        Panel(coba_exp.get_data().segments[0].filter(name='v')[0],
              data_labels=[coba_exp.label], yticks=True, ylim=(-66, -50)),
        Panel(coba_alpha.get_data().segments[0].filter(name='v')[0],
              data_labels=[coba_alpha.label], yticks=True, ylim=(-66, -50)),
        Panel(v_step.get_data().segments[0].filter(name='v')[0],
              xticks=True, xlabel="Time (ms)",
              data_labels=[v_step.label], yticks=True, ylim=(-71, -65)),
        title="Responses of standard neuron models to synaptic input",
        annotations="Simulated with %s" % options.simulator.upper()
    ).save(figure_filename)
    print(figure_filename)

# === Clean up and quit ========================================================

sim.end()
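The EPSP heights printed above illustrate the point made in the docstring: for a conductance-based synapse the synaptic current scales with the driving force (E_rev − V_m), so the depolarized cell sees a smaller EPSP. The scaling can be checked with simple arithmetic (the peak conductance used here is an arbitrary illustration, not a value taken from the script):

```python
e_rev = 0.0     # (mV) excitatory reversal potential
g_peak = 0.01   # (µS) illustrative peak conductance

currents = {}
for v_m in (-65.0, -55.0):
    # peak synaptic current of a conductance-based synapse: g * (E_rev - V_m)
    currents[v_m] = g_peak * (e_rev - v_m)   # (nA)
    print("V_m = %g mV -> peak synaptic current %g nA" % (v_m, currents[v_m]))

# the depolarized cell draws about 15% less synaptic current, so its EPSP is
# smaller; a current-based synapse injects the same current in both cases
```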


14.9 Example of depressing and facilitating synapses


# encoding: utf-8
"""
Example of depressing and facilitating synapses

Usage: tsodyksmarkram.py [-h] [--plot-figure] [--debug DEBUG] simulator

positional arguments:
  simulator      neuron, nest, brian or another backend simulator

optional arguments:
  -h, --help     show this help message and exit
  --plot-figure  Plot the simulation results to a file.
  --debug DEBUG  Print debugging information

"""

import numpy
from pyNN.utility import get_simulator, init_logging, normalized_filename

# === Configure the simulator ================================================

sim, options = get_simulator(("--plot-figure", "Plot the simulation results to a file.",
                              {"action": "store_true"}),
                             ("--debug", "Print debugging information"))

if options.debug:
    init_logging(None, debug=True)

sim.setup(quit_on_end=False)

# === Build and instrument the network =======================================

spike_source = sim.Population(1, sim.SpikeSourceArray(spike_times=numpy.arange(10, 100, 10)))

connector = sim.AllToAllConnector()

synapse_types = {
    'static': sim.StaticSynapse(weight=0.01, delay=0.5),
    'depressing': sim.TsodyksMarkramSynapse(U=0.5, tau_rec=800.0, tau_facil=0.0,
                                            weight=0.01, delay=0.5),
    'facilitating': sim.TsodyksMarkramSynapse(U=0.04, tau_rec=100.0,
                                              tau_facil=1000.0, weight=0.01,
                                              delay=0.5),
}

populations = {}
projections = {}
for label in 'static', 'depressing', 'facilitating':
    populations[label] = sim.Population(3, sim.IF_cond_exp(e_rev_I=-75, tau_syn_I=[1.2, 6.7, 4.3]),
                                        label=label)
    populations[label].record(['v', 'gsyn_inh'])
    projections[label] = sim.Projection(spike_source, populations[label], connector,
                                        receptor_type='inhibitory',
                                        synapse_type=synapse_types[label])

spike_source.record('spikes')

# === Run the simulation =====================================================

sim.run(200.0)

# === Save the results, optionally plot a figure =============================

for label, p in populations.items():
    filename = normalized_filename("Results", "tsodyksmarkram_%s" % label,
                                   "pkl", options.simulator)
    p.write_data(filename, annotations={'script_name': __file__})

if options.plot_figure:
    from pyNN.utility.plotting import Figure, Panel
    figure_filename = normalized_filename("Results", "tsodyksmarkram",
                                          "png", options.simulator)
    panels = []
    for variable in ('gsyn_inh', 'v'):
        for population in populations.values():
            panels.append(
                Panel(population.get_data().segments[0].filter(name=variable)[0],
                      data_labels=[population.label], yticks=True),
            )
    # add ylabel to top panel in each group
    panels[0].options.update(ylabel=u'Synaptic conductance (µS)')
    panels[3].options.update(ylabel='Membrane potential (mV)')
    # add xticks and xlabel to final panel
    panels[-1].options.update(xticks=True, xlabel="Time (ms)")
    Figure(*panels,
           title="Example of static, facilitating and depressing synapses",
           annotations="Simulated with %s" % options.simulator.upper()
    ).save(figure_filename)
    print(figure_filename)

# === Clean up and quit ========================================================

sim.end()
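The qualitative behaviour of the two parameter sets above can be reproduced in a few lines of plain Python, using an event-driven update of the Tsodyks–Markram model. This is a sketch of the textbook recurrence, not PyNN's internal implementation, and model variants differ in exactly when the facilitation increment is applied:

```python
import math

def tm_psc_amplitudes(spike_times, U, tau_rec, tau_facil):
    """Relative PSC amplitude at each spike of a Tsodyks-Markram synapse."""
    x, u = 1.0, U          # x: fraction of available resources, u: utilisation
    amplitudes = []
    last_t = None
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 + (x - 1.0) * math.exp(-dt / tau_rec)    # resource recovery
            if tau_facil > 0:
                u = U + (u - U) * math.exp(-dt / tau_facil)  # facilitation decay
            else:
                u = U
        amplitudes.append(u * x)
        x -= u * x                  # resources consumed by this spike
        if tau_facil > 0:
            u += U * (1.0 - u)      # facilitation increment
        last_t = t
    return amplitudes

spikes = list(range(10, 100, 10))   # same 10 ms spacing as the spike source above
dep = tm_psc_amplitudes(spikes, U=0.5, tau_rec=800.0, tau_facil=0.0)
fac = tm_psc_amplitudes(spikes, U=0.04, tau_rec=100.0, tau_facil=1000.0)
print(dep[0], dep[-1])   # successive amplitudes shrink
print(fac[0], fac[-1])   # amplitudes grow from the small initial value
```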


14.10 A demonstration of the use of callbacks to vary the rate of a SpikeSourcePoisson

"""
A demonstration of the use of callbacks to vary the rate of a SpikeSourcePoisson.

Every 200 ms, the Poisson firing rate is increased by 20 spikes/s

Usage: varying_poisson.py [-h] [--plot-figure] simulator

positional arguments:
  simulator      neuron, nest, brian or another backend simulator

optional arguments:
  -h, --help     show this help message and exit
  --plot-figure  Plot the simulation results to a file.

"""

import numpy as np
from pyNN.utility import get_simulator, normalized_filename, ProgressBar
from pyNN.utility.plotting import Figure, Panel

sim, options = get_simulator(("--plot-figure", "Plot the simulation results to a file.",
                              {"action": "store_true"}))

rate_increment = 20
interval = 200


class SetRate(object):
    """
    A callback which changes the firing rate of a population of Poisson
    processes at a fixed interval.
    """

    def __init__(self, population, rate_generator, interval=20.0):
        assert isinstance(population.celltype, sim.SpikeSourcePoisson)
        self.population = population
        self.interval = interval
        self.rate_generator = rate_generator

    def __call__(self, t):
        try:
            self.population.set(rate=next(self.rate_generator))
        except StopIteration:
            pass
        return t + self.interval


class MyProgressBar(object):
    """
    A callback which draws a progress bar in the terminal.
    """

    def __init__(self, interval, t_stop):
        self.interval = interval
        self.t_stop = t_stop
        self.pb = ProgressBar(width=int(t_stop / interval), char=".")

    def __call__(self, t):
        self.pb(t / self.t_stop)
        return t + self.interval


sim.setup()

# === Create a population of Poisson processes ===============================

p = sim.Population(50, sim.SpikeSourcePoisson())
p.record('spikes')

# === Run the simulation, with two callback functions ========================

rate_generator = iter(range(0, 100, rate_increment))
sim.run(1000, callbacks=[MyProgressBar(10.0, 1000.0),
                         SetRate(p, rate_generator, interval)])

# === Retrieve recorded data, and count the spikes in each interval ==========

data = p.get_data().segments[0]

all_spikes = np.hstack([st.magnitude for st in data.spiketrains])
spike_counts = [((all_spikes >= x) & (all_spikes < x + interval)).sum()
                for x in range(0, 1000, interval)]
expected_spike_counts = [p.size * rate * interval / 1000.0
                         for rate in range(0, 100, rate_increment)]

print("\nActual spike counts:        {}".format(spike_counts))
print("Expected mean spike counts: {}".format(expected_spike_counts))

if options.plot_figure:
    Figure(
        Panel(data.spiketrains, xlabel="Time (ms)", xticks=True),
        title="Time varying Poisson spike trains",
        annotations="Simulated with %s" % options.simulator.upper()
    ).save(normalized_filename("Results", "varying_poisson", "png", options.simulator))

sim.end()
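The callback protocol used here is simple: `run()` invokes each callback with the current time, and the value the callback returns is the time at which it should next be invoked. A minimal pure-Python mimic of that scheduling loop (an illustration of the contract, not PyNN's actual implementation):

```python
def run_with_callbacks(t_stop, callbacks):
    """Minimal mimic of run(t_stop, callbacks=...) scheduling."""
    t = 0.0
    # each callback is invoked with the current time and returns the
    # time at which it next wants to be invoked
    next_times = [cb(t) for cb in callbacks]
    while t < t_stop:
        t = min(next_times)            # advance to the earliest scheduled callback
        if t >= t_stop:
            break
        for i, cb in enumerate(callbacks):
            if next_times[i] <= t:
                next_times[i] = cb(t)

calls = []
run_with_callbacks(1000, [lambda t: calls.append(t) or t + 200])
print(calls)   # [0, 200, 400, 600, 800]
```

With a 200 ms return interval, the callback fires five times over a 1000 ms run, which is why `SetRate` above steps through five rate values.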


14.11 Example of simple stochastic synapses


# encoding: utf-8
"""
Example of simple stochastic synapses

"""

import matplotlib
matplotlib.use('Agg')
import numpy
from pyNN.utility import get_simulator, init_logging, normalized_filename

# === Configure the simulator ================================================

sim, options = get_simulator(("--plot-figure", "Plot the simulation results to a file.",
                              {"action": "store_true"}),
                             ("--debug", "Print debugging information"))

if options.debug:
    init_logging(None, debug=True)

sim.setup(quit_on_end=False)

# === Build and instrument the network =======================================

spike_source = sim.Population(1, sim.SpikeSourceArray(spike_times=numpy.arange(10, 100, 10)))

connector = sim.AllToAllConnector()

synapse_types = {
    'static': sim.StaticSynapse(weight=0.01, delay=0.5),
    'stochastic': sim.SimpleStochasticSynapse(p=0.5, weight=0.02, delay=0.5)
}

populations = {}
projections = {}
for label in 'static', 'stochastic':
    populations[label] = sim.Population(1, sim.IF_cond_exp(), label=label)
    populations[label].record(['v', 'gsyn_inh'])
    projections[label] = sim.Projection(spike_source, populations[label], connector,
                                        receptor_type='inhibitory',
                                        synapse_type=synapse_types[label])

spike_source.record('spikes')

# === Run the simulation =====================================================

sim.run(200.0)

# === Save the results, optionally plot a figure =============================

for label, p in populations.items():
    filename = normalized_filename("Results", "stochastic_synapses_%s" % label,
                                   "pkl", options.simulator)
    p.write_data(filename, annotations={'script_name': __file__})

if options.plot_figure:
    from pyNN.utility.plotting import Figure, Panel
    figure_filename = normalized_filename("Results", "stochastic_synapses_",
                                          "png", options.simulator)
    panels = []
    for variable in ('gsyn_inh', 'v'):
        for population in populations.values():
            panels.append(
                Panel(population.get_data().segments[0].filter(name=variable)[0],
                      data_labels=[population.label], yticks=True),
            )
    # add ylabel to top panel in each group
    panels[0].options.update(ylabel=u'Synaptic conductance (µS)')
    panels[3].options.update(ylabel='Membrane potential (mV)')
    # add xticks and xlabel to final panel
    panels[-1].options.update(xticks=True, xlabel="Time (ms)")
    Figure(*panels,
           title="Example of simple stochastic synapses",
           annotations="Simulated with %s" % options.simulator.upper()
    ).save(figure_filename)
    print(figure_filename)

# === Clean up and quit ========================================================

sim.end()
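The parameter choice above is deliberate: with `SimpleStochasticSynapse(p=0.5, weight=0.02)`, each presynaptic spike is transmitted independently with probability 0.5, so the expected delivered weight per spike is 0.01, matching the static synapse. A quick Monte Carlo sketch of that equivalence (plain NumPy, independent of PyNN):

```python
import numpy as np

rng = np.random.default_rng(1)
p_transmit, weight = 0.5, 0.02
n_spikes = 100_000

# each presynaptic spike delivers `weight` with probability p_transmit,
# otherwise nothing
delivered = rng.random(n_spikes) < p_transmit
mean_weight = (delivered * weight).mean()
print(mean_weight)   # close to p_transmit * weight = 0.01
```

Individual trials, of course, fluctuate: any single postsynaptic response is either the full 0.02 µS event or nothing, which is what distinguishes the two traces in the figure.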


14.12 Example of facilitating and depressing synapses in deterministic and stochastic versions


# encoding: utf-8
"""
Example of facilitating and depressing synapses in deterministic and stochastic versions

"""

import matplotlib
matplotlib.use('Agg')
import numpy as np
import neo
from pyNN.utility import get_simulator, init_logging, normalized_filename

# === Configure the simulator ================================================

sim, options = get_simulator(("--plot-figure", "Plot the simulation results to a file.",
                              {"action": "store_true"}),
                             ("--debug", "Print debugging information"))

if options.debug:
    init_logging(None, debug=True)

sim.setup(quit_on_end=False)

# === Build and instrument the network =======================================

spike_times = np.hstack((np.arange(10, 100, 10), np.arange(250, 350, 10)))
spike_source = sim.Population(1, sim.SpikeSourceArray(spike_times=spike_times))

connector = sim.AllToAllConnector()

depressing = dict(U=0.8, tau_rec=100.0, tau_facil=0.0, weight=0.01, delay=0.5)
facilitating = dict(U=0.04, tau_rec=50.0, tau_facil=200.0, weight=0.01, delay=0.5)

synapse_types = {
    'depressing, deterministic': sim.TsodyksMarkramSynapse(**depressing),
    'depressing, stochastic': sim.StochasticTsodyksMarkramSynapse(**depressing),
    'facilitating, deterministic': sim.TsodyksMarkramSynapse(**facilitating),
    'facilitating, stochastic': sim.StochasticTsodyksMarkramSynapse(**facilitating),
}

populations = {}
projections = {}
for label in synapse_types:
    populations[label] = sim.Population(1000, sim.IF_cond_exp(e_rev_I=-75, tau_syn_I=4.3),
                                        label=label)
    populations[label].record('gsyn_inh')
    projections[label] = sim.Projection(spike_source, populations[label], connector,
                                        receptor_type='inhibitory',
                                        synapse_type=synapse_types[label])

spike_source.record('spikes')

# === Run the simulation =====================================================

sim.run(400.0)

# === Save the results, optionally plot a figure =============================

for label, p in populations.items():
    filename = normalized_filename("Results", "stochastic_comparison_%s" % label,
                                   "pkl", options.simulator)
    p.write_data(filename, annotations={'script_name': __file__})

if options.plot_figure:
    from pyNN.utility.plotting import Figure, Panel
    figure_filename = "Results/stochastic_comparison_{}.png".format(options.simulator)

    # for the stochastic synapses, average the conductance over the 1000
    # target neurons and store the mean as an extra analog signal
    data = {}
    for label in synapse_types:
        data[label] = populations[label].get_data().segments[0]
        if 'stochastic' in label:
            gsyn = data[label].filter(name='gsyn_inh')[0]
            gsyn_mean = neo.AnalogSignal(gsyn.mean(axis=1).reshape(-1, 1),
                                         sampling_rate=gsyn.sampling_rate)
            gsyn_mean.channel_index = neo.ChannelIndex(np.array([0]))
            gsyn_mean.name = 'gsyn_inh_mean'
            data[label].analogsignals.append(gsyn_mean)

    panels = [
        Panel(data['depressing, deterministic'].filter(name='gsyn_inh')[0][:, 0],
              data_labels=['depressing, deterministic'], yticks=True,
              ylim=[0, 0.008]),
        Panel(data['depressing, stochastic'].filter(name='gsyn_inh_mean')[0],
              data_labels=['depressing, stochastic mean'], yticks=True,
              ylim=[0, 0.008]),
        Panel(data['facilitating, deterministic'].filter(name='gsyn_inh')[0][:, 0],
              data_labels=['facilitating, deterministic'], yticks=True,
              ylim=[0, 0.002]),
        Panel(data['facilitating, stochastic'].filter(name='gsyn_inh_mean')[0],
              data_labels=['facilitating, stochastic mean'], yticks=True,
              ylim=[0, 0.002]),
    ]
    # add ylabel to top panel
    panels[0].options.update(ylabel=u'Synaptic conductance (µS)')
    # add xticks and xlabel to final panel
    panels[-1].options.update(xticks=True, xlabel="Time (ms)")
    Figure(*panels,
           title="Example of facilitating and depressing synapses in deterministic and stochastic versions",
           annotations="Simulated with %s" % options.simulator.upper()
    ).save(figure_filename)
    print(figure_filename)

# === Clean up and quit ========================================================

sim.end()

14.13 Balanced network of excitatory and inhibitory neurons

# coding: utf-8
"""
Balanced network of excitatory and inhibitory neurons.

An implementation of benchmarks 1 and 2 from

    Brette et al. (2007) Journal of Computational Neuroscience 23: 349-398

The network is based on the CUBA and COBA models of Vogels & Abbott
(J. Neurosci., 2005). The model consists of a network of excitatory and
inhibitory neurons, connected via current-based "exponential"
synapses (instantaneous rise, exponential decay).

Usage: python VAbenchmarks.py [-h] [--plot-figure] [--use-views] [--use-assembly]
                              [--use-csa] [--debug DEBUG]
                              simulator benchmark

positional arguments:
  simulator       neuron, nest, brian or another backend simulator
  benchmark       either CUBA or COBA

optional arguments:
  -h, --help      show this help message and exit
  --plot-figure   plot the simulation results to a file
  --use-views     use population views in creating the network
  --use-assembly  use assemblies in creating the network
  --use-csa       use the Connection Set Algebra to define the connectivity
  --debug DEBUG   print debugging information

Andrew Davison, UNIC, CNRS
August 2006

"""

import socket
from math import *
from pyNN.utility import get_simulator, Timer, ProgressBar, init_logging, normalized_filename
from pyNN.random import NumpyRNG, RandomDistribution

# === Configure the simulator ================================================

sim, options = get_simulator(
    ("benchmark", "either CUBA or COBA"),
    ("--plot-figure", "plot the simulation results to a file", {"action": "store_true"}),
    ("--use-views", "use population views in creating the network", {"action": "store_true"}),
    ("--use-assembly", "use assemblies in creating the network", {"action": "store_true"}),
    ("--use-csa", "use the Connection Set Algebra to define the connectivity", {"action": "store_true"}),
    ("--debug", "print debugging information"))

if options.use_csa:
    import csa

if options.debug:
    init_logging(None, debug=True)

timer = Timer()

# === Define parameters ========================================================

threads = 1
rngseed = 98765
parallel_safe = True

n = 4000          # number of cells
r_ei = 4.0        # number of excitatory cells:number of inhibitory cells
pconn = 0.02      # connection probability
stim_dur = 50.    # (ms) duration of random stimulation
rate = 100.       # (Hz) frequency of the random stimulation

dt = 0.1          # (ms) simulation timestep
tstop = 1000      # (ms) simulation duration
delay = 0.2

# Cell parameters
area = 20000.     # (µm²)
tau_m = 20.       # (ms)
cm = 1.           # (µF/cm²)
g_leak = 5e-5     # (S/cm²)
if options.benchmark == "COBA":
    E_leak = -60.  # (mV)
elif options.benchmark == "CUBA":
    E_leak = -49.  # (mV)
v_thresh = -50.   # (mV)
v_reset = -60.    # (mV)
t_refrac = 5.     # (ms) (clamped at v_reset)
v_mean = -60.     # (mV) 'mean' membrane potential, for calculating CUBA weights
tau_exc = 5.      # (ms)
tau_inh = 10.     # (ms)

# Synapse parameters
if options.benchmark == "COBA":
    Gexc = 4.     # (nS)
    Ginh = 51.    # (nS)
elif options.benchmark == "CUBA":
    Gexc = 0.27   # (nS) # These weights should be similar to the COBA weights
    Ginh = 4.5    # (nS) # but the depolarising drift should be taken into account
Erev_exc = 0.     # (mV)
Erev_inh = -80.   # (mV)

### what is the synaptic delay???

# === Calculate derived parameters =============================================

area = area * 1e-8                  # convert to cm²
cm = cm * area * 1000               # convert to nF
Rm = 1e-6 / (g_leak * area)         # membrane resistance in MΩ
assert tau_m == cm * Rm             # just to check
n_exc = int(round((n * r_ei / (1 + r_ei))))  # number of excitatory cells
n_inh = n - n_exc                   # number of inhibitory cells
if options.benchmark == "COBA":
    celltype = sim.IF_cond_exp
    w_exc = Gexc * 1e-3             # we convert conductances to µS
    w_inh = Ginh * 1e-3
elif options.benchmark == "CUBA":
    celltype = sim.IF_curr_exp
    w_exc = 1e-3 * Gexc * (Erev_exc - v_mean)  # (nA) weight of excitatory synapses
    w_inh = 1e-3 * Ginh * (Erev_inh - v_mean)  # (nA)
    assert w_exc > 0; assert w_inh < 0
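The unit conversions above are worth checking by hand: 20000 µm² is 2×10⁻⁴ cm², so the capacitance is 1 µF/cm² × 2×10⁻⁴ cm² = 0.2 nF, the leak resistance is 1/(5×10⁻⁵ S/cm² × 2×10⁻⁴ cm²) = 100 MΩ, and τ = RC = 20 ms, exactly the `tau_m` the script asserts against. A standalone restatement of that arithmetic, including the CUBA weight conversion (values copied from the parameter block above):

```python
area = 20000. * 1e-8           # 20000 µm² -> 2e-4 cm²
cm = 1. * area * 1000          # 1 µF/cm² x area -> 0.2 nF
g_leak = 5e-5                  # (S/cm²)
Rm = 1e-6 / (g_leak * area)    # leak resistance -> 100 MΩ
tau_m = cm * Rm                # nF x MΩ = ms -> 20 ms, matching the script

# CUBA weights: the current a conductance G would inject at V = v_mean
v_mean, Erev_exc, Erev_inh = -60., 0., -80.
Gexc, Ginh = 0.27, 4.5                       # (nS)
w_exc = 1e-3 * Gexc * (Erev_exc - v_mean)    # 0.0162 nA, excitatory
w_inh = 1e-3 * Ginh * (Erev_inh - v_mean)    # -0.09 nA, inhibitory
print(tau_m, w_exc, w_inh)
```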

# === Build the network ========================================================

extra = {'threads' : threads,'filename': "va_%s.xml" % options.benchmark,'label': 'VA'}

if options.simulator == "neuroml":extra["file"] = "VAbenchmarks.xml"

node_id = sim.setup(timestep=dt, min_delay=delay, max_delay=1.0, **extra)np = sim.num_processes()

host_name = socket.gethostname()print("Host #%d is on %s" % (node_id + 1, host_name))

(continues on next page)

108 Chapter 14. Examples

Page 113: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

(continued from previous page)

print("%s Initialising the simulator with %d thread(s)..." % (node_id, extra['threads→˓']))

cell_params = {'tau_m' : tau_m, 'tau_syn_E' : tau_exc, 'tau_syn_I' : tau_inh,'v_rest' : E_leak, 'v_reset' : v_reset, 'v_thresh' : v_thresh,'cm' : cm, 'tau_refrac' : t_refrac}

if (options.benchmark == "COBA"):cell_params['e_rev_E'] = Erev_exccell_params['e_rev_I'] = Erev_inh

timer.start()

print("%s Creating cell populations..." % node_id)if options.use_views:

# create a single population of neurons, and then use population views to define# excitatory and inhibitory sub-populationsall_cells = sim.Population(n_exc + n_inh, celltype(**cell_params), label="All

→˓Cells")exc_cells = all_cells[:n_exc]exc_cells.label = "Excitatory cells"inh_cells = all_cells[n_exc:]inh_cells.label = "Inhibitory cells"

else:# create separate populations for excitatory and inhibitory neuronsexc_cells = sim.Population(n_exc, celltype(**cell_params), label="Excitatory_Cells

→˓")inh_cells = sim.Population(n_inh, celltype(**cell_params), label="Inhibitory_Cells

→˓")if options.use_assembly:

# group the populations into an assemblyall_cells = exc_cells + inh_cells

if options.benchmark == "COBA":ext_stim = sim.Population(20, sim.SpikeSourcePoisson(rate=rate, duration=stim_

→˓dur), label="expoisson")rconn = 0.01ext_conn = sim.FixedProbabilityConnector(rconn)ext_syn = sim.StaticSynapse(weight=0.1)

print("%s Initialising membrane potential to random values..." % node_id)rng = NumpyRNG(seed=rngseed, parallel_safe=parallel_safe)uniformDistr = RandomDistribution('uniform', low=v_reset, high=v_thresh, rng=rng)if options.use_views:

all_cells.initialize(v=uniformDistr)else:

exc_cells.initialize(v=uniformDistr)inh_cells.initialize(v=uniformDistr)

print("%s Connecting populations..." % node_id)progress_bar = ProgressBar(width=20)if options.use_csa:

connector = sim.CSAConnector(csa.cset(csa.random(pconn)))else:

connector = sim.FixedProbabilityConnector(pconn, rng=rng, callback=progress_bar)exc_syn = sim.StaticSynapse(weight=w_exc, delay=delay)

(continues on next page)

14.13. Balanced network of excitatory and inhibitory neurons 109

Page 114: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

(continued from previous page)

inh_syn = sim.StaticSynapse(weight=w_inh, delay=delay)

connections = {}
if options.use_views or options.use_assembly:
    connections['exc'] = sim.Projection(exc_cells, all_cells, connector, exc_syn,
                                        receptor_type='excitatory')
    connections['inh'] = sim.Projection(inh_cells, all_cells, connector, inh_syn,
                                        receptor_type='inhibitory')
    if (options.benchmark == "COBA"):
        connections['ext'] = sim.Projection(ext_stim, all_cells, ext_conn, ext_syn,
                                            receptor_type='excitatory')
else:
    connections['e2e'] = sim.Projection(exc_cells, exc_cells, connector, exc_syn,
                                        receptor_type='excitatory')
    connections['e2i'] = sim.Projection(exc_cells, inh_cells, connector, exc_syn,
                                        receptor_type='excitatory')
    connections['i2e'] = sim.Projection(inh_cells, exc_cells, connector, inh_syn,
                                        receptor_type='inhibitory')
    connections['i2i'] = sim.Projection(inh_cells, inh_cells, connector, inh_syn,
                                        receptor_type='inhibitory')
    if (options.benchmark == "COBA"):
        connections['ext2e'] = sim.Projection(ext_stim, exc_cells, ext_conn, ext_syn,
                                              receptor_type='excitatory')
        connections['ext2i'] = sim.Projection(ext_stim, inh_cells, ext_conn, ext_syn,
                                              receptor_type='excitatory')

# === Setup recording ==========================================================
print("%s Setting up recording..." % node_id)
if options.use_views or options.use_assembly:
    all_cells.record('spikes')
    exc_cells[[0, 1]].record('v')
else:
    exc_cells.record('spikes')
    inh_cells.record('spikes')
    exc_cells[0, 1].record('v')

buildCPUTime = timer.diff()

# === Save connections to file =================================================

# for prj in connections.keys():
#     connections[prj].saveConnections('Results/VAbenchmark_%s_%s_%s_np%d.conn' %
#                                      (benchmark, prj, options.simulator, np))

saveCPUTime = timer.diff()

# === Run simulation ===========================================================

print("%d Running simulation..." % node_id)

sim.run(tstop)

simCPUTime = timer.diff()

E_count = exc_cells.mean_spike_count()
I_count = inh_cells.mean_spike_count()

# === Print results to file ====================================================


print("%d Writing data to file..." % node_id)

filename = normalized_filename("Results", "VAbenchmarks_%s_exc" % options.benchmark,
                               "pkl", options.simulator, np)
exc_cells.write_data(filename,
                     annotations={'script_name': __file__})
inh_cells.write_data(filename.replace("exc", "inh"),
                     annotations={'script_name': __file__})

writeCPUTime = timer.diff()

if options.use_views or options.use_assembly:
    connections = "%d e→e,i  %d i→e,i" % (connections['exc'].size(),
                                          connections['inh'].size())
else:
    connections = u"%d e→e  %d e→i  %d i→e  %d i→i" % (connections['e2e'].size(),
                                                       connections['e2i'].size(),
                                                       connections['i2e'].size(),
                                                       connections['i2i'].size())

if node_id == 0:
    print("\n--- Vogels-Abbott Network Simulation ---")
    print("Nodes                  : %d" % np)
    print("Simulation type        : %s" % options.benchmark)
    print("Number of Neurons      : %d" % n)
    print("Number of Synapses     : %s" % connections)
    print("Excitatory conductance : %g nS" % Gexc)
    print("Inhibitory conductance : %g nS" % Ginh)
    print("Excitatory rate        : %g Hz" % (E_count * 1000.0 / tstop,))
    print("Inhibitory rate        : %g Hz" % (I_count * 1000.0 / tstop,))
    print("Build time             : %g s" % buildCPUTime)
    # print("Save connections time  : %g s" % saveCPUTime)
    print("Simulation time        : %g s" % simCPUTime)
    print("Writing time           : %g s" % writeCPUTime)

# === Finished with simulator ==================================================

sim.end()
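A note on the rates reported by the script above: mean_spike_count() returns the average number of spikes per neuron over the whole run, and tstop is expressed in milliseconds, which is why the report multiplies by 1000.0 before dividing by tstop. A minimal standalone sketch of that conversion (plain Python; the numbers are illustrative, not taken from the benchmark):

```python
def mean_rate_hz(mean_spike_count, tstop_ms):
    """Convert a mean spike count accumulated over a run of tstop_ms
    milliseconds into a mean firing rate in Hz (spikes per second)."""
    return mean_spike_count * 1000.0 / tstop_ms

# An average of 5 spikes per neuron over a 1000 ms run corresponds to 5 Hz.
print(mean_rate_hz(5.0, 1000.0))
```

This mirrors the expressions E_count * 1000.0 / tstop and I_count * 1000.0 / tstop in the report section.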


CHAPTER 15

Publications about, relating to or using PyNN

• Schmuker, Michael, Pfeil, Thomas and Nawrot, Martin Paul (2014) A neuromorphic network for generic multivariate data classification. Proceedings of the National Academy of Sciences 111: 2081-2086. doi: 10.1073/pnas.1303053111 [link]

• Kaplan, BA, Khoei, MA, Lansner, A, & Perrinet, LU (2014) Signature of an anticipatory response in area V1 as modeled by a probabilistic model and a spiking neural network. In: Neural Networks (IJCNN), 2014 International Joint Conference on (pp. 3205-3212). Beijing, China. IEEE. doi: 10.1109/IJCNN.2014.6889847 [link]

• Djurfeldt M., Davison A.P. and Eppler J.M. (2014) Efficient generation of connectivity in neuronal networks from simulator-independent descriptions. Frontiers in Neuroinformatics 8:43. doi: 10.3389/fninf.2014.00043 [link]

• Antolík J. and Davison A.P. (2013) Integrated workflows for spiking neuronal network simulations. Frontiers in Neuroinformatics 7:34. doi: 10.3389/fninf.2013.00034 [link]

• Pfeil, Thomas, Grübl, Andreas, Jeltsch, Sebastian, Müller, Eric, Müller, Paul, Petrovici, Mihai A., Schmuker, Michael, Brüderle, Daniel, Schemmel, Johannes and Meier, Karlheinz (2013) Six networks on a universal neuromorphic computing substrate. Frontiers in Neuroscience 7:11. doi: 10.3389/fnins.2013.00011 [link]

• Kaplan BA, Lansner A, Masson GS and Perrinet LU (2013) Anisotropic connectivity implements motion-based prediction in a spiking neural network. Front. Comput. Neurosci. 7:112. doi: 10.3389/fncom.2013.00112 [link]

• Brüderle D., Petrovici M.A., Vogginger B., Ehrlich M., Pfeil T., Millner S., Grübl A., Wendt K., Müller E., Schwartz M.O., Husmann de Oliveira D., Jeltsch S., Fieres J., Schilling M., Müller P., Breitwieser O., Petkov V., Muller L., Davison A.P., Krishnamurthy P., Kremkow J., Lundqvist M., Muller E., Partzsch J., Scholze S., Zühl L., Mayr C., Destexhe A., Diesmann M., Potjans T.C., Lansner A., Schüffny R., Schemmel J., Meier K. (2011) A Comprehensive Workflow for General-Purpose Neural Modeling with Highly Configurable Neuromorphic Hardware Systems. Biological Cybernetics 104: 263-296. doi: 10.1007/s00422-011-0435-9 [link]

• Galluppi, Francesco, Rast, Alexander, Davies, Sergio and Furber, Steve (2010) A general-purpose model translation system for a universal neural chip. Neural Information Processing. Theory and Algorithms; Lecture Notes in Computer Science vol 6443, pp. 58-65 [link]


• J. Nageswaran, N. Dutt, J. L. Krichmar, A. Nicolau, A. V. Veidenbaum (2009) A configurable simulation environment for the efficient simulation of large-scale spiking neural networks on graphics processors. Neural Networks 22:5-6, doi:10.1016/j.neunet.2009.06.028. [link]

• Davison AP, Hines M and Muller E (2009) Trends in programming languages for neuroscience simulations. Front. Neurosci. doi:10.3389/neuro.01.036.2009. [link]

• Davison AP, Brüderle D, Eppler J, Kremkow J, Muller E, Pecevski D, Perrinet L and Yger P (2009) PyNN: a common interface for neuronal network simulators. Front. Neuroinform. 2:11. doi:10.3389/neuro.11.011.2008. [link]

• Brüderle D, Muller E, Davison A, Muller E, Schemmel J and Meier K (2009) Establishing a novel modeling tool: a python-based interface for a neuromorphic hardware system. Front. Neuroinform. 3:17. doi:10.3389/neuro.11.017.2009. [link]

• Bednar JA (2009) Topographica: building and analyzing map-level simulations from Python, C/C++, MATLAB, NEST, or NEURON components. Front. Neuroinform. 3:8. doi:10.3389/neuro.11.008.2009. [link]

• Goodman D and Brette R (2008) Brian: a simulator for spiking neural networks in Python. Front. Neuroinform. 2:5. doi:10.3389/neuro.11.005.2008. [link]

• Pecevski D, Natschläger T and Schuch K (2009) PCSIM: a parallel simulation environment for neural circuits fully integrated with Python. Front. Neuroinform. 3:11. doi:10.3389/neuro.11.011.2009. [link]

• Ray S and Bhalla US (2008) PyMOOSE: interoperable scripting in Python for MOOSE. Front. Neuroinform. 2:6. doi:10.3389/neuro.11.006.2008. [link]

• Sharon Crook, R Angus Silver and Padraig Gleeson (2009) Describing and exchanging models of neurons and neuronal networks with NeuroML. BMC Neuroscience, 10(Suppl 1):L1. doi:10.1186/1471-2202-10-S1-L1. [link]

• D. Brüderle, A. Grübl, K. Meier, E. Muller and J. Schemmel (2007) A Software Framework for Tuning the Dynamics of Neuromorphic Silicon Towards Biology. LNCS 4507. doi:10.1007/978-3-540-73007-1. [link]

• B. Kaplan, D. Brüderle, J. Schemmel and K. Meier (2009) High-Conductance States on a Neuromorphic Hardware System. Proceedings of IJCNN 2009. [link]

• D. Brüderle (2009) Neuroscientific Modeling with a Mixed-Signal VLSI Hardware System. Doctoral Dissertation, Kirchhoff-Institute for Physics, University of Heidelberg. [link]

• A. Davison, P. Yger, J. Kremkow, L. Perrinet and E. Muller (2007) PyNN: towards a universal neural simulator API in Python. BMC Neuroscience 2007, 8(Suppl 2):P2. doi:10.1186/1471-2202-8-S2-P2. [link]

• E. Muller, A. P. Davison, T. Brizzi, D. Bruederle, M. J. Eppler, J. Kremkow, D. Pecevski, L. Perrinet, M. Schmuker and P. Yger (2009) NeuralEnsemble.Org: Unifying neural simulators in Python to ease the model complexity bottleneck. Frontiers in Neuroinformatics Conference Abstract: Neuroinformatics 2009. doi: 10.3389/conf.neuro.11.2009.08.104. [link]


CHAPTER 16

Contributors and licence

The following people have contributed to PyNN. Their affiliations at the time of the contributions are shown below.

• Andrew Davison [1]

• Pierre Yger [1, 9]

• Eilif Muller [7, 13]

• Jens Kremkow [5, 6]

• Daniel Brüderle [2]

• Laurent Perrinet [6]

• Jochen Eppler [3, 4]

• Dejan Pecevski [8]

• Nicolas Debeissat [10]

• Mikael Djurfeldt [12, 15]

• Michael Schmuker [11]

• Bernhard Kaplan [2]

• Thomas Natschlaeger [8]

• Subhasis Ray [14]

• Yury Zaytsev [16]

• Jan Antolik [1]

• Alexandre Gravier

• Thomas Close [17]

• Oliver Breitwieser [2]

• Jannis Schücker [16]

• Maximilian Schmidt [16]


• Christian Roessert [13]

• Shailesh Appukuttan [1]

• Elodie Legouée [1]

• Joffrey Gonin [1]

• Ankur Sinha [18]

1. Unité de Neuroscience, Information et Complexité, CNRS, Gif sur Yvette, France

2. Kirchhoff Institute for Physics, University of Heidelberg, Heidelberg, Germany

3. Honda Research Institute Europe GmbH, Offenbach, Germany

4. Bernstein Center for Computational Neuroscience, Albert-Ludwigs-University, Freiburg, Germany

5. Neurobiology and Biophysics, Institute of Biology III, Albert-Ludwigs-University, Freiburg, Germany

6. Institut de Neurosciences Cognitives de la Méditerranée, CNRS, Marseille, France

7. Laboratory of Computational Neuroscience, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland

8. Institute for Theoretical Computer Science, Graz University of Technology, Graz, Austria

9. Department of Bioengineering, Imperial College London

10. INRIA, Sophia Antipolis, France

11. Neuroinformatics & Theoretical Neuroscience, Freie Universität Berlin, Berlin, Germany

12. International Neuroinformatics Coordinating Facility, Stockholm, Sweden

13. Blue Brain Project, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland

14. NCBS, Bangalore, India

15. PDC, KTH, Stockholm, Sweden

16. Institute of Neuroscience and Medicine (INM-6), Jülich Research Center, Jülich, Germany

17. Okinawa Institute of Science and Technology (OIST), Onna-son, Okinawa, Japan

18. Biocomputation group, University of Hertfordshire, Hatfield, United Kingdom.

16.1 Licence

PyNN is freely available under the CeCILL v2 license, which is equivalent to, and compatible with, the GNU GPL license, but conforms to French law (and is also perfectly suited to international projects) - see http://www.cecill.info/index.en.html for more information. The choice of GPL-equivalence was made to match the licenses of other widely-used simulation software in computational neuroscience, such as NEURON (GPL) and Brian (CeCILL).

If you are interested in using PyNN, but the choice of licence is a problem for you, please contact us to discuss dual-licensing.

LICENSE AGREEMENT

CeCILL FREE SOFTWARE LICENSE AGREEMENT

Notice

This Agreement is a Free Software license agreement that is the result of discussions between its authors in order to ensure compliance with the two main principles guiding its drafting:


• firstly, compliance with the principles governing the distribution of Free Software: access to source code, broad rights granted to users,

• secondly, the election of a governing law, French law, with which it is conformant, both as regards the law of torts and intellectual property law, and the protection that it offers to both authors and holders of the economic rights over software.

The authors of the CeCILL (for Ce[a] C[nrs] I[nria] L[ogiciel] L[ibre]) license are:

Commissariat à l’Energie Atomique - CEA, a public scientific, technical and industrial research establishment, having its principal place of business at 25 rue Leblanc, immeuble Le Ponant D, 75015 Paris, France.

Centre National de la Recherche Scientifique - CNRS, a public scientific and technological establishment, having its principal place of business at 3 rue Michel-Ange, 75794 Paris cedex 16, France.

Institut National de Recherche en Informatique et en Automatique - INRIA, a public scientific and technological establishment, having its principal place of business at Domaine de Voluceau, Rocquencourt, BP 105, 78153 Le Chesnay cedex, France.

Preamble

The purpose of this Free Software license agreement is to grant users the right to modify and redistribute the software governed by this license within the framework of an open source distribution model.

The exercising of these rights is conditional upon certain obligations for users so as to preserve this status for all subsequent redistributions.

In consideration of access to the source code and the rights to copy, modify and redistribute granted by the license, users are provided only with a limited warranty and the software’s author, the holder of the economic rights, and the successive licensors only have limited liability.

In this respect, the risks associated with loading, using, modifying and/or developing or reproducing the software by the user are brought to the user’s attention, given its Free Software status, which may make it complicated to use, with the result that its use is reserved for developers and experienced professionals having in-depth computer knowledge. Users are therefore encouraged to load and test the suitability of the software as regards their requirements in conditions enabling the security of their systems and/or data to be ensured and, more generally, to use and operate it in the same conditions of security. This Agreement may be freely reproduced and published, provided it is not altered, and that no provisions are either added or removed herefrom.

This Agreement may apply to any or all software for which the holder of the economic rights decides to submit the use thereof to its provisions.

Article 1 - DEFINITIONS

For the purpose of this Agreement, when the following expressions commence with a capital letter, they shall have the following meaning:

Agreement: means this license agreement, and its possible subsequent versions and annexes.

Software: means the software in its Object Code and/or Source Code form and, where applicable, its documentation, “as is” when the Licensee accepts the Agreement.

Initial Software: means the Software in its Source Code and possibly its Object Code form and, where applicable, its documentation, “as is” when it is first distributed under the terms and conditions of the Agreement.

Modified Software: means the Software modified by at least one Contribution.

Source Code: means all the Software’s instructions and program lines to which access is required so as to modify the Software.

Object Code: means the binary files originating from the compilation of the Source Code.

Holder: means the holder(s) of the economic rights over the Initial Software.


Licensee: means the Software user(s) having accepted the Agreement.

Contributor: means a Licensee having made at least one Contribution.

Licensor: means the Holder, or any other individual or legal entity, who distributes the Software under the Agreement.

Contribution: means any or all modifications, corrections, translations, adaptations and/or new functions integrated into the Software by any or all Contributors, as well as any or all Internal Modules.

Module: means a set of source files including their documentation that enables supplementary functions or services in addition to those offered by the Software.

External Module: means any or all Modules, not derived from the Software, so that this Module and the Software run in separate address spaces, with one calling the other when they are run.

Internal Module: means any or all Module, connected to the Software so that they both execute in the same address space.

GNU GPL: means the GNU General Public License version 2 or any subsequent version, as published by the Free Software Foundation Inc.

Parties: mean both the Licensee and the Licensor.

These expressions may be used both in singular and plural form.

Article 2 - PURPOSE

The purpose of the Agreement is the grant by the Licensor to the Licensee of a non-exclusive, transferable and worldwide license for the Software as set forth in Article 5 hereinafter for the whole term of the protection granted by the rights over said Software.

Article 3 - ACCEPTANCE

3.1 The Licensee shall be deemed as having accepted the terms and conditions of this Agreement upon the occurrence of the first of the following events:

• (i) loading the Software by any or all means, notably, by downloading from a remote server, or by loading from a physical medium;

• (ii) the first time the Licensee exercises any of the rights granted hereunder.

3.2 One copy of the Agreement, containing a notice relating to the characteristics of the Software, to the limited warranty, and to the fact that its use is restricted to experienced users has been provided to the Licensee prior to its acceptance as set forth in Article 3.1 hereinabove, and the Licensee hereby acknowledges that it has read and understood it.

Article 4 - EFFECTIVE DATE AND TERM

4.1 EFFECTIVE DATE

The Agreement shall become effective on the date when it is accepted by the Licensee as set forth in Article 3.1.

4.2 TERM

The Agreement shall remain in force for the entire legal term of protection of the economic rights over the Software.

Article 5 - SCOPE OF RIGHTS GRANTED

The Licensor hereby grants to the Licensee, who accepts, the following rights over the Software for any or all use, and for the term of the Agreement, on the basis of the terms and conditions set forth hereinafter.

Besides, if the Licensor owns or comes to own one or more patents protecting all or part of the functions of the Software or of its components, the Licensor undertakes not to enforce the rights granted by these patents against successive Licensees using, exploiting or modifying the Software. If these patents are transferred, the Licensor undertakes to have the transferees subscribe to the obligations set forth in this paragraph.


5.1 RIGHT OF USE

The Licensee is authorized to use the Software, without any limitation as to its fields of application, with it being hereinafter specified that this comprises:

1. permanent or temporary reproduction of all or part of the Software by any or all means and in any or all form.

2. loading, displaying, running, or storing the Software on any or all medium.

3. entitlement to observe, study or test its operation so as to determine the ideas and principles behind any or all constituent elements of said Software. This shall apply when the Licensee carries out any or all loading, displaying, running, transmission or storage operation as regards the Software, that it is entitled to carry out hereunder.

5.2 ENTITLEMENT TO MAKE CONTRIBUTIONS

The right to make Contributions includes the right to translate, adapt, arrange, or make any or all modifications to the Software, and the right to reproduce the resulting software.

The Licensee is authorized to make any or all Contributions to the Software provided that it includes an explicit notice that it is the author of said Contribution and indicates the date of the creation thereof.

5.3 RIGHT OF DISTRIBUTION

In particular, the right of distribution includes the right to publish, transmit and communicate the Software to the general public on any or all medium, and by any or all means, and the right to market, either in consideration of a fee, or free of charge, one or more copies of the Software by any means.

The Licensee is further authorized to distribute copies of the modified or unmodified Software to third parties according to the terms and conditions set forth hereinafter.

5.3.1 DISTRIBUTION OF SOFTWARE WITHOUT MODIFICATION

The Licensee is authorized to distribute true copies of the Software in Source Code or Object Code form, provided that said distribution complies with all the provisions of the Agreement and is accompanied by:

1. a copy of the Agreement,

2. a notice relating to the limitation of both the Licensor’s warranty and liability as set forth in Articles 8 and 9,

and that, in the event that only the Object Code of the Software is redistributed, the Licensee allows future Licensees unhindered access to the full Source Code of the Software by indicating how to access it, it being understood that the additional cost of acquiring the Source Code shall not exceed the cost of transferring the data.

5.3.2 DISTRIBUTION OF MODIFIED SOFTWARE

When the Licensee makes a Contribution to the Software, the terms and conditions for the distribution of the resulting Modified Software become subject to all the provisions of this Agreement.

The Licensee is authorized to distribute the Modified Software, in source code or object code form, provided that said distribution complies with all the provisions of the Agreement and is accompanied by:

1. a copy of the Agreement,

2. a notice relating to the limitation of both the Licensor’s warranty and liability as set forth in Articles 8 and 9,

and that, in the event that only the object code of the Modified Software is redistributed, the Licensee allows future Licensees unhindered access to the full source code of the Modified Software by indicating how to access it, it being understood that the additional cost of acquiring the source code shall not exceed the cost of transferring the data.

5.3.3 DISTRIBUTION OF EXTERNAL MODULES

When the Licensee has developed an External Module, the terms and conditions of this Agreement do not apply to said External Module, that may be distributed under a separate license agreement.

5.3.4 COMPATIBILITY WITH THE GNU GPL


The Licensee can include a code that is subject to the provisions of one of the versions of the GNU GPL in the Modified or unmodified Software, and distribute that entire code under the terms of the same version of the GNU GPL.

The Licensee can include the Modified or unmodified Software in a code that is subject to the provisions of one of the versions of the GNU GPL, and distribute that entire code under the terms of the same version of the GNU GPL.

Article 6 - INTELLECTUAL PROPERTY

6.1 OVER THE INITIAL SOFTWARE

The Holder owns the economic rights over the Initial Software. Any or all use of the Initial Software is subject to compliance with the terms and conditions under which the Holder has elected to distribute its work and no one shall be entitled to modify the terms and conditions for the distribution of said Initial Software.

The Holder undertakes that the Initial Software will remain ruled at least by this Agreement, for the duration set forth in Article 4.2.

6.2 OVER THE CONTRIBUTIONS

The Licensee who develops a Contribution is the owner of the intellectual property rights over this Contribution as defined by applicable law.

6.3 OVER THE EXTERNAL MODULES

The Licensee who develops an External Module is the owner of the intellectual property rights over this External Module as defined by applicable law and is free to choose the type of agreement that shall govern its distribution.

6.4 JOINT PROVISIONS

The Licensee expressly undertakes:

1. not to remove, or modify, in any manner, the intellectual property notices attached to the Software;

2. to reproduce said notices, in an identical manner, in the copies of the Software modified or not.

The Licensee undertakes not to directly or indirectly infringe the intellectual property rights of the Holder and/or Contributors on the Software and to take, where applicable, vis-à-vis its staff, any and all measures required to ensure respect of said intellectual property rights of the Holder and/or Contributors.

Article 7 - RELATED SERVICES

7.1 Under no circumstances shall the Agreement oblige the Licensor to provide technical assistance or maintenance services for the Software.

However, the Licensor is entitled to offer this type of services. The terms and conditions of such technical assistance, and/or such maintenance, shall be set forth in a separate instrument. Only the Licensor offering said maintenance and/or technical assistance services shall incur liability therefor.

7.2 Similarly, any Licensor is entitled to offer to its licensees, under its sole responsibility, a warranty, that shall only be binding upon itself, for the redistribution of the Software and/or the Modified Software, under terms and conditions that it is free to decide. Said warranty, and the financial terms and conditions of its application, shall be subject of a separate instrument executed between the Licensor and the Licensee.

Article 8 - LIABILITY

8.1 Subject to the provisions of Article 8.2, the Licensee shall be entitled to claim compensation for any direct loss it may have suffered from the Software as a result of a fault on the part of the relevant Licensor, subject to providing evidence thereof.

8.2 The Licensor’s liability is limited to the commitments made under this Agreement and shall not be incurred as a result of in particular: (i) loss due to the Licensee’s total or partial failure to fulfill its obligations, (ii) direct or consequential loss that is suffered by the Licensee due to the use or performance of the Software, and (iii) more generally, any consequential loss. In particular the Parties expressly agree that any or all pecuniary or business loss (i.e. loss of data, loss of profits, operating loss, loss of customers or orders, opportunity cost, any disturbance to business activities) or any or all legal proceedings instituted against the Licensee by a third party, shall constitute consequential loss and shall not provide entitlement to any or all compensation from the Licensor.

Article 9 - WARRANTY

9.1 The Licensee acknowledges that the scientific and technical state-of-the-art when the Software was distributed did not enable all possible uses to be tested and verified, nor for the presence of possible defects to be detected. In this respect, the Licensee’s attention has been drawn to the risks associated with loading, using, modifying and/or developing and reproducing the Software which are reserved for experienced users.

The Licensee shall be responsible for verifying, by any or all means, the suitability of the product for its requirements, its good working order, and for ensuring that it shall not cause damage to either persons or properties.

9.2 The Licensor hereby represents, in good faith, that it is entitled to grant all the rights over the Software (including in particular the rights set forth in Article 5).

9.3 The Licensee acknowledges that the Software is supplied “as is” by the Licensor without any other express or tacit warranty, other than that provided for in Article 9.2 and, in particular, without any warranty as to its commercial value, its secured, safe, innovative or relevant nature.

Specifically, the Licensor does not warrant that the Software is free from any error, that it will operate without interruption, that it will be compatible with the Licensee’s own equipment and software configuration, nor that it will meet the Licensee’s requirements.

9.4 The Licensor does not either expressly or tacitly warrant that the Software does not infringe any third party intellectual property right relating to a patent, software or any other property right. Therefore, the Licensor disclaims any and all liability towards the Licensee arising out of any or all proceedings for infringement that may be instituted in respect of the use, modification and redistribution of the Software. Nevertheless, should such proceedings be instituted against the Licensee, the Licensor shall provide it with technical and legal assistance for its defense. Such technical and legal assistance shall be decided on a case-by-case basis between the relevant Licensor and the Licensee pursuant to a memorandum of understanding. The Licensor disclaims any and all liability as regards the Licensee’s use of the name of the Software. No warranty is given as regards the existence of prior rights over the name of the Software or as regards the existence of a trademark.

Article 10 - TERMINATION

10.1 In the event of a breach by the Licensee of its obligations hereunder, the Licensor may automatically terminate this Agreement thirty (30) days after notice has been sent to the Licensee and has remained ineffective.

10.2 A Licensee whose Agreement is terminated shall no longer be authorized to use, modify or distribute the Software. However, any licenses that it may have granted prior to termination of the Agreement shall remain valid subject to their having been granted in compliance with the terms and conditions hereof.

Article 11 - MISCELLANEOUS

11.1 EXCUSABLE EVENTS

Neither Party shall be liable for any or all delay, or failure to perform the Agreement, that may be attributable to an event of force majeure, an act of God or an outside cause, such as defective functioning or interruptions of the electricity or telecommunications networks, network paralysis following a virus attack, intervention by government authorities, natural disasters, water damage, earthquakes, fire, explosions, strikes and labor unrest, war, etc.

11.2 Any failure by either Party, on one or more occasions, to invoke one or more of the provisions hereof, shall under no circumstances be interpreted as being a waiver by the interested Party of its right to invoke said provision(s) subsequently.

11.3 The Agreement cancels and replaces any or all previous agreements, whether written or oral, between the Parties and having the same purpose, and constitutes the entirety of the agreement between said Parties concerning said purpose. No supplement or modification to the terms and conditions hereof shall be effective as between the Parties unless it is made in writing and signed by their duly authorized representatives.


11.4 In the event that one or more of the provisions hereof were to conflict with a current or future applicable act or legislative text, said act or legislative text shall prevail, and the Parties shall make the necessary amendments so as to comply with said act or legislative text. All other provisions shall remain effective. Similarly, invalidity of a provision of the Agreement, for any reason whatsoever, shall not cause the Agreement as a whole to be invalid.

11.5 LANGUAGE

The Agreement is drafted in both French and English and both versions are deemed authentic.

Article 12 - NEW VERSIONS OF THE AGREEMENT

12.1 Any person is authorized to duplicate and distribute copies of this Agreement.

12.2 So as to ensure coherence, the wording of this Agreement is protected and may only be modified by the authors of the License, who reserve the right to periodically publish updates or new versions of the Agreement, each with a separate number. These subsequent versions may address new issues encountered by Free Software.

12.3 Any Software distributed under a given version of the Agreement may only be subsequently distributed under the same version of the Agreement or a subsequent version, subject to the provisions of Article 5.3.4.

Article 13 - GOVERNING LAW AND JURISDICTION

13.1 The Agreement is governed by French law. The Parties agree to endeavor to seek an amicable solution to any disagreements or disputes that may arise during the performance of the Agreement.

13.2 Failing an amicable solution within two (2) months as from their occurrence, and unless emergency proceedings are necessary, the disagreements or disputes shall be referred to the Paris Courts having jurisdiction, by the more diligent Party.

Version 2.0 dated 2006-09-05.


CHAPTER 17

Release notes

17.1 PyNN 0.9.3 release notes

December 4th 2018

Welcome to PyNN 0.9.3!

17.1.1 NEST 2.16.0

PyNN 0.9.3 now supports the latest version of NEST.

17.1.2 Array-valued parameters

The generalized integrate-and-fire model (GIF_cond_exp) was added in version 0.8.3. This model has multiple mechanisms, each with multiple time constants, e.g. tau_eta1, tau_eta2, tau_eta3. To simplify parameterisation of such models, we now allow array-valued parameters, specified as a tuple, e.g. instead of:

GIF_cond_exp(..., tau_eta1=1.0, tau_eta2=10.0, tau_eta3=100.0, ...)

we now write:

GIF_cond_exp(..., tau_eta=(1.0, 10.0, 100.0))

As for other parameter types, we can also specify inhomogeneous values across a population using lists of tuples, or generator functions.
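The mapping from tuples to per-neuron values can be illustrated in plain NumPy (this is an illustration of the idea only, not pyNN internals; the variable names are made up): a single tuple is shared by every neuron, while a list of tuples gives each neuron its own set of time constants.

```python
import numpy as np

# Illustration only (not pyNN internals): one tuple -> the same time
# constants for every neuron; a list of tuples -> one set per neuron.
homogeneous = (1.0, 10.0, 100.0)                # tau_eta, shared by all neurons
heterogeneous = [(1.0, 10.0, 100.0),            # tau_eta, one tuple per neuron
                 (2.0, 20.0, 200.0),
                 (3.0, 30.0, 300.0)]

shared = np.tile(homogeneous, (3, 1))           # broadcast to 3 neurons
per_neuron = np.array(heterogeneous)            # shape (n_neurons, n_components)
```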


17.1.3 Project governance and code of conduct

In an attempt to follow best practices in the governance of open source software projects, we have adopted some rules and guidelines concerning the rights and obligations of contributors and of maintainers, and of how we decide who will be a maintainer.

This includes a code of conduct for contributors and maintainers, aimed at fostering an open and welcoming environment.

17.1.4 Simplified use of random number generators

Previously, a random number generator with parallel_safe=False would always draw a reduced number of values when run with >1 MPI processes, according to the number of processes, unless the mask_local parameter was set to False.

Now, a mask must be explicitly provided if you want to draw a reduced number of values (i.e. only those values consumed on that node).

If provided, the mask parameter (renamed from mask_local) should be a boolean or integer NumPy array, indicating that only a subset of the random numbers should be returned.

Example:

rng.next(5, mask=np.array([True, False, True, False, True]))

or:

rng.next(5, mask=np.array([0, 2, 4]))

will each return only three values.

If the rng is “parallel safe”, an array of n values will be drawn from the rng, and the mask applied. If the rng is not parallel safe, the contents of the mask are disregarded; only its size (for an integer mask) or the number of True values (for a boolean mask) is used in determining how many values to draw.
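These two behaviours can be sketched in a few lines of plain NumPy. This is only an illustration of the semantics described above, not the actual pyNN.random implementation, and next_with_mask is a hypothetical helper:

```python
import numpy as np

def next_with_mask(rng, n, mask=None, parallel_safe=True):
    # Hypothetical sketch of the mask semantics, not pyNN.random itself.
    if mask is None:
        return rng.uniform(size=n)
    if parallel_safe:
        values = rng.uniform(size=n)   # draw all n values, then select
        return values[mask]            # boolean or integer fancy indexing
    # not parallel safe: only the *count* selected by the mask matters
    k = int(mask.sum()) if mask.dtype == bool else mask.size
    return rng.uniform(size=k)

rng = np.random.default_rng(42)
a = next_with_mask(rng, 5, np.array([True, False, True, False, True]))
b = next_with_mask(rng, 5, np.array([0, 2, 4]), parallel_safe=False)
# a and b each contain three values
```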

17.1.5 Support for NEURON “ARTIFICIAL_CELL” models

When using the NEURON simulator through PyNN, it is now possible to use “ARTIFICIAL_CELL” models, such as IntFire1, IntFire2 and IntFire4:

from pyNN.neuron import setup, Population, IntFire1

setup()
p1 = Population(10, IntFire1(tau=10.0, refrac=2.5))
p1.record('m')

17.1.6 Bug fixes and performance improvements

A number of bugs have been fixed, and some performance optimizations have been made.

17.2 PyNN 0.9.2 release notes

November 22nd 2017


Welcome to PyNN 0.9.2!

17.2.1 Recording injected currents

It is now possible to record the injected current from CurrentSource objects in PyNN, for example:

noise = sim.NoisyCurrentSource(mean=0.5, stdev=0.2, start=50.0, stop=450.0, dt=1.0)
noise.record()

sim.run(500.0)

signal = noise.get_data()

The returned signal object is a Neo AnalogSignal.

17.2.2 Python 2.6

As of this version, PyNN no longer supports Python 2.6.

17.2.3 NEST 2.14.0 and NEURON 7.5

PyNN 0.9.2 now supports the latest versions of NEST and NEURON. NEURON 7.4 is also still supported. NEST 2.12.0 should still work in most circumstances, but current recording (see above) requires a more recent version.

17.2.4 native_electrode_type

It has been possible for some time to use native (NEST-specific) neuron and synapse models with pyNN.nest. It is now also possible to use native current generator models, e.g.:

noise = sim.native_electrode_type('noise_generator')(mean=500.0, std=200.0,
                                                     start=50.0, stop=450.0, dt=1.0)

17.2.5 Bug fixes

A number of bugs have been fixed.

17.3 PyNN 0.9.1 release notes

May 4th 2017

Welcome to PyNN 0.9.1!

17.3.1 Stochastic synapses

This release adds three new standard synapse models, available for the NEST and NEURON simulators. They are:

• SimpleStochasticSynapse - each spike is transmitted with a probability p.


• StochasticTsodyksMarkramSynapse - synapse exhibiting facilitation and depression, implemented using the model of Tsodyks, Markram et al., in its stochastic version.

• MultiQuantalSynapse - synapse exhibiting facilitation and depression with multiple quantal release sites.

There are some new example scripts which demonstrate use of the synapse models - see Example of simple stochastic synapses and Example of facilitating and depressing synapses in deterministic and stochastic versions.

Note that the new models require building a NEST extension; this is done automatically during installation (when running pip install or setup.py install).
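Conceptually, the simplest of the three models acts like a Bernoulli gate on the spike train. A toy NumPy sketch of what SimpleStochasticSynapse does to a train of presynaptic spikes (an illustration of the idea, not the backend implementation, which applies the test per spike at run time):

```python
import numpy as np

# Toy model: each presynaptic spike is transmitted independently with
# probability p (illustration only, not the backend implementation).
rng = np.random.default_rng(0)
spike_times = np.arange(0.0, 1000.0, 10.0)   # 100 presynaptic spikes (ms)
p = 0.5
transmitted = spike_times[rng.uniform(size=spike_times.size) < p]
```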

17.3.2 Bug fixes

A number of bugs have been fixed.

17.4 PyNN 0.9.0 release notes

April 12th 2017

Welcome to PyNN 0.9.0!

This version of PyNN adopts the new, simplified Neo object model, first released as Neo 0.5.0, for the data structures returned by Population.get_data(). For more information on the new Neo API, see the release notes.

The main difference for a PyNN user is that the AnalogSignalArray class has been renamed to AnalogSignal, and similarly the Segment.analogsignalarrays attribute is now called Segment.analogsignals.

In addition, a number of bugs with current injection for the pyNN.brian module have been fixed.

17.5 PyNN 0.8.3 release notes

8th March 2017

Welcome to PyNN 0.8.3!

17.5.1 NeuroML 2

The neuroml module has been completely rewritten, and updated from NeuroML v1 to v2. This module works like other PyNN “backends”, i.e. import pyNN.neuroml as sim... but instead of running a simulation, it exports the network to an XML file in NeuroML format.

17.5.2 NEST 2.12

This release introduces support for NEST 2.12. Previous versions of NEST are no longer supported.

17.5.3 Other changes

• A couple of bug fixes


17.6 PyNN 0.8.2 release notes

6th December 2016

Welcome to PyNN 0.8.2!

17.6.1 New spike sources

Two new spike source models were added, with implementations for the NEST and NEURON backends: SpikeSourceGamma (spikes follow a gamma process) and SpikeSourcePoissonRefractory (inter-spike intervals are drawn from an exponential distribution as for a Poisson process, but there is a fixed refractory period after each spike during which no spike can occur).
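The statistics of the two processes can be sketched with NumPy. This is a rough illustration of the descriptions above, not the NEST/NEURON implementations; the function names are made up, and the refractory sketch assumes rate * tau_refrac < 1:

```python
import numpy as np

def gamma_spike_train(alpha, rate, t_stop, rng):
    """ISIs ~ Gamma(alpha, scale=1/(alpha*rate)), giving a mean rate `rate`."""
    n = int(2 * rate * t_stop) + 10              # draw more ISIs than needed
    isis = rng.gamma(alpha, 1.0 / (alpha * rate), size=n)
    times = np.cumsum(isis)
    return times[times < t_stop]

def poisson_refractory_train(rate, tau_refrac, t_stop, rng):
    """Exponential ISIs plus a fixed dead time (assumes rate * tau_refrac < 1)."""
    n = int(2 * rate * t_stop) + 10
    isis = tau_refrac + rng.exponential(1.0 / rate - tau_refrac, size=n)
    times = np.cumsum(isis)
    return times[times < t_stop]

rng = np.random.default_rng(1)
g = gamma_spike_train(alpha=2.0, rate=20.0, t_stop=10.0, rng=rng)   # times in s
p = poisson_refractory_train(rate=20.0, tau_refrac=0.002, t_stop=10.0, rng=rng)
```

Note that every inter-spike interval in the refractory train is at least tau_refrac, which is the defining difference from a plain Poisson process.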

17.6.2 Other changes

• Changed the save_positions() format from id x y z to index x y z to make it simulator independent.

• Added histograms to the utility.plotting module.

• Added a multiple_synapses flag to Projection.get(..., format="array") to control how synaptic parameters are combined when there are multiple connections between pairs of neurons. Until now, parameters were summed, which made sense for weights but not for delays. We have adopted the Brian approach of adding an argument multiple_synapses which is one of {'last', 'first', 'sum', 'min', 'max'}. The default is sum.

• Assorted bug fixes

17.7 PyNN 0.8.1 release notes

25th May 2016

Welcome to PyNN 0.8.1!

17.7.1 NEST 2.10

This release introduces support for NEST 2.10. Previous versions of NEST are no longer supported.

17.7.2 Other changes

• Assorted bug fixes

17.8 PyNN 0.8.0 release notes

October 5th 2015

Welcome to the final release of PyNN 0.8.0!

For PyNN 0.8 we have taken the opportunity to make significant, backward-incompatible changes to the API. The aim was fourfold:


• to simplify the API, making it more consistent and easier to remember;

• to make the API more powerful, so more complex models can be expressed with less code;

• to allow a number of internal simplifications so it is easier for new developers to contribute;

• to prepare for planned future extensions, notably support for multi-compartmental models.

We summarize here the main changes between versions 0.7 and 0.8 of the API.

17.8.1 Creating populations

In previous versions of PyNN, the Population constructor was called with the population size, a BaseCellType subclass such as IF_cond_exp and a dictionary of parameter values. For example:

p = Population(1000, IF_cond_exp, {'tau_m': 12.0, 'cm': 0.8}) # PyNN 0.7

This dictionary was passed to the cell-type class constructor within the Population constructor to create a cell-type instance.

The reason for doing this was that in early versions of PyNN, use of native NEST models was supported by passing a string, the model name, as the cell-type argument. Since PyNN 0.7, however, native models have been supported with the NativeCellType class, and passing a string is no longer allowed.

It makes more sense, therefore, for the cell-type instance to be created by the user, and to pass a cell-type instance, rather than a cell-type class, to the Population constructor.

There is also a second change: specification of parameters for cell-type classes is now done via keyword arguments rather than a single parameter dictionary. This is for consistency with current sources and synaptic plasticity models, which already use keyword arguments.

The example above should be rewritten as:

p = Population(1000, IF_cond_exp(tau_m=12.0, cm=0.8)) # PyNN 0.8

or:

p = Population(1000, IF_cond_exp(**{'tau_m': 12.0, 'cm': 0.8})) # PyNN 0.8

or:

cell_type = IF_cond_exp(tau_m=12.0, cm=0.8)  # PyNN 0.8
p = Population(1000, cell_type)

The first form, with a separate parameter dictionary, is still supported for the time being, but is deprecated and may be removed in future versions.

17.8.2 Specifying heterogeneous parameter values

In previous versions of PyNN, the Population constructor supported setting parameters to either homogeneous values (all cells in the population have the same value) or random values. After construction, it was possible to change parameters using the Population.set(), Population.tset() (for topographic set - parameters were set using an array of the same size as the population) and Population.rset() (for random set) methods.

In PyNN 0.8, setting parameters is simpler and more consistent, in that both when constructing a cell type for use in the Population constructor (see above) and in the Population.set() method, parameter values can be any of the following:


• a single number - sets the same value for all cells in the Population;

• a RandomDistribution object - for each cell, sets a different random value drawn from the distribution;

• a list or 1D NumPy array of the same size as the Population;

• a function that takes a single integer argument; this function will be called with the index of every cell in the Population to return the parameter value for that cell.

See Model parameters and initial values for more details and examples.
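A toy sketch of how such a specification could be expanded to one value per cell (illustration only; expand_parameter is a made-up helper, not part of the pyNN API, and the RandomDistribution case is omitted):

```python
import numpy as np

def expand_parameter(value, n):
    # Made-up helper mirroring the rules above: scalar -> homogeneous,
    # sequence of length n -> used as-is, callable -> evaluated per index.
    if callable(value):
        return np.array([value(i) for i in range(n)])
    arr = np.atleast_1d(np.asarray(value, dtype=float))
    if arr.size == 1:
        return np.full(n, arr[0])
    if arr.size != n:
        raise ValueError("expected %d values, got %d" % (n, arr.size))
    return arr

homog = expand_parameter(20.0, 4)                    # same tau_m for all cells
by_index = expand_parameter(lambda i: 10.0 + i, 3)   # tau_m as function of index
```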

The call signature of the Population.set() method has also been changed; now parameters should be specified as keyword arguments. For example, instead of:

p.set({"tau_m": 20.0}) # PyNN 0.7

use:

p.set(tau_m=20.0) # PyNN 0.8

and instead of:

p.set({"tau_m": 20.0, "v_rest": -65}) # PyNN 0.7

use:

p.set(tau_m=20.0, v_rest=-65) # PyNN 0.8

Now that Population.set() accepts random distributions and arrays as arguments, the Population.tset() and Population.rset() methods are superfluous. As of version 0.8, their use is deprecated and they will be removed in the next version of PyNN. Their use can be replaced as follows:

p.tset("i_offset", arr)  # PyNN 0.7
p.set(i_offset=arr)      # PyNN 0.8

p.rset("tau_m", rand_distr)  # PyNN 0.7
p.set(tau_m=rand_distr)      # PyNN 0.8

Setting spike times

Where a single parameter value is already an array, e.g. spike times, this should be wrapped by a Sequence object. For example, to generate a different Poisson spike train for every neuron in a population of SpikeSourceArrays:

def generate_spike_times(i_range):
    return [Sequence(numpy.add.accumulate(numpy.random.exponential(10.0, size=10)))
            for i in i_range]

p = sim.Population(30, sim.SpikeSourceArray(spike_times=generate_spike_times))

Standardization of random distributions

Since its earliest versions PyNN has supported swapping in and out different random number generators, but until now there has been no standardization of these RNGs; for example the GSL random number library uses “gaussian” where NumPy uses “normal”. This limited the usefulness of this feature, especially for the NativeRNG class, which signals that random number generation should be passed down to the simulator backend rather than being performed at the Python level.


This has now been fixed. The names of random number distributions and of their parameters have now been standardized, based for the most part on the nomenclature used by Wikipedia. A quick example:

from pyNN.random import NumpyRNG, GSLRNG, RandomDistribution

rd1 = RandomDistribution('normal', mu=0.5, sigma=0.1, rng=NumpyRNG(seed=922843))
rd2 = RandomDistribution('normal', mu=0.5, sigma=0.1, rng=GSLRNG(seed=426482))

17.8.3 Recording

Previous versions of PyNN had three methods for recording from populations of neurons: record(), record_v() and record_gsyn(), for recording spikes, membrane potentials, and synaptic conductances, respectively. There was no official way to record any other state variables, for example the w variable from the adaptive-exponential integrate-and-fire model, or when using native, non-standard models, although there were workarounds.

In PyNN 0.8, we have replaced these three methods with a single record() method, which takes the variable to record as its first argument, e.g.:

p.record()       # PyNN 0.7
p.record_v()
p.record_gsyn()

becomes:

p.record('spikes')  # PyNN 0.8
p.record('v')
p.record(['gsyn_exc', 'gsyn_inh'])

Note that (1) you can now choose to record the excitatory and inhibitory synaptic conductances separately, (2) you can give a list of variables to record. For example, you can record all the variables for the EIF_cond_exp_isfa_ista model in a single command using:

p.record(['spikes', 'v', 'w', 'gsyn_exc', 'gsyn_inh']) # PyNN 0.8

Note that the record_v() and record_gsyn() methods still exist, but their use is deprecated, and they will be removed in the next version of PyNN.

A further change is that Population.record() has an optional sampling_interval argument, allowing recording at intervals larger than the integration time step.

See Recording spikes and state variables for more details.

17.8.4 Retrieving recorded data

Perhaps the biggest change in PyNN 0.8 is that handling of recorded data, whether retrieval as Python objects or saving to file, now uses the Neo package, which provides a common Python object model for neurophysiology data (whether real or simulated).

Using Neo provides several advantages:

• data objects contain essential metadata, such as units, sampling interval, etc.;

• data can be saved to any of the file formats supported by Neo, including HDF5 and Matlab files;

• it is easier to handle data when running multiple simulations with the same network (calling reset() between each one);


• it is possible to save multiple signals to a single file;

• better interoperability with other Python packages using Neo (for data analysis, etc.).

Note that Neo is based on NumPy, and most Neo data objects sub-class the NumPy ndarray class, so much of your data handling code should work exactly the same as before.

See Data handling for more details.

17.8.5 Creating connections

In previous versions of PyNN, synaptic weights and delays were specified on creation of the Connector object. If the synaptic weight had its own dynamics (whether short-term or spike-timing-dependent plasticity), the parameters for this were specified on creation of a SynapseDynamics object. In other words, specification of synaptic parameters was split across two different classes.

SynapseDynamics was also rather complex, and could have both a “fast” (for short-term synaptic depression and facilitation) and “slow” (for long-term plasticity) component, although most simulator backends did not support specifying both fast and slow components at the same time.

In PyNN 0.8, all synaptic parameters including weights and delays are given as arguments to a SynapseType subclass such as StaticSynapse or TsodyksMarkramSynapse. For example, instead of:

prj = Projection(p1, p2, AllToAllConnector(weights=0.05, delays=0.5)) # PyNN 0.7

you should now write:

prj = Projection(p1, p2, AllToAllConnector(),
                 StaticSynapse(weight=0.05, delay=0.5))  # PyNN 0.8

and instead of:

params = {'U': 0.04, 'tau_rec': 100.0, 'tau_facil': 1000.0}
facilitating = SynapseDynamics(fast=TsodyksMarkramMechanism(**params))  # PyNN 0.7
prj = Projection(p1, p2, FixedProbabilityConnector(p_connect=0.1, weights=0.01),
                 synapse_dynamics=facilitating)

the following:

params = {'U': 0.04, 'tau_rec': 100.0, 'tau_facil': 1000.0, 'weight': 0.01}
facilitating = TsodyksMarkramSynapse(**params)  # PyNN 0.8
prj = Projection(p1, p2, FixedProbabilityConnector(p_connect=0.1),
                 synapse_type=facilitating)

Note that “weights” and “delays” are now “weight” and “delay”. In addition, the “method” argument to Projection is now called “connector”, and the “target” argument is now “receptor_type”. The “rng” argument has been moved from Projection to Connector, and the “space” argument of Connector has been moved to Projection.

The ability to specify both short-term and long-term plasticity for a given connection type, in a simulator-independent way, has been removed, although in practice only the NEURON backend supported this. This functionality will be reintroduced in PyNN 0.9. If you need this in the meantime, a workaround for the NEURON backend is to use a NativeSynapseType mechanism - ask on the mailing list for guidance.

Finally, the parameterization of STDP models has been modified. The A_plus and A_minus parameters have been moved from the weight-dependence components to the timing-dependence components, since effectively they describe the shape of the STDP curve independently of how the weight change depends on the current weight.


17.8.6 Specifying heterogeneous synapse parameters

As for neuron parameters, synapse parameter values can now be any of the following:

• a single number - sets the same value for all connections in the Projection;

• a RandomDistribution object - for each connection, sets a different random value drawn from the distribution;

• a list or 1D NumPy array of the same size as the Projection (although this is not very useful for random networks, whose size may not be known in advance);

• a function that takes a single float argument; this function will be called with the distance between the pre- and post-synaptic cell to return the parameter value for that cell.
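For the distance-based case, the kind of function involved can be sketched with NumPy (the rule, positions and values here are purely illustrative):

```python
import numpy as np

def weight_from_distance(d):
    # Hypothetical rule: weights decay exponentially with distance (um)
    return 0.05 * np.exp(-d / 100.0)

pre_x = np.array([0.0, 50.0, 100.0])                  # 1D positions of pre cells
post_x = np.array([0.0, 200.0])                       # 1D positions of post cells
distances = np.abs(pre_x[:, None] - post_x[None, :])  # shape (n_pre, n_post)
weights = weight_from_distance(distances)
```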

17.8.7 Accessing, setting and saving properties of synaptic connections

In older versions of PyNN, the Projection class had a bunch of methods for working with synaptic parameters: getWeights(), setWeights(), randomizeWeights(), printWeights(), getDelays(), setDelays(), randomizeDelays(), printDelays(), getSynapseDynamics(), setSynapseDynamics(), randomizeSynapseDynamics(), and saveConnections().

These have been replaced by three methods: get(), set() and save(). The original methods still exist, but their use is deprecated and they will be removed in the next version of PyNN. You should update your code as follows:

prj.getWeights(format='list')                         # PyNN 0.7
prj.get('weight', format='list', with_address=False)  # PyNN 0.8

prj.randomizeDelays(rand_distr)  # PyNN 0.7
prj.set(delay=rand_distr)        # PyNN 0.8

prj.setSynapseDynamics('tau_rec', 50.0)  # PyNN 0.7
prj.set(tau_rec=50.0)                    # PyNN 0.8

prj.printWeights('exc_weights.txt', format='array')    # PyNN 0.7
prj.save('weight', 'exc_weights.txt', format='array')  # PyNN 0.8

prj.saveConnections('exc_conn.txt')             # PyNN 0.7
prj.save('all', 'exc_conn.txt', format='list')  # PyNN 0.8

Also note that all three new methods can operate on several parameters at a time:

weights, delays = prj.getWeights('array'), prj.getDelays('array')  # PyNN 0.7
weights, delays = prj.get(['weight', 'delay'], format='array')     # PyNN 0.8

prj.randomizeWeights(rand_distr); prj.setDelays(0.4)  # PyNN 0.7
prj.set(weight=rand_distr, delay=0.4)                 # PyNN 0.8

17.8.8 New and improved connectors

The library of Connector classes has been extended. The DistanceDependentProbabilityConnector (DDPC) has been generalized, resulting in the IndexBasedProbabilityConnector, with which the connection probability can be specified as any function of the indices i and j of the pre- and post-synaptic neurons within their populations. In addition, the distance expression for the DDPC can now be a callable object (such as a function) as well as a string expression.
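An index-based rule of the kind the IndexBasedProbabilityConnector accepts can be sketched like this (the probability function is illustrative, and the sampling shown is just one way to realize a connection matrix from it, not the connector's internals):

```python
import numpy as np

def p_connect(i, j):
    # Hypothetical rule: nearby indices connect with higher probability
    return np.exp(-np.abs(i - j) / 3.0)

m, n = 20, 20
i, j = np.meshgrid(np.arange(m), np.arange(n), indexing="ij")
prob = p_connect(i, j)                          # shape (m, n)
rng = np.random.default_rng(7)
connected = rng.uniform(size=(m, n)) < prob     # boolean connection matrix
```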


The ArrayConnector allows connections to be specified as an explicit boolean matrix, with shape (m, n), where m is the size of the presynaptic population and n that of the postsynaptic population.

The CloneConnector takes the connection matrix from an existing Projection and uses it to create a new Projection, with the option of changing the weights, delays, receptor type, etc.

The FromListConnector and FromFileConnector now support specifying any synaptic parameter (e.g. parameters of the synaptic plasticity rule), not just weight and delay.

The FixedNumberPostConnector now has an option with_replacement, which controls how the post-synaptic population is sampled, and affects the incidence of multiple connections between pairs of neurons (“multapses”).

We have added a version of CSAConnector for the NEST backend that passes down the CSA object to PyNEST's CGConnect() function. This greatly speeds up CSAConnector with NEST.

17.8.9 Simulation control

Two new functions for advancing a simulation have been added: run_for() and run_until(). run_for() is just an alias for run(). run_until() allows you to specify the absolute time at which a simulation should stop, rather than the increment of time. In addition, it is now possible to specify a call-back function that should be called at intervals during a run, e.g.:

>>> def report_time(t):
...     print("The time is %g" % t)
...     return t + 100.0
>>> run_until(300.0, callbacks=[report_time])
The time is 0
The time is 100
The time is 200
The time is 300

One potential use of this feature is to record synaptic weights during a simulation with synaptic plasticity.
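The callback protocol implied by the example above (each callback returns the time at which it next wants to be called) can be sketched in pure Python; run_with_callbacks is a stand-in for the real scheduling inside run_until(), not PyNN code:

```python
def run_with_callbacks(t_stop, callback, t_start=0.0):
    # Stand-in for the scheduler inside run_until(): invoke the callback,
    # then reschedule it at whatever time it returns, until t_stop is passed.
    called_at = []
    next_call = t_start
    while next_call <= t_stop:
        called_at.append(next_call)
        next_call = callback(next_call)
    return called_at

times = run_with_callbacks(300.0, lambda t: t + 100.0)
# times == [0.0, 100.0, 200.0, 300.0], matching the session above
```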

The default value of the min_delay argument to setup() is now “auto”, which means that the simulator should calculate the minimal synaptic delay itself. This change can lead to large speedups for NEST and NEURON code.

17.8.10 Simple plotting

We have added a small library to make it easy to produce simple plots of data recorded from a PyNN simulation. This is not intended for publication-quality or highly-customized plots, but for basic visualization.

For example:

from pyNN.utility.plotting import Figure, Panel

...

population.record('spikes')
population[0:2].record(('v', 'gsyn_exc'))

...

data = population.get_data().segments[0]

vm = data.filter(name="v")[0]
gsyn = data.filter(name="gsyn_exc")[0]


Figure(
    Panel(vm, ylabel="Membrane potential (mV)"),
    Panel(gsyn, ylabel="Synaptic conductance (uS)"),
    Panel(data.spiketrains, xlabel="Time (ms)", xticks=True)
).save("simulation_results.png")


17.8.11 Supported backends

PyNN 0.8.0 is compatible with NEST versions 2.6 to 2.8, NEURON versions 7.3 to 7.4, and Brian 1.4. Support for Brian 2 is planned for a future release.

Support for the PCSIM simulator has been dropped since the simulator appears to be no longer actively developed.

The default precision for the NEST backend has been changed to “off_grid”. This reflects the PyNN philosophy that defaults should prioritize accuracy and compatibility over performance. (We think performance is very important; it's just that any decision to risk compromising accuracy or interoperability should be made deliberately by the end user.)

The Izhikevich neuron model is now available for all backends.

17.8.12 Python compatibility

Support for Python 3 has been added (versions 3.3+). Support for Python versions 2.5 and earlier has been dropped.

17.8.13 Changes for developers

Other than the internal refactoring, the main change for developers is that we have switched from Subversion to Git. PyNN development now takes place at https://github.com/NeuralEnsemble/PyNN/ We are now taking advantage of the integration of GitHub with Travis CI, to automatically run the test suite whenever changes are pushed to GitHub.

17.9 PyNN 0.8.0 release candidate 1 release notes

August 19th 2015

Welcome to the first release candidate of PyNN 0.8.0!

For full information about what's new in PyNN 0.8, see the PyNN 0.8 alpha 1 release notes, PyNN 0.8 beta 1 release notes and PyNN 0.8 beta 2 release notes.

17.9.1 NEST 2.6

The main new feature in this release is support for NEST 2.6. Previous versions of NEST are no longer supported.

17.9.2 Other changes

• Travis CI now runs system tests as well as unit tests.

• Assorted bug fixes

17.10 PyNN 0.8 beta 2 release notes

January 6th 2015

Welcome to the second beta release of PyNN 0.8!

For full information about what's new in PyNN 0.8, see the PyNN 0.8 alpha 1 release notes and PyNN 0.8 beta 1 release notes.


17.10.1 NEST 2.4

The main new feature in this release is support for NEST 2.4. Previous versions of NEST are no longer supported.

17.10.2 Python 3

With the rewrite of PyNEST in NEST 2.4, PyNN now has two backend simulators (NEURON being the other) that support Python 3. There was really no longer any excuse not to add Python 3 support to PyNN, which turned out to be very straightforward.

17.10.3 Standardization of random distributions

Since its earliest versions PyNN has supported swapping in and out different random number generators, but until now there has been no standardization of these RNGs; for example the GSL random number library uses “gaussian” where NumPy uses “normal”. This limited the usefulness of this feature, especially for the NativeRNG class, which signals that random number generation should be passed down to the simulator backend rather than being done at the Python level.

This has now, finally, been fixed. The names of random number distributions and of their parameters have now been standardized, based for the most part on the nomenclature used by Wikipedia. A quick example:

from pyNN.random import NumpyRNG, GSLRNG, RandomDistribution

rd1 = RandomDistribution('normal', mu=0.5, sigma=0.1, rng=NumpyRNG(seed=922843))
rd2 = RandomDistribution('normal', mu=0.5, sigma=0.1, rng=GSLRNG(seed=426482))

17.10.4 API changes

• Population.record() now has an optional sampling_interval argument, allowing recording at intervals larger than the integration time step.

• FixedNumberPostConnector now has an option with_replacement, which controls how the post-synaptic population is sampled, and affects the incidence of multiple connections between pairs of neurons (“multapses”).

• The default value of the min_delay argument to setup() is now “auto”, which means that the simulator should calculate the minimal synaptic delay itself. This change can lead to large speedups for NEST and NEURON code.

17.10.5 Other changes

• Reimplemented Izhikevich model for NEURON to allow current injection.

• Connectors that can generate multiple connections between a given pair of neurons (“multapses”) now work properly with the pyNN.nest backend.

• Added a version of CSAConnector for the NEST backend that passes down the CSA object to PyNEST’s CGConnect() function. This greatly speeds up CSAConnector with NEST.

• Added some new example scripts, deleted some of the more trivial, repetitive examples, and merged the several variants of the “VAbenchmarks” example into a single script.

• When data blocks from different MPI nodes are merged, the spike trains are now by default sorted by neuron ID. If this sorting proves to be too time-consuming we can in future expose sort/don’t sort as an option.


• Added the IF_cond_exp_gsfa_grr standard model (integrate-and-fire neuron with spike frequency adaptation and relative refractory period) to the Brian backend, and fixed the broken HH_cond_exp model.

• Improvements to callback handling.

• Assorted bug fixes.

17.11 PyNN 0.8 beta 1 release notes

November 15th 2013

Welcome to the first beta release of PyNN 0.8!

For full information about what’s new in PyNN 0.8, see the PyNN 0.8 alpha 1 release notes.

17.11.1 Brian backend

The main new feature in this beta release is the reappearance of the Brian backend, updated to work with the 0.8 API. You will need version 1.4 of Brian. There are still some rough edges, but we encourage anyone who has used Brian with PyNN 0.7 to update their scripts now and give it a try.

17.11.2 New and improved connectors

The library of Connector classes has been extended. The DistanceDependentProbabilityConnector (DDPC) has been generalized, resulting in the IndexBasedProbabilityConnector, with which the connection probability can be specified as any function of the indices i and j of the pre- and post-synaptic neurons within their populations. In addition, the distance expression for the DDPC can now be a callable object (such as a function) as well as a string expression.

The ArrayConnector allows connections to be specified as an explicit boolean matrix, with shape (m, n), where m is the size of the presynaptic population and n that of the postsynaptic population.
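Both of these connectors can be thought of as different ways of generating an explicit list of (pre, post) index pairs. The following stand-alone sketch illustrates the semantics only; the helper functions are invented for this example and are not part of the PyNN implementation:

```python
import random

def index_based_connections(p, m, n, rng):
    # Connect pre-neuron i to post-neuron j with probability p(i, j),
    # in the spirit of IndexBasedProbabilityConnector.
    return [(i, j) for i in range(m) for j in range(n) if rng.random() < p(i, j)]

def array_connections(matrix):
    # Read connections from an explicit boolean matrix of shape (m, n),
    # in the spirit of ArrayConnector.
    return [(i, j) for i, row in enumerate(matrix)
                   for j, flag in enumerate(row) if flag]

# A deterministic probability function: connect cells with the same index only.
pairs = index_based_connections(lambda i, j: 1.0 if i == j else 0.0,
                                3, 3, random.Random(42))  # [(0, 0), (1, 1), (2, 2)]

# m = 2 presynaptic, n = 3 postsynaptic neurons
pairs2 = array_connections([[True, False, True],
                            [False, True, False]])        # [(0, 0), (0, 2), (1, 1)]
```

With a genuinely probabilistic p(i, j), the result of course depends on the RNG state, which is why PyNN lets you pass in a seeded RNG.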

The CSAConnector has been updated to work with the 0.8 API.

The FromListConnector and FromFileConnector now support specifying any synaptic parameter (e.g. parameters of the synaptic plasticity rule), not just weight and delay.

17.11.3 API changes

The set() function now matches the Population.set() method, i.e. it takes one or more parameter name/value pairs as keyword arguments.

Two new functions for advancing a simulation have been added: run_for() and run_until(). run_for() is just an alias for run(). run_until() allows you to specify the absolute time at which a simulation should stop, rather than the increment of time. In addition, it is now possible to specify a callback function that should be called at intervals during a run, e.g.:

>>> def report_time(t):
...     print("The time is %g" % t)
...     return t + 100.0
>>> run_until(300.0, callbacks=[report_time])
The time is 0
The time is 100
The time is 200
The time is 300

One potential use of this feature is to record synaptic weights during a simulation with synaptic plasticity.

We have changed the parameterization of STDP models. The A_plus and A_minus parameters have been moved from the weight-dependence components to the timing-dependence components, since effectively they describe the shape of the STDP curve independently of how the weight change depends on the current weight.
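The rationale is visible in a pair-based additive STDP rule: A_plus, A_minus and the time constants determine the height and shape of the curve as a function of the spike-timing difference alone, while how the change scales with the current weight is a separate concern. The sketch below is illustrative only; the function and its default values are invented for this example and are not PyNN code:

```python
import math

def stdp_delta_w(delta_t, A_plus=0.01, A_minus=0.012,
                 tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair, delta_t = t_post - t_pre (ms).
    A_plus/A_minus and the time constants define the shape of the STDP curve
    (the timing dependence); scaling by the current weight would be handled
    by the separate weight-dependence component."""
    if delta_t > 0:    # pre before post: potentiation
        return A_plus * math.exp(-delta_t / tau_plus)
    elif delta_t < 0:  # post before pre: depression
        return -A_minus * math.exp(delta_t / tau_minus)
    return 0.0

dw_pot = stdp_delta_w(10.0)   # positive (potentiation)
dw_dep = stdp_delta_w(-10.0)  # negative (depression)
```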

17.11.4 Simple plotting

We have added a small library to make it easy to produce simple plots of data recorded from a PyNN simulation. This is not intended for publication-quality or highly customized plots, but for basic visualization.

For example:

from pyNN.utility.plotting import Figure, Panel

...

population.record('spikes')
population[0:2].record(('v', 'gsyn_exc'))

...

data = population.get_data().segments[0]

vm = data.filter(name="v")[0]
gsyn = data.filter(name="gsyn_exc")[0]

Figure(
    Panel(vm, ylabel="Membrane potential (mV)"),
    Panel(gsyn, ylabel="Synaptic conductance (uS)"),
    Panel(data.spiketrains, xlabel="Time (ms)", xticks=True)
).save("simulation_results.png")


17.11.5 Gap junctions

The NEURON backend now supports gap junctions. This is not yet an official part of the PyNN API, since any official feature must be supported by at least two backends, but could be very useful to modellers using NEURON.

17.11.6 Other changes

The default precision for the NEST backend has been changed to “off_grid”. This reflects the PyNN philosophy that defaults should prioritize accuracy and compatibility over performance. (We think performance is very important; it’s just that any decision to risk compromising accuracy or interoperability should be made deliberately by the end user.)

The Izhikevich neuron model is now available for the NEURON, NEST and Brian backends, although there are still some problems with injecting current when using the NEURON backend.

A whole bunch of bugs have been fixed: see the issue tracker.

17.11.7 For developers

We are now taking advantage of the integration of GitHub with Travis CI to automatically run the suite of unit tests whenever changes are pushed to GitHub. Note that this does not run the system tests or any other tests that require installation of a simulator backend.

17.12 PyNN 0.8 alpha 2 release notes

May 24th 2013

Welcome to the second alpha release of PyNN 0.8!

For full information about what’s new in PyNN 0.8, see the PyNN 0.8 alpha 1 release notes.

This second alpha is mostly just a bug-fix release, although we have added a new class, CloneConnector (thanks to Tom Close), which takes the connection matrix from an existing Projection and uses it to create a new Projection, with the option of changing the weights, delays, receptor type, etc.

The other big change for developers is that we have switched from Subversion to Git. PyNN development now takes place at https://github.com/NeuralEnsemble/PyNN/

17.13 PyNN 0.8 alpha 1 release notes

July 31st 2012

Welcome to the first alpha release of PyNN 0.8! This is the first time there has been an alpha or beta release of PyNN. In the past it hasn’t seemed necessary: at first because few people were using PyNN for their research, and those that were understood well that PyNN was in an early stage of development; more recently because most of the changes were either extensions to the API or due to internal refactoring.

For PyNN 0.8 we have taken the opportunity to make significant, backward-incompatible changes to the API. The aim was fourfold:

• to simplify the API, making it more consistent and easier to remember;

• to make the API more powerful, so more complex models can be expressed with less code;

• to allow a number of internal simplifications so it is easier for new developers to contribute;

• to prepare for planned future extensions, notably support for multi-compartmental models.

Since there have been so many changes, it seemed prudent to have a number of development releases before the final release of 0.8.0, to get as much testing from users as possible. There may be more alpha releases, and there will be at least one beta release.

This alpha release of PyNN is not intended for use in your research. If you have existing PyNN scripts, please install PyNN 0.8 alpha separately from your current PyNN installation (for example using virtualenv) and update your scripts, as outlined below, in a separate branch of your version control repository. If you find a bug, or if PyNN 0.8 alpha gives different results to PyNN 0.7, please let us know using the bug tracker or on the mailing list.


Warning: The first alpha release only supports NEURON and NEST. Support for Brian, PCSIM, NeuroML and MOOSE will be restored/added before the final 0.8.0 release.

17.13.1 Creating populations

In previous versions of PyNN, the Population constructor was called with the population size, a BaseCellType sub-class such as IF_cond_exp, and a dictionary of parameter values. For example:

p = Population(1000, IF_cond_exp, {'tau_m': 12.0, 'cm': 0.8})  # PyNN 0.7

This dictionary was passed to the cell-type class constructor within the Population constructor to create a cell-type instance.

The reason for doing this was that in early versions of PyNN, use of native NEST models was supported by passing a string, the model name, as the cell-type argument. Since PyNN 0.7, however, native models have been supported with the NativeCellType class, and passing a string is no longer allowed.

It makes more sense, therefore, for the cell-type instance to be created by the user, and to pass a cell-type instance, rather than a cell-type class, to the Population constructor.

There is also a second change: specification of parameters for cell-type classes is now done via keyword arguments rather than a single parameter dictionary. This is for consistency with current sources and synaptic plasticity models, which already use keyword arguments.

The example above should be rewritten as:

p = Population(1000, IF_cond_exp(tau_m=12.0, cm=0.8)) # PyNN 0.8

or:

p = Population(1000, IF_cond_exp(**{'tau_m': 12.0, 'cm': 0.8}))  # PyNN 0.8

or:

cell_type = IF_cond_exp(tau_m=12.0, cm=0.8)  # PyNN 0.8
p = Population(1000, cell_type)

The first form, with a separate parameter dictionary, is still supported for the time being, but is deprecated and may be removed in future versions.

17.13.2 Specifying heterogeneous parameter values

In previous versions of PyNN, the Population constructor supported setting parameters to either homogeneous values (all cells in the population have the same value) or random values. After construction, it was possible to change parameters using the Population.set(), Population.tset() (for topographic set: parameters were set using an array of the same size as the population) and Population.rset() (for random set) methods.

In PyNN 0.8, setting parameters is simpler and more consistent, in that both when constructing a cell type for use in the Population constructor (see above) and in the Population.set() method, parameter values can be any of the following:

• a single number - sets the same value for all cells in the Population;

• a RandomDistribution object - for each cell, sets a different random value drawn from the distribution;

• a list or 1D NumPy array of the same size as the Population;


• a function that takes a single integer argument; this function will be called with the index of every cell in the Population to return the parameter value for that cell.

See Model parameters and initial values for more details and examples.
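The logic of these four forms can be sketched in plain Python. This is illustrative only: `expand_parameter` is an invented helper, not part of the PyNN API, and an object with a `next()` method stands in for a RandomDistribution:

```python
def expand_parameter(value, n):
    """Expand any accepted parameter-value form into one value per cell."""
    if callable(value):                    # a function of the cell index
        return [value(i) for i in range(n)]
    if hasattr(value, "next"):             # stand-in for a RandomDistribution
        return [value.next() for _ in range(n)]
    if isinstance(value, (list, tuple)):   # explicit per-cell values
        if len(value) != n:
            raise ValueError("array must have the same size as the population")
        return list(value)
    return [value] * n                     # a single number: homogeneous

print(expand_parameter(20.0, 3))                # [20.0, 20.0, 20.0]
print(expand_parameter(lambda i: 10.0 + i, 3))  # [10.0, 11.0, 12.0]
```

Whatever form you pass, every cell ends up with a concrete value; the forms differ only in how that value is generated.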

The call signature of the Population.set() method has also been changed; now parameters should be specified as keyword arguments. For example, instead of:

p.set("tau_m", 20.0) # PyNN 0.7

use:

p.set(tau_m=20.0) # PyNN 0.8

and instead of:

p.set({"tau_m": 20.0, "v_rest": -65}) # PyNN 0.7

use:

p.set(tau_m=20.0, v_rest=-65) # PyNN 0.8

Now that Population.set() accepts random distributions and arrays as arguments, the Population.tset() and Population.rset() methods are superfluous. As of version 0.8 their use is deprecated, and they will probably be removed in the next version of PyNN. Their use can be replaced as follows:

p.tset("i_offset", arr)  # PyNN 0.7
p.set(i_offset=arr)  # PyNN 0.8

p.rset("tau_m", rand_distr)  # PyNN 0.7
p.set(tau_m=rand_distr)  # PyNN 0.8

Setting spike times

Where a single parameter value is already an array, e.g. spike times, this should be wrapped by a Sequence object. For example, to generate a different Poisson spike train for every neuron in a population of SpikeSourceArrays:

def generate_spike_times(i_range):
    return [Sequence(numpy.add.accumulate(numpy.random.exponential(10.0, size=10)))
            for i in i_range]

p = sim.Population(30, sim.SpikeSourceArray(spike_times=generate_spike_times))

17.13.3 Recording

Previous versions of PyNN had three methods for recording from populations of neurons: record(), record_v() and record_gsyn(), for recording spikes, membrane potentials, and synaptic conductances, respectively. There was no official way to record any other state variables, for example the w variable from the adaptive-exponential integrate-and-fire model, or when using native, non-standard models, although there were workarounds.

In PyNN 0.8, we have replaced these three methods with a single record() method, which takes the variable to record as its first argument, e.g.:

p.record()  # PyNN 0.7
p.record_v()
p.record_gsyn()


becomes:

p.record('spikes')  # PyNN 0.8
p.record('v')
p.record(['gsyn_exc', 'gsyn_inh'])

Note that (1) you can now choose to record the excitatory and inhibitory synaptic conductances separately, and (2) you can give a list of variables to record, so, for example, you can record all the variables for the EIF_cond_exp_isfa_ista model in a single command using:

p.record(['spikes', 'v', 'w', 'gsyn_exc', 'gsyn_inh']) # PyNN 0.8

Note that the record_v() and record_gsyn() methods still exist, but their use is deprecated, and they will be removed in the next version of PyNN.

See Recording spikes and state variables for more details.

17.13.4 Retrieving recorded data

Perhaps the biggest change in PyNN 0.8 is that handling of recorded data, whether retrieval as Python objects or saving to file, now uses the Neo package, which provides a common Python object model for neurophysiology data (whether real or simulated).

Using Neo provides several advantages:

• data objects contain essential metadata, such as units, sampling interval, etc.;

• data can be saved to any of the file formats supported by Neo, including HDF5 and Matlab files;

• it is easier to handle data when running multiple simulations with the same network (calling reset() between each one);

• it is possible to save multiple signals to a single file;

• better interoperability with other Python packages using Neo (for data analysis, etc.).

Note that Neo is based on NumPy, and most Neo data objects sub-class the NumPy ndarray class, so much of your data handling code should work exactly the same as before.

See Data handling for more details.

17.13.5 Creating connections

In previous versions of PyNN, synaptic weights and delays were specified on creation of the Connector object. If the synaptic weight had its own dynamics (whether short-term or spike-timing-dependent plasticity), the parameters for this were specified on creation of a SynapseDynamics object. In other words, specification of synaptic parameters was split across two different classes.

SynapseDynamics was also rather complex, and could have both a “fast” (for short-term synaptic depression and facilitation) and a “slow” (for long-term plasticity) component, although most simulator backends did not support specifying both fast and slow components at the same time.

In PyNN 0.8, all synaptic parameters including weights and delays are given as arguments to a SynapseType sub-class such as StaticSynapse or TsodyksMarkramSynapse. For example, instead of:

prj = Projection(p1, p2, AllToAllConnector(weights=0.05, delays=0.5)) # PyNN 0.7

you should now write:


prj = Projection(p1, p2, AllToAllConnector(),
                 StaticSynapse(weight=0.05, delay=0.5))  # PyNN 0.8

and instead of:

params = {'U': 0.04, 'tau_rec': 100.0, 'tau_facil': 1000.0}
facilitating = SynapseDynamics(fast=TsodyksMarkramMechanism(**params))  # PyNN 0.7
prj = Projection(p1, p2, FixedProbabilityConnector(p_connect=0.1, weights=0.01),
                 synapse_dynamics=facilitating)

the following:

params = {'U': 0.04, 'tau_rec': 100.0, 'tau_facil': 1000.0, 'weight': 0.01}
facilitating = TsodyksMarkramSynapse(**params)  # PyNN 0.8
prj = Projection(p1, p2, FixedProbabilityConnector(p_connect=0.1),
                 synapse_type=facilitating)

Note that “weights” and “delays” are now “weight” and “delay”. In addition, the “method” argument to Projection is now called “connector”, and the “target” argument is now “receptor_type”. The “rng” argument has been moved from Projection to Connector, and the “space” argument of Connector has been moved to Projection.

The ability to specify both short-term and long-term plasticity for a given connection type, in a simulator-independent way, has been removed, although in practice only the NEURON backend supported this. This functionality will be reintroduced in PyNN 0.9. If you need it in the meantime, a workaround for the NEURON backend is to use a NativeSynapseType mechanism; ask on the mailing list for guidance.

17.13.6 Specifying heterogeneous synapse parameters

As for neuron parameters, synapse parameter values can now be any of the following:

• a single number - sets the same value for all connections in the Projection;

• a RandomDistribution object - for each connection, sets a different random value drawn from the distribution;

• a list or 1D NumPy array of the same size as the Projection (although this is not very useful for random networks, whose size may not be known in advance);

• a function that takes a single float argument; this function will be called with the distance between the pre- and post-synaptic cells to return the parameter value for that connection.
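The distance-based form amounts to evaluating the supplied function once per connection. A stand-alone sketch of the semantics (the helper function and positions are invented for illustration, not PyNN code):

```python
import math

def distance_based_values(pre_positions, post_positions, fn):
    # One value per (pre, post) connection, computed from the Euclidean
    # distance between the positions of the two cells.
    return {(i, j): fn(math.dist(p, q))
            for i, p in enumerate(pre_positions)
            for j, q in enumerate(post_positions)}

# A single pre/post pair at distance 5.0, with an exponentially
# decaying weight profile.
weights = distance_based_values([(0.0, 0.0)], [(3.0, 4.0)],
                                lambda d: 0.05 * math.exp(-d / 50.0))
```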

17.13.7 Accessing, setting and saving properties of synaptic connections

In older versions of PyNN, the Projection class had a bunch of methods for working with synaptic parameters: getWeights(), setWeights(), randomizeWeights(), printWeights(), getDelays(), setDelays(), randomizeDelays(), printDelays(), getSynapseDynamics(), setSynapseDynamics(), randomizeSynapseDynamics(), and saveConnections().

These have been replaced by three methods: get(), set() and save(). The original methods still exist, but their use is deprecated and they will be removed in the next version of PyNN. You should update your code as follows:

prj.getWeights(format='list')  # PyNN 0.7
prj.get('weight', format='list', with_address=False)  # PyNN 0.8

prj.randomizeDelays(rand_distr)  # PyNN 0.7
prj.set(delay=rand_distr)  # PyNN 0.8

prj.setSynapseDynamics('tau_rec', 50.0)  # PyNN 0.7
prj.set(tau_rec=50.0)  # PyNN 0.8

prj.printWeights('exc_weights.txt', format='array')  # PyNN 0.7
prj.save('weight', 'exc_weights.txt', format='array')  # PyNN 0.8

prj.saveConnections('exc_conn.txt')  # PyNN 0.7
prj.save('all', 'exc_conn.txt', format='list')  # PyNN 0.8

Also note that all three new methods can operate on several parameters at a time:

weights, delays = prj.getWeights('array'), prj.getDelays('array')  # PyNN 0.7
weights, delays = prj.get(['weight', 'delay'], format='array')  # PyNN 0.8

prj.randomizeWeights(rand_distr); prj.setDelays(0.4)  # PyNN 0.7
prj.set(weight=rand_distr, delay=0.4)  # PyNN 0.8

17.13.8 Python compatibility

With an eye towards future support for Python 3, we have decided to drop support for Python versions 2.5 and earlier in PyNN 0.8.

17.14 PyNN 0.7 release notes

4th February 2011

This release sees a major extension of the API with the addition of the PopulationView and Assembly classes, which aim to make building large, structured networks much simpler and cleaner. A PopulationView allows a sub-set of the neurons from a Population to be encapsulated in an object. We call it a “view”, rather than a “sub-population”, to emphasize the fact that the neurons are not copied: they are the same neurons as in the parent Population, and any operations on either view or parent (setting parameter values, recording, etc.) will be reflected in the other. An Assembly is a list of Population and/or PopulationView objects, enabling multiple cell types to be encapsulated in a single object. PopulationView and Assembly objects behave in most ways like Population: you can record them, connect them using a Projection, and you can have views of views.
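The “view, not copy” semantics can be illustrated with a toy model in plain Python. The classes below are invented for this sketch and only mimic the idea that a view holds references to the parent’s cells rather than copies:

```python
class Cell:
    def __init__(self):
        self.params = {}

class Population:
    def __init__(self, n):
        self.cells = [Cell() for _ in range(n)]
    def __getitem__(self, indices):
        return PopulationView(self, indices)

class PopulationView:
    def __init__(self, parent, indices):
        # the same Cell objects as in the parent -- no copies are made
        self.cells = [parent.cells[i] for i in indices]
    def set(self, **params):
        for cell in self.cells:
            cell.params.update(params)

p = Population(5)
p[(0, 2)].set(tau_m=20.0)
print(p.cells[0].params)  # {'tau_m': 20.0} -- visible through the parent
print(p.cells[1].params)  # {} -- untouched
```

Because the view and the parent share the same cell objects, setting a parameter through either one is immediately visible through the other, which is exactly the behaviour described above.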

The “low-level API” (rechristened “procedural API”) has been reimplemented in terms of Population and Projection. For example, create() now returns a Population object rather than a list of IDs, and connect() returns a Projection object. This change should be almost invisible to the user, since Population now behaves very much like a list of IDs (it can be sliced, joined, etc.).

There has been a major change to cell addressing: Populations now always store cells in a one-dimensional array, which means cells no longer have an address but just an index. To specify the spatial structure of a Population, pass a Structure object to the constructor, e.g.:

p = Population((12,10), IF_cond_exp)

is now:

p = Population(120, IF_cond_exp, structure=Grid2D(1.2))

although the former syntax still works, for backwards compatibility. The reasons for doing this are:


1. we can now have more interesting structures than just grids

2. efficiency (less juggling addresses, flattening)

3. simplicity (less juggling addresses, less code).

The API for setting initial values has changed: this is now done via the initialize() function or the Population.initialize() method, rather than by having v_init and similar parameters for cell models.

17.14.1 Other API changes

• simplification of the record_X() methods. With the addition of the PopulationView class, the selection logic implemented by the record_from and rng arguments duplicated that in Population.__getitem__() and Population.sample(), so these arguments have been removed, and the record_X() methods now record all neurons within a Population, PopulationView or Assembly. Examples of syntax changes:

pop.record_v([pop[0], pop[17]]) --> pop[(0, 17)].record_v()
pop.record(10, rng=rng) --> pop.sample(10, rng).record()

• enhanced describe() methods: these can now use the Jinja2 or Cheetah templating engines to produce much nicer, better-formatted network descriptions.

• connections and neuron positions can now be saved to various binary formats as well as to text files.

• added some new connectors: SmallWorldConnector and CSAConnector (CSA = Connection Set Algebra)

• native neuron and synapse models are now supported using a NativeModelType subclass, rather than specified as strings. This simplifies the code internally and increases the range of PyNN functionality that can be used with native models (e.g. you can now record any variable from a native NEST or NEURON model). For NEST, there is a class factory native_cell_type(); for NEURON, the NativeModelType subclasses have to be written by hand.

17.14.2 Backend changes

• the NEST backend has been updated to work with NEST version 2.0.0.

• the Brian backend has seen extensive work on performance and on bringing it to feature parity with the other backends.

17.14.3 Details

• Where Population.initial_values() contains arrays, these arrays now consistently contain only enough values for local cells. Before, there was some inconsistency about how this was handled. We still need more tests to be sure it’s really working as expected.

• Allow override of default_maxstep for the NEURON backend as a setup() parameter. This is for the case where the user wants to add network connections across nodes after simulation start time.

• Discovered that when using NEST with mpi4py, you must import nest first and let it do the MPI initialization. The only time this seems to be a problem with PyNN is if a user imports pyNN.random before pyNN.nest. It would be nice to handle this more gracefully, but for now I’ve just added a test that NEST and mpi4py agree on the rank, and a hopefully useful error message.

• Added a new setup() option for pyNN.nest: recording_precision. By default, recording_precision is 3 for on-grid and 15 for off-grid.


• Partially fixed the pyNN.nest implementation of TsodyksMarkramMechanism (cf ticket:172). The ‘tsodyks_synapse’ model has a ‘tau_psc’ parameter, which should be set to the same value as the decay time constant of the post-synaptic current (which is a parameter of the neuron model). I consider this only a partial fix, because if ‘tau_syn_E’ or ‘tau_syn_I’ is changed after the creation of the Projection, ‘tau_psc’ will not be updated to match (unlike in the pyNN.neuron implementation). I’m also not sure how well it will work with native neuron models.

• reverted pyNN.nest to reading/resetting the current time from the kernel rather than keeping track of it within PyNN. NEST warns that this is dangerous, but all the tests pass, so let’s wait and see.

• In HH_cond_exp, conductances are now in µS, as for all other conductances in PyNN, instead of nS.

• NEURON now supports Tsodyks-Markram synapses for current-based exponential synapses (before it was only for conductance-based).

• NEURON backend now supports the IF_cond_exp_gsfa_grr model.

• Added a sample() method to Population, which returns a PopulationView of a random sample of the neurons in the parent population.

• Added the EIF_cond_exp/alpha_isfa/ista and HH_cond_exp standard models in Brian.

• Added a gather option to the Population.get() method.

• brian.setup() now accepts a number of additional arguments in extra_params. For example, extra_params={'useweave': True} will lead to inline C++ code generation.

• Wrote a first draft of a developers’ guide.

• Considerably extended the core.LazyArray class, as a basis for a possible rewrite of the connectors module.

• The random module now uses mpi4py to determine the MPI rank and num_processes, rather than receiving these as arguments to the RNG constructor (see ticket:164).

• Many fixes and performance enhancements for the brian module, which now supports synaptic plasticity.

• No more GSL warning every time! Just raise an Exception if we attempt to use GSLRNG and pygsl is not available.

• Added some more flexibility to init_logging(): logfile=None -> stderr, the format includes size & rank, and the user can override the log level.

• NEST __init__.py changed to query NEST for filling NEST_SYNAPSE_TYPES.

• Started to move synapse dynamics related stuff out of Projection and into the synapse dynamics-related classes, where it belongs.

• Added a new “spike_precision” option to nest.setup() (see http://neuralensemble.org/trac/PyNN/wiki/SimulatorSpecificOptions)

• Updated the NEST backend to work with version 2.0.0

• Rewrote the test suite, making a much cleaner distinction between unit tests, which now make heavy use of mock objects to better isolate components, and system tests. The test suite now runs with nose (https://nose.readthedocs.org/en/latest/), in order to facilitate continuous integration testing.

• Changed the format of connection files, as written by saveConnections() and read by FromFileConnector: files no longer contain the population label. Connections can now also be written to NumpyBinaryFile or PickleFile objects, instead of just text files. The same applies to Population.save_positions().

• Added CSAConnector, which wraps the Connection Set Algebra for use by PyNN. Requires the csa package: https://pypi.python.org/pypi/csa/


• Enhanced distance expressions by allowing expressions such as (d[0] < 0.1) & (d[1] < 0.2). Complex forms can therefore now be drawn, such as squares, ellipses, and so on.

• Added an n_connections flag to the DistanceDependentProbabilityConnector in order to be able to constrain the total number of connections. This can be useful for normalizations.

• Added a simple SmallWorldConnector. Cells are connected within a certain degree d. Then, all the connections are rewired with a probability given by a rewiring parameter, and new targets are uniformly selected among all the possible targets.

• Added a method to save cell positions to file.

• Added a progress bar to connectors. A verbose flag now controls whether a progress bar, indicating the percentage of connections established, is displayed.

• New implementation of the connector classes, with much improved performance and scaling with MPI, and extension of distance-dependent weights and delays to all connectors. In addition, a safe flag has been added to all connectors: it is on by default, and a user can turn it off to avoid tests on weights and delays.

• Added the ability to set the atol and rtol parameters of NEURON’s cvode solver in the extra_params argument of setup() (patch from Johannes Partzsch).

• Made pyNN.nest’s handling of the refractory period consistent with the other backends. Made the default refractory period 0.1 ms rather than 0.0 ms, since NEST appears not to handle a zero refractory period.

• Moved standard model (cells and synapses) machinery, the Space class, and the Error classes out of common into their own modules.
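The SmallWorldConnector procedure described above (connect within a degree, then rewire with some probability) resembles a Watts–Strogatz construction. A stand-alone sketch of that algorithm, for illustration only (not PyNN’s implementation; the function name and ring topology are assumptions of this example):

```python
import random

def small_world(n, degree, rewiring, rng):
    """Connect each cell on a ring to its `degree` nearest forward
    neighbours, then rewire each connection with probability `rewiring`
    to a uniformly chosen target."""
    conns = []
    for i in range(n):
        for offset in range(1, degree + 1):
            j = (i + offset) % n
            if rng.random() < rewiring:
                j = rng.randrange(n)  # new target chosen uniformly
            conns.append((i, j))
    return conns

# With rewiring probability 0 the result is a plain ring lattice.
conns = small_world(10, 2, 0.0, random.Random(1))
```

With rewiring between 0 and 1, a few long-range shortcuts appear while most local structure is kept, which is what gives small-world networks their short path lengths.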

17.14.4 Release 0.7.1

This bug-fix release added copyright statements to all files, together with some minor bug fixes.

17.14.5 Release 0.7.2

17.14.6 Release 0.7.3

17.14.7 Release 0.7.4

17.14.8 Release 0.7.5

17.15 PyNN 0.6 release notes

14th February 2010

Welcome to PyNN 0.6!

There have been three major changes to the API in this version.

• Spikes, membrane potential and synaptic conductances can now be saved to file in various binary formats. To do this, pass a PyNN File object to Population.print_X(), instead of a filename. There are various types of PyNN File object, defined in the recording.files module, e.g. StandardTextFile, PickleFile, NumpyBinaryFile, HDF5ArrayFile.

• Added a reset() function and made the behaviour of setup() consistent across simulators. reset() sets the simulation time to zero and sets membrane potentials to their initial values, but does not change the network structure. setup() destroys any previously defined network.


• The possibility of expressing distance-dependent weights and delays was extended to the AllToAllConnector and FixedProbabilityConnector classes. To reduce the number of arguments to the constructors, the arguments affecting the spatial topology (periodic boundary conditions, etc.) were moved to a new Space class, so that only a single Space instance need be passed to the Connector constructor.

17.15.1 Details

• Switched to using the point process-based AdExp mechanism in NEURON.

• Factored out most of the commonality between the Recorder classes of each backend into a parent class recording.Recorder, and tidied up the recording module.

• Added an attribute conductance_based to StandardCellType, to make the determination of synapse type for a given cell more robust.

• PyNN now uses a named logger, which makes it easier to control logging levels when using PyNN within a larger application.

• Implemented gather for Projection.saveConnections().

• Added a test script (test_mpi.py) to check whether serial and distributed simulations give the same results.

• Added a size() method to Projection, to give the total number of connections across all nodes (unlike __len__(), which gives only the connections on the local node).

• Speeded up record() by a huge factor (from 10 s for 12000 cells to less than 0.1 s) by removing an unnecessary conditional path (since all IDs now have an attribute "local").

• synapse_type is now passed to the ConnectionManager constructor, not to the connect() method, since (a) it is fixed for a given connection manager, (b) it is needed in other methods than just connect(). Fixed weight unit conversion in the brian module.

• Updated connection handling in the nest module to work with NEST version 1.9.8498. It will no longer work with previous NEST versions.

• The neuron back-end now supports having both static and Tsodyks-Markram synapses on the same neuron (previously, the T-M synapses replaced the static synapses), in agreement with nest and common sense. Thanks to Bartosz Telenczuk for reporting this.

• Added a compatible_output mode for the saveConnections() method. True by default, it allows connections to be reloaded from a file. If False, then the raw connections are stored, which makes for easier postprocessing.

• Added an ACSource current source to the nest module.

• Fixed Hoc build directory problem in setup.py - see ticket:147

• Population.get_v() and the other "get" methods now return cell indices (starting from 0) rather than cell IDs. This behaviour now matches that of Population.print_v(), etc. See ticket:119 if you think this is a bad idea.

• Moved the base Connector class from common to connectors. Put the distances() function inside a Space class, to allow more convenient specification of topology parameters.

• Projection.setWeights() and setDelays() now accept a 2D array argument (ref ticket:136), to be symmetric with getWeights() and getDelays(). For distributed simulations, each node only takes the values it needs from the array.

• FixedProbabilityConnector is now more strict, and checks that p_connect is less than 1 (see ticket:148). This makes no difference to the behaviour, but could act as a check for errors in user code.


• Fixed problem with changing SpikeSourcePoisson rate during a simulation (see ticket:152)

Note: This is the documentation for version 0.9.3. Earlier versions:

• version 0.8

• version 0.7 and earlier


CHAPTER 18

Developers’ Guide

18.1 Developers’ guide

This guide contains information about contributing to PyNN development, and aims to explain the overall architecture and some of the internal details of the PyNN codebase.

PyNN is open-source software, with a community-based development model: contributions from users are welcomed, and the direction that PyNN development should take in the future is determined by the needs of its users.

There are several ways to contribute to PyNN:

• reporting bugs, errors and other mistakes in the code or documentation;

• making suggestions for improvements;

• fixing bugs and other mistakes;

• adding or maintaining a simulator backend;

• major refactoring to improve performance, reduce code complexity, or both;

• becoming a maintainer.

The following sections contain guidelines for each of these.

18.1.1 Bug reports and feature requests

If you find a bug or would like to add a new feature to PyNN, please go to https://github.com/NeuralEnsemble/PyNN/issues/. First check that there is not an existing ticket for your bug or request, then click on "New issue" to create a new ticket (you will need a GitHub account, but creating one is simple and painless).


18.1.2 Contributing to PyNN

Mailing list

Discussions about PyNN take place in the NeuralEnsemble Google Group.

Setting up a development environment

We strongly suggest you work in a virtual environment, e.g. using virtualenv or Anaconda.

Requirements

In addition to the requirements listed in Installation, you will need to install:

• nose

• mock

• coverage

to run tests, and:

• Sphinx

• matplotlib

to build the documentation.

Code checkout

PyNN development is based around GitHub. Once you have a GitHub account, you should fork the official PyNN repository, and then clone your fork to your local machine:

$ git clone https://github.com/<username>/PyNN.git pyNN_dev

To work on the development version:

$ git checkout master

To work on the latest stable release (for bug-fixes):

$ git checkout --track origin/0.8

To keep your PyNN repository up-to-date with respect to the official repository, add it as a remote:

$ git remote add upstream https://github.com/NeuralEnsemble/PyNN.git

and then you can pull in any upstream changes:

$ git pull upstream master

To get PyNN onto your PYTHONPATH there are many options, such as:

• pip editable mode (pip install -e /path/to/PyNN)

• creating a symbolic link named pyNN from somewhere that is already on your PYTHONPATH, such as the site-packages directory, to the pyNN_trunk/pyNN directory.


If you are developing with NEURON, don't forget to compile the NMODL files in pyNN/neuron/nmodl by running nrnivmodl, and to recompile any time you modify any of them.

Coding style

We try to stay fairly close to PEP8. Please note in particular:

• indentation of four spaces, no tabs

• single space around most operators, but no space around the '=' sign when used to indicate a keyword argument or a default parameter value.

• some function/method names in PyNN use mixedCase, but these will gradually be deprecated and replaced with lower_case_with_underscores. Any new functions or methods should use the latter.

• we currently target versions 2.7 and 3.3-3.6

Testing

Running the PyNN test suite requires the nose, mock and nose-testconfig packages, and optionally the coverage package. To run the entire test suite, in the test subdirectory of the source tree:

$ nosetests

To see how well the codebase is covered by the tests, run:

$ nosetests --with-coverage --cover-package=pyNN --cover-erase --cover-html

There are currently two sorts of tests: unit tests, which aim to exercise small pieces of code such as individual functions and methods, and system tests, which aim to test that all the pieces of the system work together as expected.

If you add a new feature to PyNN, or fix a bug, you should write both unit and system tests.

Unit tests should where necessary make use of mock/fake/stub/dummy objects to isolate the component under test as well as possible. The pyNN.mock module is a complete mock simulator backend that may be used for this purpose. Except when testing a specific simulator interface, unit tests should be able to run without a simulator installed.

System tests should be written so that they can run with any of the simulators. The suggested way to do this is to write test functions, in a separate file, that take a simulator module as an argument, and then call these functions from test_neuron.py, test_nest.py, etc.

System tests defined in the scenarios directory are treated as a single test (test_scenarios()) while running nosetests. To run only the tests within a file named 'test_electrodes' located inside system/scenarios, use:

$ nosetests -s --tc=testFile:test_electrodes test_nest.py

To run a single specific test named 'test_changing_electrode' located within some file (and added to the registry) inside system/scenarios, use:

$ nosetests -s --tc=testName:test_changing_electrode test_nest.py

Note that this would also run the tests specified within the simulator-specific files such as test_brian.py, test_nest.py and test_neuron.py. To avoid this, specify the test_scenarios function on the command line:

$ nosetests -s --tc=testName:test_changing_electrode test_nest.py:test_scenarios


The test/unsorted directory contains a number of old tests that are either no longer useful or have not yet been adapted to the nose framework. These are not part of the test suite, but we are gradually adapting those tests that are useful and deleting the others.

Submitting code

The best way to get started with contributing code to PyNN is to fix a small bug (bugs marked "minor" in the bug tracker) in your checkout of the code. Once you are happy with your changes, run the test suite again to check that you have not introduced any new bugs. If this is your first contribution to the project, please add your name and affiliation/employer to AUTHORS.

After committing the changes to your local repository:

$ git commit -m 'informative commit message'

first pull in any changes from the upstream repository:

$ git pull upstream master

then push to your own account on GitHub:

$ git push

Now, via the GitHub web interface, open a pull request.

Documentation

PyNN documentation is generated using Sphinx.

To build the documentation in HTML format, run:

$ make html

in the doc subdirectory of the source tree. Many of the files contain examples of interactive Python sessions. The validity of this code can be tested by running:

$ make doctest

PyNN documentation is hosted at http://neuralensemble.org/docs/PyNN

Making a release

Making a release of PyNN requires permissions to upload PyNN packages to the Python Package Index, and to upload documentation to the neuralensemble.org server. If you are interested in becoming release manager for PyNN, please contact us via the mailing list.

When you think a release is ready, run through the following checklist one last time:

• do all the tests pass? This means running nosetests in test/unittests and test/system, and running make doctest in doc. You should do this on at least two Linux systems (one running a very recent version and one at least a year old) and on at least one version of Mac OS X. You should also do this with Python 2.7 and 3.4, 3.5 or 3.6.

• do all the example scripts generate the correct output? Run the run_all_examples.py script in examples/tools and then visually check the .png files generated in examples/tools/Results. Again, you should do this on at least two Linux systems and one Mac OS X system.


• does the documentation build without errors? You should then at least skim the generated HTML pages to check for obvious problems.

• have you updated the version numbers in setup.py, pyNN/__init__.py, doc/conf.py and doc/installation.txt?

• have you updated the changelog?

Once you’ve confirmed all the above, create a source package using:

$ python setup.py sdist

and check that it installs properly (you will find it in the dist subdirectory).

Now you should commit any changes, then tag with the release number as follows:

$ git tag x.y.z

where x.y.z is the release number. You should now upload the documentation to http://neuralensemble.org/docs/PyNN/ by running:

$ make zip

in the doc directory, and then unpacking the resulting archive on the NeuralEnsemble server.

If this is a development release (i.e. an alpha or beta), the final step is to upload the source package to the INCF Software Center. Do not upload development releases to PyPI.

To upload a package to the INCF Software Center, log in, and then go to the Contents tab. Click on "Add new…", then "File", then fill in the form and upload the source package.

If this is a final release, there are a few more steps:

• if it is a major release (i.e. an x.y.0 release), create a new bug-fix branch:

$ git branch x.y

• upload the source package to PyPI:

$ python setup.py sdist upload

• make an announcement on the mailing list

• if it is a major release, write a blog post about it with a focus on the new features and major changes

• go home, take a headache pill and lie down for a while in a darkened room (-;

18.1.3 Governance

PyNN is a community-developed project; we welcome contributions from anyone who is interested in the project. The project maintainers are the members of the PyNN Developers team. All contributors agree to abide by the Code of Conduct (see below).

Contributions

All contributions must be by pull request, with the exception of quick bug fixes affecting fewer than ten lines of code. Normally, pull requests may be approved by any maintainer, although anyone is welcome to join in the discussion. In case of disagreement with a decision, we will try to reach a consensus between maintainers, taking account of any input from the wider community. If consensus cannot be reached, decisions will be based on a majority vote


among the maintainers, with the caveats that (i) only one vote per institution is allowed (i.e. in the case where several maintainers belong to the same institution they will have to agree among themselves how to vote) and (ii) a quorum of three maintainers must be achieved.

Maintainers

Any contributor who has had at least three pull requests accepted may be nominated as a maintainer. Nominations must be approved by at least two existing maintainers, with no dissenting maintainer. In case of disagreement, decisions on accepting new maintainers will be based on a majority vote as above.

Contributor Code of Conduct

Our Pledge

In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation.

Our Standards

Examples of behavior that contributes to creating a positive environment include:

• Using welcoming and inclusive language

• Being respectful of differing viewpoints and experiences

• Gracefully accepting constructive criticism

• Focusing on what is best for the community

• Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

• The use of sexualized language or imagery and unwelcome sexual attention or advances

• Trolling, insulting/derogatory comments, and personal or political attacks

• Public or private harassment

• Publishing others’ private information, such as a physical or electronic address, without explicit permission

• Other conduct which could reasonably be considered inappropriate in a professional setting

Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.

Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.


Scope

This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.

Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at [email protected]. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.

Attribution

This Code of Conduct is adapted from the Contributor Covenant, version 1.4, available at http://contributor-covenant.org/version/1/4


CHAPTER 19

API reference

19.1 API reference

19.1.1 Populations, Views and Assemblies

Populations

Views (sub-populations)

Assemblies

19.1.2 Connectors

Base class

class Connector(safe=True, callback=None)
Base class for connectors.

All connector sub-classes have the following optional keyword arguments:

safe: if True, check that weights and delays have valid values. If False, this check is skipped.

callback: a function that will be called with the fractional progress of the connection routine. An example would be progress_bar.set_level.

connect(projection)

get_parameters()

describe(template='connector_default.txt', engine='default')
Returns a human-readable description of the connection method.

The output may be customized by specifying a different template together with an associated template engine (see pyNN.descriptions).
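The callback mechanism can be sketched without a simulator. The following example is illustrative only (report_progress and progress_log are not part of the PyNN API); it shows the kind of function that could be passed as callback:

```python
# Illustrative progress callback (not part of the PyNN API): a connector
# calls it repeatedly with the fractional progress, a float between 0 and 1.
progress_log = []

def report_progress(fraction):
    """Record the connection routine's progress as a percentage string."""
    progress_log.append("{:.0f}%".format(fraction * 100))

# Simulate a connection routine reporting progress in five steps.
for step in range(1, 6):
    report_progress(step / 5.0)

print(progress_log)
```

A real progress bar object would update a display instead of appending to a list.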


If template is None, then a dictionary containing the template context will be returned.

Built-in connectors

class AllToAllConnector(allow_self_connections=True, safe=True, callback=None)
Connects all cells in the presynaptic population to all cells in the postsynaptic population.

Takes any of the standard Connector optional arguments and, in addition:

allow_self_connections: if the connector is used to connect a Population to itself, this flag determines whether a neuron is allowed to connect to itself, or only to other neurons in the Population.

class OneToOneConnector(safe=True, callback=None)
Where the pre- and postsynaptic populations have the same size, connect cell i in the presynaptic population to cell i in the postsynaptic population for all i.

Takes any of the standard Connector optional arguments.

class FixedProbabilityConnector(p_connect, allow_self_connections=True, rng=None, safe=True, callback=None)

For each pair of pre-post cells, the connection probability is constant.

Takes any of the standard Connector optional arguments and, in addition:

p_connect: a float between zero and one. Each potential connection is created with this probability.

allow_self_connections: if the connector is used to connect a Population to itself, this flag determines whether a neuron is allowed to connect to itself, or only to other neurons in the Population.

rng: an RNG instance used to evaluate whether connections exist.

class FromListConnector(conn_list, column_names=None, safe=True, callback=None)
Make connections according to a list.

Arguments:

conn_list: a list of tuples, one tuple for each connection. Each tuple should contain: (pre_idx, post_idx, p1, p2, ..., pn) where pre_idx is the index (i.e. order in the Population, not the ID) of the presynaptic neuron, post_idx is the index of the postsynaptic neuron, and p1, p2, etc. are the synaptic parameters (e.g. weight, delay, plasticity parameters).

column_names: the names of the parameters p1, p2, etc. If not provided, it is assumed the parameters are 'weight', 'delay' (for backwards compatibility). This should be specified using a tuple.

safe: if True, check that weights and delays have valid values. If False, this check is skipped.

callback: if True, display a progress bar on the terminal.
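As a sketch of the format described above (the Projection call in the comment uses hypothetical population names), a connection list with the default 'weight' and 'delay' parameters might look like this:

```python
# Each tuple is (pre_idx, post_idx, weight, delay); the indices are positions
# within the Population, not cell IDs.
conn_list = [
    (0, 0, 0.05, 0.5),
    (0, 1, 0.05, 1.0),
    (1, 2, 0.02, 0.5),
]
# In a PyNN script this would then be used as (hypothetical `pre`, `post`):
#   projection = sim.Projection(pre, post, sim.FromListConnector(conn_list))
assert all(len(connection) == 4 for connection in conn_list)
print(len(conn_list))
```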

class FromFileConnector(file, distributed=False, safe=True, callback=None)
Make connections according to a list read from a file.

Arguments:

file: either an open file object or the filename of a file containing a list of connections, in the format required by FromListConnector. Column headers, if included in the file, must be specified using a list or tuple, e.g.:

# columns = ["i", "j", "weight", "delay", "U", "tau_rec"]

Note that the header requires # at the beginning of the line.


distributed: if this is True, then each node will read connections from a file called filename.x, where x is the MPI rank. This speeds up loading connections for distributed simulations.

safe: if True, check that weights and delays have valid values. If False, this check is skipped.

callback: if True, display a progress bar on the terminal.
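To illustrate the file format, the sketch below writes such a file and parses it back. This is an assumption about the format only (header line, then whitespace-separated columns), not the actual FromFileConnector implementation:

```python
import os
import tempfile

# Write a connection file: an optional '#' header naming the columns,
# then one connection per line in FromListConnector order.
content = "\n".join([
    '# columns = ["i", "j", "weight", "delay"]',
    "0 0 0.05 0.5",
    "0 1 0.05 1.0",
    "1 2 0.02 0.5",
]) + "\n"
path = os.path.join(tempfile.mkdtemp(), "connections.txt")
with open(path, "w") as f:
    f.write(content)

# Read it back, skipping the header, to recover the connection tuples.
connections = []
with open(path) as f:
    for line in f:
        if line.startswith("#"):
            continue
        i, j, weight, delay = line.split()
        connections.append((int(i), int(j), float(weight), float(delay)))

print(connections[0])
```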

class ArrayConnector(array, safe=True, callback=None)
Provide an explicit boolean connection matrix, with shape (m, n) where m is the size of the presynaptic population and n that of the postsynaptic population.
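A sketch of such a matrix, assuming NumPy is available (the ArrayConnector call in the comment shows the intended usage):

```python
import numpy as np

# Boolean connection matrix for 3 presynaptic and 4 postsynaptic cells:
# entry [i, j] is True if presynaptic cell i connects to postsynaptic cell j.
connections = np.array([
    [True,  False, False, True],
    [False, True,  False, False],
    [False, False, True,  True],
])
assert connections.shape == (3, 4)  # (m, n) as described above
# This matrix would then be passed as ArrayConnector(connections).
n_connections = int(connections.sum())
print(n_connections)
```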

class FixedNumberPreConnector(n, allow_self_connections=True, with_replacement=False, rng=None, safe=True, callback=None)

Each post-synaptic neuron is connected to exactly n pre-synaptic neurons chosen at random.

The sampling behaviour is controlled by the with_replacement argument.

"With replacement" means that each pre-synaptic neuron is chosen from the entire population. There is therefore always a possibility of multiple connections between a given pair of neurons.

"Without replacement" means that once a neuron has been selected, it cannot be selected again until the entire population has been selected. This means that if n is less than the size of the pre-synaptic population, there are no multiple connections. If n is greater than the size of the pre-synaptic population, all possible single connections are made before starting to add duplicate connections.

Takes any of the standard Connector optional arguments and, in addition:

n: either a positive integer, or a RandomDistribution that produces positive integers. If n is a RandomDistribution, then the number of pre-synaptic neurons is drawn from this distribution for each post-synaptic neuron.

with_replacement: if True, the selection of neurons to connect is made from the entire population. If False, once a neuron is selected it cannot be selected again until the entire population has been connected.

allow_self_connections: if the connector is used to connect a Population to itself, this flag determines whether a neuron is allowed to connect to itself, or only to other neurons in the Population.

rng: an RNG instance used to evaluate which potential connections are created.
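The "without replacement" semantics can be sketched in plain Python. This is an illustration of the sampling rule described above, not PyNN's implementation:

```python
import random

def sample_without_replacement(population_size, n, rng):
    """Choose n sources: exhaust all distinct cells before adding duplicates."""
    chosen = []
    while len(chosen) < n:
        block = list(range(population_size))
        rng.shuffle(block)
        chosen.extend(block[:n - len(chosen)])
    return chosen

rng = random.Random(42)
pre_sources = sample_without_replacement(5, 8, rng)  # n > population size

# The first 5 choices are all distinct; only then do duplicates appear.
assert len(set(pre_sources[:5])) == 5
print(len(pre_sources))
```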

class FixedNumberPostConnector(n, allow_self_connections=True, with_replacement=False, rng=None, safe=True, callback=None)

Each pre-synaptic neuron is connected to exactly n post-synaptic neurons chosen at random.

The sampling behaviour is controlled by the with_replacement argument.

"With replacement" means that each post-synaptic neuron is chosen from the entire population. There is therefore always a possibility of multiple connections between a given pair of neurons.

"Without replacement" means that once a neuron has been selected, it cannot be selected again until the entire population has been selected. This means that if n is less than the size of the post-synaptic population, there are no multiple connections. If n is greater than the size of the post-synaptic population, all possible single connections are made before starting to add duplicate connections.

Takes any of the standard Connector optional arguments and, in addition:

n: either a positive integer, or a RandomDistribution that produces positive integers. If n is a RandomDistribution, then the number of post-synaptic neurons is drawn from this distribution for each pre-synaptic neuron.

with_replacement: if True, the selection of neurons to connect is made from the entire population. If False, once a neuron is selected it cannot be selected again until the entire population has been connected.


allow_self_connections: if the connector is used to connect a Population to itself, this flag determines whether a neuron is allowed to connect to itself, or only to other neurons in the Population.

rng: an RNG instance used to evaluate which potential connections are created.

class FixedTotalNumberConnector(n, allow_self_connections=True, with_replacement=True, rng=None, safe=True, callback=None)

class DistanceDependentProbabilityConnector(d_expression, allow_self_connections=True, rng=None, safe=True, callback=None)

For each pair of pre-post cells, the connection probability depends on distance.

Takes any of the standard Connector optional arguments and, in addition:

d_expression: the right-hand side of a valid Python expression for probability, involving 'd', e.g. "exp(-abs(d))", or "d<3"

allow_self_connections: if the connector is used to connect a Population to itself, this flag determines whether a neuron is allowed to connect to itself, or only to other neurons in the Population.

rng: an RNG instance used to evaluate whether connections exist.
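How such an expression could be evaluated can be sketched with NumPy. This illustrates the semantics of d_expression only; it is not PyNN's internal code:

```python
import numpy as np

# Evaluate the example expression "exp(-abs(d))" over an array of distances.
d = np.array([0.0, 1.0, 2.0, 3.0])
p_connect = eval("exp(-abs(d))", {"exp": np.exp, "abs": np.abs, "d": d})

assert float(p_connect[0]) == 1.0      # zero distance: probability 1
assert np.all(np.diff(p_connect) < 0)  # probability decays with distance

# Boolean expressions such as "d<3" work the same way:
within = eval("d<3", {"d": d})
print(within.tolist())
```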

class IndexBasedProbabilityConnector(index_expression, allow_self_connections=True, rng=None, safe=True, callback=None)

For each pair of pre-post cells, the connection probability depends on an arbitrary function that takes the indices of the pre and post populations.

Takes any of the standard Connector optional arguments and, in addition:

index_expression: a function that takes the two cell indices as inputs and calculates the probability matrix from them.

allow_self_connections: if the connector is used to connect a Population to itself, this flag determines whether a neuron is allowed to connect to itself, or only to other neurons in the Population.

rng: an RNG instance used to evaluate whether connections exist.
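An index_expression of this kind can be sketched as follows; the distance-from-diagonal rule is purely illustrative:

```python
import numpy as np

def index_expression(i, j):
    """Connection probability from the pre (i) and post (j) indices."""
    return 1.0 / (1.0 + np.abs(i - j))

# The connector evaluates this over all index pairs; the same probability
# matrix can be built directly with numpy.fromfunction:
probabilities = np.fromfunction(index_expression, (4, 4))

assert probabilities[2, 2] == 1.0  # equal indices: probability 1
print(probabilities[0, 3])
```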

class DisplacementDependentProbabilityConnector(disp_function, allow_self_connections=True, rng=None, safe=True, callback=None)

class SmallWorldConnector(degree, rewiring, allow_self_connections=True, n_connections=None, rng=None, safe=True, callback=None)

Connect cells so as to create a small-world network.

Takes any of the standard Connector optional arguments and, in addition:

degree: the region length where nodes will be connected locally.

rewiring: the probability of rewiring each edge.

allow_self_connections: if the connector is used to connect a Population to itself, this flag determines whether a neuron is allowed to connect to itself, or only to other neurons in the Population.

n_connections: if specified, the number of efferent synaptic connections per neuron.

rng: an RNG instance used to evaluate which connections are created.
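The connect-locally-then-rewire scheme can be sketched in one dimension, using cell indices as positions. This is a simplified illustration of the idea, not PyNN's implementation:

```python
import random

rng = random.Random(1)
n_cells = 10
degree = 1      # connect each cell to neighbours within this index distance
rewiring = 0.1  # probability of rewiring each local edge

# Step 1: local connections within `degree`.
edges = [(i, j) for i in range(n_cells) for j in range(n_cells)
         if i != j and abs(i - j) <= degree]

# Step 2: rewire each edge with probability `rewiring`, choosing the new
# target uniformly among all other cells.
rewired = []
for i, j in edges:
    if rng.random() < rewiring:
        j = rng.choice([k for k in range(n_cells) if k != i])
    rewired.append((i, j))

assert len(rewired) == len(edges)  # rewiring preserves the edge count
print(len(edges))
```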

class CSAConnector(cset, safe=True, callback=None)
Use the Connection Set Algebra (Djurfeldt, 2012) to connect cells.


Takes any of the standard Connector optional arguments and, in addition:

cset: a connection set object.

class CloneConnector(reference_projection, safe=True, callback=None)
Connects cells with the same connectivity pattern as a previous projection.

19.1.3 Projections

19.1.4 Neuron models

PyNN provides a library of neuron models that have been standardized so as to give the same results (within certain limits of numerical accuracy) on different backends. Each model is represented by a "cell type" class.

It is also possible to use simulator-specific neuron models, which we call "native" cell types. Of course, such models will only work with one specific backend simulator.

Note: the development version has some support for specifying cell types using the NineML and NeuroML formats, but this is not yet available in the current release.

Standard cell types

• Plain integrate-and-fire models:

– IF_curr_exp

– IF_curr_alpha

– IF_cond_exp

– IF_cond_alpha

• Integrate-and-fire with adaptation:

– IF_cond_exp_gsfa_grr

– EIF_cond_alpha_isfa_ista

– EIF_cond_exp_isfa_ista

– Izhikevich

• Hodgkin-Huxley model

– HH_cond_exp

• Spike sources (input neurons)

– SpikeSourcePoisson

– SpikeSourceArray

– SpikeSourceInhGamma

Base class

All standard cell types inherit from the following base class, and have the same methods, as listed below.


class StandardCellType(**parameters)
Bases: pyNN.standardmodels.StandardModelType, pyNN.models.BaseCellType

Base class for standardized cell model classes.

get_schema()
Returns the model schema: i.e. a mapping of parameter names to allowed parameter types.

classmethod get_parameter_names()
Return the names of the parameters of this model.

get_native_names(*names)
Return a list of native parameter names for a given model.

classmethod has_parameter(name)
Does this model have a parameter with the given name?

translate(parameters, copy=True)
Translate standardized model parameters to simulator-specific parameters.

reverse_translate(native_parameters)
Translate simulator-specific model parameters to standardized parameters.

simple_parameters()
Return a list of parameters for which there is a one-to-one correspondence between standard and native parameter values.

scaled_parameters()
Return a list of parameters for which there is a unit change between standard and native parameter values.

computed_parameters()
Return a list of parameters whose values must be computed from more than one other parameter.

describe(template='modeltype_default.txt', engine='default')
Returns a human-readable description of the cell or synapse type.

The output may be customized by specifying a different template together with an associated template engine (see pyNN.descriptions).

If template is None, then a dictionary containing the template context will be returned.
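The relationship between translate() and reverse_translate() can be pictured with a toy translation table. The native names time_constant and V_th and the unit scale below are invented for illustration; real translations are defined per backend:

```python
# Toy standard -> native translation table: each standard parameter maps
# to a (native_name, scale) pair. A "simple" parameter has scale 1.0;
# a "scaled" parameter changes units (here by a hypothetical factor of 1000).
translations = {
    "tau_m":    ("time_constant", 1000.0),  # scaled parameter
    "v_thresh": ("V_th", 1.0),              # simple (one-to-one) parameter
}

def translate(parameters):
    """Standard names/units -> native names/units."""
    return {native: value * scale
            for name, value in parameters.items()
            for native, scale in [translations[name]]}

def reverse_translate(native_parameters):
    """Native names/units -> standard names/units (the inverse mapping)."""
    inverse = {native: (std, scale)
               for std, (native, scale) in translations.items()}
    return {std: value / scale
            for name, value in native_parameters.items()
            for std, scale in [inverse[name]]}
```

Computed parameters (values derived from several standard parameters at once) cannot be expressed as a single (name, scale) pair, which is why they are listed separately by computed_parameters().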

Simple integrate-and-fire neurons

class IF_cond_exp(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Leaky integrate and fire model with fixed threshold and exponentially-decaying post-synaptic conductance.

injectable = True

conductance_based = True

default_parameters = {'cm': 1.0, 'e_rev_E': 0.0, 'e_rev_I': -70.0, 'i_offset': 0.0, 'tau_m': 20.0, 'tau_refrac': 0.1, 'tau_syn_E': 5.0, 'tau_syn_I': 5.0, 'v_reset': -65.0, 'v_rest': -65.0, 'v_thresh': -50.0}

recordable = ['spikes', 'v', 'gsyn_exc', 'gsyn_inh']

default_initial_values = {'gsyn_exc': 0.0, 'gsyn_inh': 0.0, 'v': -65.0}

units = {'cm': 'nF', 'e_rev_E': 'mV', 'e_rev_I': 'mV', 'gsyn_exc': 'uS', 'gsyn_inh': 'uS', 'i_offset': 'nA', 'tau_m': 'ms', 'tau_refrac': 'ms', 'tau_syn_E': 'ms', 'tau_syn_I': 'ms', 'v': 'mV', 'v_reset': 'mV', 'v_rest': 'mV', 'v_thresh': 'mV'}

class IF_cond_alpha(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Leaky integrate and fire model with fixed threshold and alpha-function-shaped post-synaptic conductance.


injectable = True

conductance_based = True

default_parameters = {'cm': 1.0, 'e_rev_E': 0.0, 'e_rev_I': -70.0, 'i_offset': 0.0, 'tau_m': 20.0, 'tau_refrac': 0.1, 'tau_syn_E': 0.3, 'tau_syn_I': 0.5, 'v_reset': -65.0, 'v_rest': -65.0, 'v_thresh': -50.0}

recordable = ['spikes', 'v', 'gsyn_exc', 'gsyn_inh']

default_initial_values = {'gsyn_exc': 0.0, 'gsyn_inh': 0.0, 'v': -65.0}

units = {'cm': 'nF', 'e_rev_E': 'mV', 'e_rev_I': 'mV', 'gsyn_exc': 'uS', 'gsyn_inh': 'uS', 'i_offset': 'nA', 'tau_m': 'ms', 'tau_refrac': 'ms', 'tau_syn_E': 'ms', 'tau_syn_I': 'ms', 'v': 'mV', 'v_reset': 'mV', 'v_rest': 'mV', 'v_thresh': 'mV'}

class IF_curr_exp(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Leaky integrate and fire model with fixed threshold and decaying-exponential post-synaptic current. (Separate synaptic currents for excitatory and inhibitory synapses.)

injectable = True

conductance_based = False

default_parameters = {'cm': 1.0, 'i_offset': 0.0, 'tau_m': 20.0, 'tau_refrac': 0.1, 'tau_syn_E': 5.0, 'tau_syn_I': 5.0, 'v_reset': -65.0, 'v_rest': -65.0, 'v_thresh': -50.0}

recordable = ['spikes', 'v']

default_initial_values = {'isyn_exc': 0.0, 'isyn_inh': 0.0, 'v': -65.0}

units = {'cm': 'nF', 'i_offset': 'nA', 'isyn_exc': 'nA', 'isyn_inh': 'nA', 'tau_m': 'ms', 'tau_refrac': 'ms', 'tau_syn_E': 'ms', 'tau_syn_I': 'ms', 'v': 'mV', 'v_reset': 'mV', 'v_rest': 'mV', 'v_thresh': 'mV'}

class IF_curr_alpha(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Leaky integrate and fire model with fixed threshold and alpha-function-shaped post-synaptic current.

injectable = True

conductance_based = False

default_parameters = {'cm': 1.0, 'i_offset': 0.0, 'tau_m': 20.0, 'tau_refrac': 0.1, 'tau_syn_E': 0.5, 'tau_syn_I': 0.5, 'v_reset': -65.0, 'v_rest': -65.0, 'v_thresh': -50.0}

recordable = ['spikes', 'v']

default_initial_values = {'isyn_exc': 0.0, 'isyn_inh': 0.0, 'v': -65.0}

units = {'cm': 'nF', 'i_offset': 'nA', 'isyn_exc': 'nA', 'isyn_inh': 'nA', 'tau_m': 'ms', 'tau_refrac': 'ms', 'tau_syn_E': 'ms', 'tau_syn_I': 'ms', 'v': 'mV', 'v_reset': 'mV', 'v_rest': 'mV', 'v_thresh': 'mV'}

Integrate-and-fire neurons with adaptation

class Izhikevich(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Izhikevich spiking model with a quadratic non-linearity according to:

E. M. Izhikevich (2003), IEEE transactions on neural networks, 14(6)

dv/dt = 0.04*v^2 + 5*v + 140 - u + I
du/dt = a*(b*v - u)

Synapses are modeled as Dirac delta currents (voltage step), as in the original model.

NOTE: name should probably be changed to match standard nomenclature, e.g. QIF_cond_delta_etc_etc, although keeping “Izhikevich” as an alias would be good


injectable = True

conductance_based = False

default_parameters = {'a': 0.02, 'b': 0.2, 'c': -65.0, 'd': 2.0, 'i_offset': 0.0}

recordable = ['spikes', 'v', 'u']

voltage_based_synapses = True

default_initial_values = {'u': -14.0, 'v': -70.0}

units = {'a': '/ms', 'b': '/ms', 'c': 'mV', 'd': 'mV/ms', 'i_offset': 'nA', 'u': 'mV/ms', 'v': 'mV'}

class EIF_cond_exp_isfa_ista(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Exponential integrate and fire neuron with spike triggered and sub-threshold adaptation currents (isfa, ista resp.) according to:

Brette R and Gerstner W (2005) Adaptive Exponential Integrate-and-Fire Model as an Effective Description of Neuronal Activity. J Neurophysiol 94:3637-3642

See also: IF_cond_exp_gsfa_grr, EIF_cond_alpha_isfa_ista

injectable = True

conductance_based = True

default_parameters = {'a': 4.0, 'b': 0.0805, 'cm': 0.281, 'delta_T': 2.0, 'e_rev_E': 0.0, 'e_rev_I': -80.0, 'i_offset': 0.0, 'tau_m': 9.3667, 'tau_refrac': 0.1, 'tau_syn_E': 5.0, 'tau_syn_I': 5.0, 'tau_w': 144.0, 'v_reset': -70.6, 'v_rest': -70.6, 'v_spike': -40.0, 'v_thresh': -50.4}

recordable = ['spikes', 'v', 'w', 'gsyn_exc', 'gsyn_inh']

default_initial_values = {'gsyn_exc': 0.0, 'gsyn_inh': 0.0, 'v': -70.6, 'w': 0.0}

units = {'a': 'nS', 'b': 'nA', 'cm': 'nF', 'delta_T': 'mV', 'e_rev_E': 'mV', 'e_rev_I': 'mV', 'gsyn_exc': 'uS', 'gsyn_inh': 'uS', 'i_offset': 'nA', 'tau_m': 'ms', 'tau_refrac': 'ms', 'tau_syn_E': 'ms', 'tau_syn_I': 'ms', 'tau_w': 'ms', 'v': 'mV', 'v_reset': 'mV', 'v_rest': 'mV', 'v_spike': 'mV', 'v_thresh': 'mV', 'w': 'nA'}

class EIF_cond_alpha_isfa_ista(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Exponential integrate and fire neuron with spike triggered and sub-threshold adaptation currents (isfa, ista resp.) according to:

Brette R and Gerstner W (2005) Adaptive Exponential Integrate-and-Fire Model as an Effective Description of Neuronal Activity. J Neurophysiol 94:3637-3642

See also: IF_cond_exp_gsfa_grr, EIF_cond_exp_isfa_ista

injectable = True

conductance_based = True

default_parameters = {'a': 4.0, 'b': 0.0805, 'cm': 0.281, 'delta_T': 2.0, 'e_rev_E': 0.0, 'e_rev_I': -80.0, 'i_offset': 0.0, 'tau_m': 9.3667, 'tau_refrac': 0.1, 'tau_syn_E': 5.0, 'tau_syn_I': 5.0, 'tau_w': 144.0, 'v_reset': -70.6, 'v_rest': -70.6, 'v_spike': -40.0, 'v_thresh': -50.4}

recordable = ['spikes', 'v', 'w', 'gsyn_exc', 'gsyn_inh']

default_initial_values = {'gsyn_exc': 0.0, 'gsyn_inh': 0.0, 'v': -70.6, 'w': 0.0}

units = {'a': 'nS', 'b': 'nA', 'cm': 'nF', 'delta_T': 'mV', 'e_rev_E': 'mV', 'e_rev_I': 'mV', 'gsyn_exc': 'uS', 'gsyn_inh': 'uS', 'i_offset': 'nA', 'tau_m': 'ms', 'tau_refrac': 'ms', 'tau_syn_E': 'ms', 'tau_syn_I': 'ms', 'tau_w': 'ms', 'v': 'mV', 'v_reset': 'mV', 'v_rest': 'mV', 'v_spike': 'mV', 'v_thresh': 'mV', 'w': 'nA'}

class IF_cond_exp_gsfa_grr(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Linear leaky integrate and fire model with fixed threshold, decaying-exponential post-synaptic conductance, conductance-based spike-frequency adaptation, and a conductance-based relative refractory mechanism.


See: Muller et al (2007) Spike-frequency adapting neural ensembles: Beyond mean-adaptation and renewal theories. Neural Computation 19: 2958-3010.

See also: EIF_cond_alpha_isfa_ista

injectable = True

conductance_based = True

default_parameters = {'cm': 1.0, 'e_rev_E': 0.0, 'e_rev_I': -70.0, 'e_rev_rr': -75.0, 'e_rev_sfa': -75.0, 'i_offset': 0.0, 'q_rr': 3000.0, 'q_sfa': 15.0, 'tau_m': 20.0, 'tau_refrac': 0.1, 'tau_rr': 2.0, 'tau_sfa': 100.0, 'tau_syn_E': 5.0, 'tau_syn_I': 5.0, 'v_reset': -65.0, 'v_rest': -65.0, 'v_thresh': -50.0}

recordable = ['spikes', 'v', 'g_r', 'g_s', 'gsyn_exc', 'gsyn_inh']

default_initial_values = {'g_r': 0.0, 'g_s': 0.0, 'gsyn_exc': 0.0, 'gsyn_inh': 0.0, 'v': -65.0}

units = {'cm': 'nF', 'e_rev_E': 'mV', 'e_rev_I': 'mV', 'e_rev_rr': 'mV', 'e_rev_sfa': 'mV', 'g_r': 'nS', 'g_s': 'nS', 'gsyn_exc': 'uS', 'gsyn_inh': 'uS', 'i_offset': 'nA', 'q_rr': 'nS', 'q_sfa': 'nS', 'tau_m': 'ms', 'tau_refrac': 'ms', 'tau_rr': 'ms', 'tau_sfa': 'ms', 'tau_syn_E': 'ms', 'tau_syn_I': 'ms', 'v': 'mV', 'v_reset': 'mV', 'v_rest': 'mV', 'v_thresh': 'mV'}

Spike sources

class SpikeSourcePoisson(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Spike source, generating spikes according to a Poisson process.

injectable = False

conductance_based = True

default_parameters = {'duration': 10000000000.0, 'rate': 1.0, 'start': 0.0}

recordable = ['spikes']

receptor_types = ()

units = {'duration': 'ms', 'rate': 'Hz', 'start': 'ms'}

class SpikeSourceArray(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Spike source generating spikes at the times given in the spike_times array.

injectable = False

conductance_based = True

default_parameters = {'spike_times': Sequence([])}

recordable = ['spikes']

receptor_types = ()

units = {'spike_times': 'ms'}

class SpikeSourceInhGamma(**parameters)
Bases: pyNN.standardmodels.StandardCellType

Spike source, generating realizations of an inhomogeneous gamma process, employing the thinning method.

See: Muller et al (2007) Spike-frequency adapting neural ensembles: Beyond mean-adaptation and renewal theories. Neural Computation 19: 2958-3010.

injectable = False


conductance_based = True

default_parameters = {'a': Sequence([ 1.]), 'b': Sequence([ 1.]), 'duration': 10000000000.0, 'start': 0.0, 'tbins': Sequence([ 0.])}

recordable = ['spikes']

receptor_types = ()

units = {'a': 'dimensionless', 'b': 's', 'duration': 'ms', 'start': 'ms', 'tbins': 'ms'}

Native cell types

Todo: WRITE THIS PART

Utility functions

19.1.5 Synapse models

The synaptic connection between two neurons is represented as a “synapse type” class. Note that synaptic attributes that belong solely to the post-synaptic neuron, such as the decay of the post-synaptic conductance, are part of the cell type model. The “synapse type” models the synaptic delay, the synaptic weight, and any dynamic behaviour of the synaptic weight, i.e. synaptic plasticity.

As for cell types, PyNN has a library of “standard” synapse types that should give the same behaviour on different simulators, and also supports the use of “native” synapse types, limited to a single simulator.

Standard synapse types

Base class

All standard synapse types inherit from the following base class, and have the same methods, as listed below.

class StandardSynapseType(**parameters)
Bases: pyNN.standardmodels.StandardModelType, pyNN.models.BaseSynapseType

parameter_checks = {'delay': <function check_delays at 0x7fbc8a1a2398>, 'weight': <function check_weights at 0x7fbc8a193aa0>}

get_schema()
Returns the model schema: i.e. a mapping of parameter names to allowed parameter types.

computed_parameters()
Return a list of parameters whose values must be computed from more than one other parameter.

connection_type = None

default_initial_values = {}

default_parameters = {}

describe(template='modeltype_default.txt', engine='default')
Returns a human-readable description of the cell or synapse type.

The output may be customized by specifying a different template together with an associated template engine (see pyNN.descriptions).


If template is None, then a dictionary containing the template context will be returned.

extra_parameters = {}

get_native_names(*names)
Return a list of native parameter names for a given model.

classmethod get_parameter_names()
Return the names of the parameters of this model.

classmethod has_parameter(name)
Does this model have a parameter with the given name?

has_presynaptic_components = False

native_parameters
A ParameterSpace containing parameter names and values translated from the standard PyNN names and units to simulator-specific (“native”) names and units.

reverse_translate(native_parameters)
Translate simulator-specific model parameters to standardized parameters.

scaled_parameters()
Return a list of parameters for which there is a unit change between standard and native parameter values.

simple_parameters()
Return a list of parameters for which there is a one-to-one correspondence between standard and native parameter values.

translate(parameters, copy=True)
Translate standardized model parameters to simulator-specific parameters.

translations = {}

Static/fixed synapses

class StaticSynapse(**parameters)
Bases: pyNN.standardmodels.StandardSynapseType

Synaptic connection with fixed weight and delay.

default_parameters = {'delay': None, 'weight': 0.0}

Short-term plasticity mechanisms

class TsodyksMarkramSynapse(**parameters)
Bases: pyNN.standardmodels.StandardSynapseType

Synapse exhibiting facilitation and depression, implemented using the model of Tsodyks, Markram et al.:

Tsodyks, Uziel and Markram (2000) Synchrony Generation in Recurrent Networks with Frequency-Dependent Synapses. Journal of Neuroscience 20:RC50

Note that the time constant of the post-synaptic current is set in the neuron model, not here.

Arguments:

U: use parameter.

tau_rec: depression time constant (ms).

tau_facil: facilitation time constant (ms).


default_parameters = {'U': 0.5, 'delay': None, 'tau_facil': 0.0, 'tau_rec': 100.0, 'weight': 0.0}

default_initial_values = {'u': 0.0}
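The qualitative behaviour (depression through depletion of synaptic resources, facilitation through an activity-dependent use parameter) can be sketched with a simplified version of the model. tm_weights is illustrative, not the backend implementation, and formulations of the facilitation update vary in the literature:

```python
import math

def tm_weights(spike_times, U=0.5, tau_rec=100.0, tau_facil=0.0, w=1.0):
    """Effective synaptic weights for a presynaptic spike train under a
    simplified Tsodyks-Markram rule.

    x tracks the fraction of available resources (recovers towards 1 with
    time constant tau_rec); u tracks the utilization (decays towards U
    with tau_facil, incremented at each spike when facilitation is on)."""
    weights = []
    x, u, last = 1.0, U, None
    for t in spike_times:
        if last is not None:
            dt = t - last
            x = 1.0 + (x - 1.0) * math.exp(-dt / tau_rec)      # recovery
            if tau_facil > 0:
                u = U + (u - U) * math.exp(-dt / tau_facil)    # decay of facilitation
        weights.append(w * u * x)   # effective weight of this spike
        x -= u * x                  # resources consumed by this spike
        if tau_facil > 0:
            u += U * (1.0 - u)      # facilitation increment
        last = t
    return weights
```

With tau_facil = 0 (the default, pure depression) successive weights in a high-rate train shrink; with a small U and a long tau_rec, facilitation dominates and they grow.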

Long-term plasticity mechanisms

class STDPMechanism(timing_dependence=None, weight_dependence=None, voltage_dependence=None, dendritic_delay_fraction=1.0, weight=0.0, delay=None)

Bases: pyNN.standardmodels.StandardSynapseType

A specification for an STDP mechanism, combining a weight-dependence, a timing-dependence, and, optionally, a voltage-dependence of the synaptic change.

For point neurons, the synaptic delay d can be interpreted either as occurring purely in the pre-synaptic axon + synaptic cleft, in which case the synaptic plasticity mechanism ‘sees’ the post-synaptic spike immediately and the pre-synaptic spike after a delay d (dendritic_delay_fraction = 0), or as occurring purely in the post-synaptic dendrite, in which case the pre-synaptic spike is seen immediately, and the post-synaptic spike after a delay d (dendritic_delay_fraction = 1), or as having both pre- and post-synaptic components (dendritic_delay_fraction between 0 and 1).

In a future version of the API, we will allow the different components of the synaptic delay to be specified separately in milliseconds.
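The bookkeeping described above can be written out explicitly (an illustrative sketch, not backend code):

```python
def plasticity_spike_times(t_pre, t_post, d, f):
    """Times at which the plasticity mechanism, located at the synapse,
    sees the pre- and post-synaptic spikes, for a total delay d and
    dendritic_delay_fraction f (0 <= f <= 1).

    The axonal part of the delay acts on the pre-synaptic spike, the
    dendritic part on the post-synaptic spike."""
    axonal, dendritic = (1.0 - f) * d, f * d
    return t_pre + axonal, t_post + dendritic
```

For f = 0 the post-synaptic spike is seen immediately and the pre-synaptic spike after the full delay d, matching the first interpretation above; f = 1 gives the opposite.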

model

possible_models
A list of available synaptic plasticity models for the current configuration (weight dependence, timing dependence, . . . ) in the current simulator.

get_parameter_names()
Return the names of the parameters of this model.

has_parameter(name)
Does this model have a parameter with the given name?

get_schema()
Returns the model schema: i.e. a mapping of parameter names to allowed parameter types.

parameter_space

native_parameters
A dictionary containing the combination of parameters from the different components of the STDP model.

describe(template='stdpmechanism_default.txt', engine='default')
Returns a human-readable description of the STDP mechanism.

The output may be customized by specifying a different template together with an associated template engine (see pyNN.descriptions).

If template is None, then a dictionary containing the template context will be returned.

Weight-dependence components

class STDPWeightDependence(**parameters)
Bases: pyNN.standardmodels.StandardModelType

Base class for models of STDP weight dependence.


describe(template='modeltype_default.txt', engine='default')
Returns a human-readable description of the cell or synapse type.

The output may be customized by specifying a different template together with an associated template engine (see pyNN.descriptions).

If template is None, then a dictionary containing the template context will be returned.

class AdditiveWeightDependence(w_min=0.0, w_max=1.0)
Bases: pyNN.standardmodels.STDPWeightDependence

The amplitude of the weight change is independent of the current weight. If the new weight would be less than w_min it is set to w_min. If it would be greater than w_max it is set to w_max.

Arguments:

w_min: minimum synaptic weight, in the same units as the weight, i.e. µS or nA.

w_max: maximum synaptic weight.

default_parameters = {'w_max': 1.0, 'w_min': 0.0}
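A sketch of the resulting update rule (additive_update is illustrative, not part of the API):

```python
def additive_update(w, dw, w_min=0.0, w_max=1.0):
    """Additive weight update: the change dw is independent of the current
    weight w, and the result is clipped to the interval [w_min, w_max]."""
    return min(w_max, max(w_min, w + dw))
```

The multiplicative variants below differ only in that dw itself is scaled by the distance of w from the relevant bound before this clipping is applied.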

class MultiplicativeWeightDependence(w_min=0.0, w_max=1.0)
Bases: pyNN.standardmodels.STDPWeightDependence

The amplitude of the weight change depends on the current weight. For depression, ∆w ∝ w - w_min. For potentiation, ∆w ∝ w_max - w.

Arguments:

w_min: minimum synaptic weight, in the same units as the weight, i.e. µS or nA.

w_max: maximum synaptic weight.

default_parameters = {'w_max': 1.0, 'w_min': 0.0}

class AdditivePotentiationMultiplicativeDepression(w_min=0.0, w_max=1.0)
Bases: pyNN.standardmodels.STDPWeightDependence

The amplitude of the weight change depends on the current weight for depression (∆w ∝ w) and is fixed for potentiation.

Arguments:

w_min: minimum synaptic weight, in the same units as the weight, i.e. µS or nA.

w_max: maximum synaptic weight.

default_parameters = {'w_max': 1.0, 'w_min': 0.0}

class GutigWeightDependence(w_min=0.0, w_max=1.0, mu_plus=0.5, mu_minus=0.5)
Bases: pyNN.standardmodels.STDPWeightDependence

The amplitude of the weight change depends on (w_max-w)^mu_plus for potentiation and (w-w_min)^mu_minus for depression.

Arguments:

w_min: minimum synaptic weight, in the same units as the weight, i.e. µS or nA.

w_max: maximum synaptic weight.

mu_plus: see above

mu_minus: see above

default_parameters = {'mu_minus': 0.5, 'mu_plus': 0.5, 'w_max': 1.0, 'w_min': 0.0}


Timing-dependence components

class STDPTimingDependence(**parameters)
Bases: pyNN.standardmodels.StandardModelType

Base class for models of STDP timing dependence (triplets, etc)

class SpikePairRule(tau_plus=20.0, tau_minus=20.0, A_plus=0.01, A_minus=0.01)
Bases: pyNN.standardmodels.STDPTimingDependence

The amplitude of the weight change depends only on the relative timing of spike pairs, not triplets, etc. All possible spike pairs are taken into account (cf. Song and Abbott).

Arguments:

tau_plus: time constant of the positive part of the STDP curve, in milliseconds.

tau_minus: time constant of the negative part of the STDP curve, in milliseconds.

A_plus: amplitude of the positive part of the STDP curve.

A_minus: amplitude of the negative part of the STDP curve.

describe(template='modeltype_default.txt', engine='default')
Returns a human-readable description of the cell or synapse type.

The output may be customized by specifying a different template together with an associated template engine (see pyNN.descriptions).

If template is None, then a dictionary containing the template context will be returned.

default_parameters = {'A_minus': 0.01, 'A_plus': 0.01, 'tau_minus': 20.0, 'tau_plus': 20.0}
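For a single spike pair with ∆t = t_post - t_pre, the standard pair-based formulation implied by these parameters can be sketched as follows (illustrative, not the backend implementation):

```python
import math

def pair_rule_dw(delta_t, tau_plus=20.0, tau_minus=20.0,
                 A_plus=0.01, A_minus=0.01):
    """Weight change for one spike pair; delta_t = t_post - t_pre (ms).

    Positive delta_t (pre before post) potentiates with amplitude A_plus
    and time constant tau_plus; negative delta_t depresses with A_minus
    and tau_minus."""
    if delta_t > 0:
        return A_plus * math.exp(-delta_t / tau_plus)
    elif delta_t < 0:
        return -A_minus * math.exp(delta_t / tau_minus)
    return 0.0
```

Because all pairs are taken into account, the total change for a spike train is the sum of this expression over every pre/post pair.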

Native plasticity models

NineML plasticity models

19.1.6 Current sources

19.1.7 Simulation control

19.1.8 Random numbers

class RandomDistribution(distribution, parameters_pos=None, rng=None, **parameters_named)
Class which defines a next(n) method which returns an array of n random numbers from a given distribution.

Arguments:

distribution: the name of a random number distribution.

parameters_pos: parameters of the distribution, provided as a tuple. For the correct ordering, see random.available_distributions.

rng: if present, should be a NumpyRNG, GSLRNG or NativeRNG object.

parameters_named: parameters of the distribution, provided as keyword arguments.

Parameters may be provided either through parameters_pos or through parameters_named, but not both. All parameters must be provided; there are no default values. Parameter names are, in general, as used in Wikipedia.

Examples:


>>> rd = RandomDistribution('uniform', (-70, -50))
>>> rd = RandomDistribution('normal', mu=0.5, sigma=0.1)
>>> rng = NumpyRNG(seed=8658764)
>>> rd = RandomDistribution('gamma', k=2.0, theta=5.0, rng=rng)

Available distributions:

Name                        Parameters            Comments
binomial                    n, p
gamma                       k, theta
exponential                 beta
lognormal                   mu, sigma
normal                      mu, sigma
normal_clipped              mu, sigma, low, high  Values outside (low, high) are redrawn
normal_clipped_to_boundary  mu, sigma, low, high  Values below/above low/high are set to low/high
poisson                     lambda_               Trailing underscore since lambda is a Python keyword
uniform                     low, high
uniform_int                 low, high
vonmises                    mu, kappa
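The difference between normal_clipped and normal_clipped_to_boundary can be sketched in plain Python (illustrative; PyNN draws these from the wrapped RNGs):

```python
import random

def normal_clipped(rng, mu, sigma, low, high):
    """Values outside (low, high) are redrawn until they fall inside."""
    while True:
        x = rng.gauss(mu, sigma)
        if low <= x <= high:
            return x

def normal_clipped_to_boundary(rng, mu, sigma, low, high):
    """Values outside (low, high) are clamped to the nearest boundary."""
    return min(high, max(low, rng.gauss(mu, sigma)))
```

Redrawing preserves the shape of the truncated distribution, whereas clamping piles the out-of-range probability mass onto the two boundary values.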

next(n=None, mask=None)
Return n random numbers from the distribution.

lazily_evaluate(mask=None, shape=None)
Generate an array of random numbers of the requested shape.

If a mask is given, produce only enough numbers to fill the region defined by the mask (hence ‘lazily’).

This method is called by the lazyarray evaluate() and _partially_evaluate() methods.

class NumpyRNG(seed=None, parallel_safe=True)
Bases: pyNN.random.WrappedRNG

Wrapper for the numpy.random.RandomState class (Mersenne Twister PRNG).

translations = {'binomial': ('binomial', {'p': 'p', 'n': 'n'}), 'exponential': ('exponential', {'beta': 'scale'}), 'gamma': ('gamma', {'theta': 'scale', 'k': 'shape'}), 'lognormal': ('lognormal', {'mu': 'mean', 'sigma': 'sigma'}), 'normal': ('normal', {'mu': 'loc', 'sigma': 'scale'}), 'normal_clipped': ('normal_clipped', {'mu': 'mu', 'high': 'high', 'sigma': 'sigma', 'low': 'low'}), 'normal_clipped_to_boundary': ('normal_clipped_to_boundary', {'mu': 'mu', 'high': 'high', 'sigma': 'sigma', 'low': 'low'}), 'poisson': ('poisson', {'lambda_': 'lam'}), 'uniform': ('uniform', {'high': 'high', 'low': 'low'}), 'uniform_int': ('randint', {'high': 'high', 'low': 'low'}), 'vonmises': ('vonmises', {'mu': 'mu', 'kappa': 'kappa'})}

normal_clipped(mu=0.0, sigma=1.0, low=-inf, high=inf, size=None)

normal_clipped_to_boundary(mu=0.0, sigma=1.0, low=-inf, high=inf, size=None)

describe()

next(n=None, distribution=None, parameters=None, mask=None)
Return n random numbers from the specified distribution.

If:

• n is None, return a float,

• n >= 1, return a Numpy array,

• n < 0, raise an Exception,

• n is 0, return an empty array.


If called with distribution=None, returns uniformly distributed floats in the range [0, 1)

If mask is provided, it should be a boolean or integer NumPy array, indicating that only a subset of the random numbers should be returned.

Example:

rng.next(5, mask=np.array([True, False, True, False, True]))

or:

rng.next(5, mask=np.array([0, 2, 4]))

will each return only three values.

If the rng is “parallel safe”, an array of n values will be drawn from the rng, and the mask applied. If the rng is not parallel safe, the contents of the mask are disregarded; only its size (for an integer mask) or the number of True values (for a boolean mask) is used in determining how many values to draw.
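This mask behaviour can be sketched in plain Python (masked_draw is an illustrative helper, not PyNN code):

```python
import random

def masked_draw(rng, n, mask=None, parallel_safe=True):
    """Draw n uniform numbers and apply a boolean or integer index mask.

    If parallel_safe, all n numbers are drawn (so that every MPI process
    advances its RNG stream identically) and the mask then selects a
    subset. If not, only as many numbers as the mask selects are drawn."""
    if mask is None:
        return [rng.random() for _ in range(n)]
    boolean = all(isinstance(m, bool) for m in mask)
    if parallel_safe:
        values = [rng.random() for _ in range(n)]
        if boolean:
            return [v for v, m in zip(values, mask) if m]
        return [values[i] for i in mask]
    # not parallel safe: draw only the required count
    count = sum(mask) if boolean else len(mask)
    return [rng.random() for _ in range(count)]
```

With the same seed, a boolean mask and the equivalent integer mask select identical values, because the full stream of n numbers is drawn in both cases.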

class GSLRNG(seed=None, type='mt19937', parallel_safe=True)
Bases: pyNN.random.WrappedRNG

Wrapper for the GSL random number generators.

translations = {'binomial': ('binomial', {'p': 'p', 'n': 'n'}), 'exponential': ('exponential', {'beta': 'mu'}), 'gamma': ('gamma', {'theta': 'theta', 'k': 'k'}), 'lognormal': ('lognormal', {'mu': 'zeta', 'sigma': 'sigma'}), 'normal': ('normal', {'mu': 'mu', 'sigma': 'sigma'}), 'normal_clipped': ('normal_clipped', {'mu': 'mu', 'high': 'high', 'sigma': 'sigma', 'low': 'low'}), 'poisson': ('poisson', {'lambda_': 'mu'}), 'uniform': ('flat', {'high': 'b', 'low': 'a'}), 'uniform_int': ('uniform_int', {'high': 'high', 'low': 'low'})}

uniform_int(low, high, size=None)

gamma(k, theta, size=None)

normal(mu=0.0, sigma=1.0, size=None)

normal_clipped(mu=0.0, sigma=1.0, low=-inf, high=inf, size=None)

describe()

next(n=None, distribution=None, parameters=None, mask=None)
Return n random numbers from the specified distribution.

If:

• n is None, return a float,

• n >= 1, return a Numpy array,

• n < 0, raise an Exception,

• n is 0, return an empty array.

If called with distribution=None, returns uniformly distributed floats in the range [0, 1)

If mask is provided, it should be a boolean or integer NumPy array, indicating that only a subset of the random numbers should be returned.

Example:

rng.next(5, mask=np.array([True, False, True, False, True]))

or:

rng.next(5, mask=np.array([0, 2, 4]))

will each return only three values.


If the rng is “parallel safe”, an array of n values will be drawn from the rng, and the mask applied. If the rng is not parallel safe, the contents of the mask are disregarded; only its size (for an integer mask) or the number of True values (for a boolean mask) is used in determining how many values to draw.

class NativeRNG(seed=None)
Bases: pyNN.random.AbstractRNG

Signals that the simulator’s own native RNG should be used. Each simulator module should implement a class of the same name which inherits from this and which sets the seed appropriately.

next(n=None, distribution=None, parameters=None, mask=None)
Return n random numbers from the specified distribution.

If:

• n is None, return a float,

• n >= 1, return a Numpy array,

• n < 0, raise an Exception,

• n is 0, return an empty array.

If called with distribution=None, returns uniformly distributed floats in the range [0, 1)

If mask is provided, it should be a boolean or integer NumPy array, indicating that only a subset of the random numbers should be returned.

Example:

rng.next(5, mask=np.array([True, False, True, False, True]))

or:

rng.next(5, mask=np.array([0, 2, 4]))

will each return only three values.

If the rng is “parallel safe”, an array of n values will be drawn from the rng, and the mask applied. If the rng is not parallel safe, the contents of the mask are disregarded; only its size (for an integer mask) or the number of True values (for a boolean mask) is used in determining how many values to draw.

Adapting a different random number generator to work with PyNN

Todo: write this

19.1.9 Parameter handling

Note: these classes are not part of the PyNN API. They should not be used in PyNN scripts, they are intended for implementing backends. You are not required to use them when implementing your own backend, however, as long as your backend conforms to the API.

The main abstractions in PyNN are the population of neurons, and the set of connections (a ‘projection’) between two populations. Setting the parameters of individual neuron and synapse models, therefore, mainly takes place at the level of populations and projections.


Note: it is also possible to set the parameters of neurons and synapses individually, but this is generally less efficient.

Any model parameter in PyNN can be expressed as:

• a single value - all neurons in a population or synapses in a projection get the same value

• a RandomDistribution object - each element gets a value drawn from the random distribution

• a list/array of values of the same size as the population/projection

• a mapping function, which accepts either a single argument i (for a population) or two arguments (i, j) (for a projection) and returns a single value.

A “single value” is usually a single number, but for some parameters (e.g. for spike times) it may be a list/array of numbers.

To handle all these possibilities in a uniform way, and at the same time allow for efficient parallelization, in the ‘common’ implementation of the PyNN API all parameter values are converted into LazyArray objects, and the set of parameters for a model is contained in a dict-like object, ParameterSpace.

The LazyArray class

LazyArray is a PyNN-specific sub-class of a more general class, larray, and most of its functionality comes from the parent class. Full documentation for larray is available in the lazyarray package, but we give here a quick overview.

LazyArray has three important features in the context of PyNN:

1. any operations on the array (potentially including array construction) are not performed immediately, but are delayed until evaluation is specifically requested.

2. evaluation of only parts of the array is possible, which means that in a parallel simulation with MPI, all processes have the same LazyArray for a parameter, but on a given process, only the part of the array which is needed for the neurons/synapses that exist on that process need be evaluated.

3. since often all neurons in a population or synapses in a projection have the same value for a given parameter, a LazyArray created from a single value evaluates to that value: the full array is never created unless this is requested.

For example, suppose we have two parameters: tau_m, which is constant, and v_thresh, which varies according to the position of the neuron in the population.

>>> from pyNN.parameters import LazyArray
>>> tau_m = 2 * LazyArray(10.0, shape=(20,))
>>> v_thresh = -55 + LazyArray(lambda i: 0.1*i, shape=(20,))

If we evaluate tau_m we get a full, homogeneous array:

>>> tau_m.evaluate()
array([ 20., 20., 20., 20., 20., 20., 20., 20., 20., 20., 20.,
        20., 20., 20., 20., 20., 20., 20., 20., 20.])

but we could also have asked just for the single number, in which case the full array would never be created:

>>> tau_m.evaluate(simplify=True)
20.0

Similarly, we can evaluate v_thresh to get a normal NumPy array:


>>> v_thresh.evaluate()
array([-55. , -54.9, -54.8, -54.7, -54.6, -54.5, -54.4, -54.3, -54.2,
       -54.1, -54. , -53.9, -53.8, -53.7, -53.6, -53.5, -53.4, -53.3,
       -53.2, -53.1])

but we can also take, for example, only every fifth value, in which case the operation “add -55” only gets performed for those elements.

>>> v_thresh[::5]
array([-55. , -54.5, -54. , -53.5])

In this example the operation is very fast, but with slower operations (e.g. distance calculations) and large arrays, the time savings can be considerable (see lazyarray performance).

In summary, by using LazyArray, we can pass parameters around in an optimised way without having to worry about exactly what form the parameter value takes, hence avoiding a lot of logic at multiple points in the code.
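The deferral mechanism described above can be sketched in plain Python. This is a toy model for illustration only (the class name MiniLazy is hypothetical; the real implementation lives in the lazyarray package):

```python
class MiniLazy:
    """Toy sketch of lazyarray's deferral idea (not the real implementation).

    Operations are queued instead of executed; evaluation happens on demand,
    and a homogeneous (single-value) array can short-circuit to a scalar.
    """
    def __init__(self, base, shape):
        self.base = base          # a scalar or a function f(i)
        self.shape = shape
        self.operations = []      # queued elementwise operations

    def apply(self, f):
        self.operations.append(f)
        return self

    def _compute(self, i):
        x = self.base(i) if callable(self.base) else self.base
        for f in self.operations:
            x = f(x)
        return x

    def evaluate(self, simplify=False):
        if simplify and not callable(self.base):
            return self._compute(0)       # homogeneous: never build the array
        return [self._compute(i) for i in range(self.shape[0])]

# mirrors the tau_m / v_thresh example above
tau_m = MiniLazy(10.0, shape=(20,)).apply(lambda x: 2 * x)
v_thresh = MiniLazy(lambda i: 0.1 * i, shape=(20,)).apply(lambda x: x - 55)
```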

Reference

class LazyArray(value, shape=None, dtype=None)
    Bases: lazyarray.larray

Optimises storage of arrays in various ways:

• stores only a single value if all the values in the array are the same

• if the array is created from a RandomDistribution or a function f(i,j), then elements are only evaluated when they are accessed. Any operations performed on the array are also queued up to be executed on access.

The main intention of the latter is to save memory for very large arrays by accessing them one row or column at a time: the entire array need never be in memory.

Arguments:

value: may be an int, long, float, bool, NumPy array, iterator, generator or a function, f(i) or f(i,j), depending on the dimensions of the array. f(i,j) should return a single number when i and j are integers, and a 1D array when either i or j or both is a NumPy array (in the latter case the two arrays must have equal lengths).

shape: a tuple giving the shape of the array, or None

dtype: the NumPy dtype.

__getitem__(*args, **kwargs)
    Return one or more items from the array, as for NumPy arrays.

    addr may be a single integer, a slice, a NumPy boolean array or a NumPy integer array.

by_column(mask=None)
    Iterate over the columns of the array. Columns will be yielded either as a 1D array or as a single value (for a flat array).

    mask: either None or a boolean array indicating which columns should be included.

apply(f)
    Add the function f(x) to the list of the operations to be performed, where x will be a scalar or a numpy array.


>>> m = larray(4, shape=(2,2))
>>> m.apply(numpy.sqrt)
>>> m.evaluate()
array([[ 2., 2.],
       [ 2., 2.]])

check_bounds(*args, **kwargs)
    Check whether the given address is within the array bounds.

evaluate(*args, **kwargs)
    Return the lazy array as a real NumPy array.

    If the array is homogeneous and simplify is True, return a single numerical value.

is_homogeneous
    True if all the elements of the array are the same.

ncols
    Size of the second dimension (if it exists) of the array.

nrows
    Size of the first dimension of the array.

shape
    Shape of the array.

size
    Total number of elements in the array.

The ParameterSpace class

ParameterSpace is a dict-like class that contains LazyArray objects.

In addition to the usual dict methods, it has several methods that allow operations on all the lazy arrays within it at once. For example:

>>> from pyNN.parameters import ParameterSpace
>>> ps = ParameterSpace({'a': [2, 3, 5, 8], 'b': 7, 'c': lambda i: 3*i+2}, shape=(4,))
>>> ps['c']
<larray: base_value=<function <lambda> at ...> shape=(4,) dtype=None, operations=[]>
>>> ps.evaluate()
>>> ps['c']
array([ 2,  5,  8, 11])

The evaluate() method also accepts a mask, in order to evaluate only part of the lazy arrays:

>>> ps = ParameterSpace({'a': [2, 3, 5, 8, 13], 'b': 7, 'c': lambda i: 3*i+2},
...                     shape=(5,))
>>> ps.evaluate(mask=[1, 3, 4])
>>> ps.as_dict()
{'a': array([ 3,  8, 13]), 'c': array([ 5, 11, 14]), 'b': array([7, 7, 7])}

An example with two-dimensional arrays:

>>> ps2d = ParameterSpace({'a': [[2, 3, 5, 8, 13], [21, 34, 55, 89, 144]],
...                        'b': 7,
...                        'c': lambda i, j: 3*i-2*j}, shape=(2, 5))
>>> ps2d.evaluate(mask=(slice(None), [1, 3, 4]))
>>> print(ps2d['a'])
[[  3   8  13]
 [ 34  89 144]]
>>> print(ps2d['c'])
[[-2 -6 -8]
 [ 1 -3 -5]]

There are also several methods to allow iterating over the parameter space in different ways. A ParameterSpace can be viewed as both a dict containing arrays and as an array of dicts. Iterating over a parameter space gives the latter view:

>>> for D in ps:
...     print(D)
...
{'a': 3, 'c': 5, 'b': 7}
{'a': 8, 'c': 11, 'b': 7}
{'a': 13, 'c': 14, 'b': 7}

unlike for a dict, where iterating over it gives the keys. items() works as for a normal dict:

>>> for key, value in ps.items():
...     print(key, "=", value)
a = [ 3  8 13]
c = [ 5 11 14]
b = [7 7 7]
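The two views can be sketched with a minimal stand-in class. This is a toy for illustration only (MiniParameterSpace is a hypothetical name; the real class also handles lazy evaluation, schemas and masks):

```python
class MiniParameterSpace:
    # Toy sketch of ParameterSpace's two views (not the real PyNN class):
    # a dict containing arrays, and an array of dicts.
    def __init__(self, parameters, shape):
        self.shape = shape
        n = shape[0]
        self._arrays = {}
        for key, value in parameters.items():
            if callable(value):
                self._arrays[key] = [value(i) for i in range(n)]
            elif isinstance(value, list):
                self._arrays[key] = value
            else:
                self._arrays[key] = [value] * n   # broadcast a single value

    def items(self):                  # dict-of-arrays view
        return self._arrays.items()

    def __iter__(self):               # array-of-dicts view
        for i in range(self.shape[0]):
            yield {key: arr[i] for key, arr in self._arrays.items()}

ps = MiniParameterSpace({'a': [2, 3, 5, 8], 'b': 7, 'c': lambda i: 3*i + 2},
                        shape=(4,))
rows = list(ps)   # one dict per element, as in the iteration example above
```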

Reference

class ParameterSpace(parameters, schema=None, shape=None, component=None)
    Representation of one or more points in a parameter space.

    i.e. represents one or more parameter sets, where each parameter set has the same parameter names and types but the parameters may have different values.

Arguments:

parameters: a dict containing values of any type that may be used to construct a lazy array, i.e. int, float, NumPy array, RandomDistribution, function that accepts a single argument.

schema: a dict whose keys are the expected parameter names and whose values are the expected parameter types

component: optional - class for which the parameters are destined. Used in error messages.

shape: the shape of the lazy arrays that will be constructed.

__getitem__(name)
    x.__getitem__(y) <==> x[y]

__iter__()
    Return an array-element-wise iterator over the parameter space.

    Each item in the iterator is a dict, containing the same keys as the ParameterSpace. For the ith dict returned by the iterator, each value is the ith element of the corresponding lazy array in the parameter space.

Example:


>>> ps = ParameterSpace({'a': [2, 3, 5, 8], 'b': 7, 'c': lambda i: 3*i+2},
...                     shape=(4,))
>>> ps.evaluate()
>>> for D in ps:
...     print(D)
...
{'a': 2, 'c': 2, 'b': 7}
{'a': 3, 'c': 5, 'b': 7}
{'a': 5, 'c': 8, 'b': 7}
{'a': 8, 'c': 11, 'b': 7}

shape
    Size of the lazy arrays contained within the parameter space.

keys() → list of PS's keys.

items() → an iterator over the (key, value) items of PS.
    Note that the values will all be LazyArray objects.

update(**parameters)
    Update the contents of the parameter space according to the (key, value) pairs in **parameters. All values will be turned into lazy arrays.

    If the ParameterSpace has a schema, the keys and the data types of the values will be checked against the schema.

pop(name, d=None)
    Remove the given parameter from the parameter set and from its schema, and return its value.

is_homogeneous
    True if all of the lazy arrays within are homogeneous.

evaluate(mask=None, simplify=False)
    Evaluate all lazy arrays contained in the parameter space, using the given mask.

as_dict()
    Return a plain dict containing the same keys and values as the parameter space. The values must first have been evaluated.

columns()
    For a 2D space, return a column-wise iterator over the parameter space.

parallel_safe

has_native_rngs
    Return True if the parameter set contains any NativeRNGs.

expand(new_shape, mask)
    Increase the size of the ParameterSpace.

    Existing array values are mapped to the indices given in mask. New array values are set to NaN.

The Sequence class

class Sequence(value)
    Represents a sequence of numerical values.

Arguments:

value: anything which can be converted to a NumPy array, or another Sequence object.


__mul__(val)
    Return a new ArrayParameter in which all values in the original ArrayParameter have been multiplied by val.

    If val is itself an array, return an array of ArrayParameter objects, where ArrayParameter i is the original ArrayParameter multiplied by element i of val.

__div__(val)
    Return a new ArrayParameter in which all values in the original ArrayParameter have been divided by val.

    If val is itself an array, return an array of ArrayParameter objects, where ArrayParameter i is the original ArrayParameter divided by element i of val.
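The scalar-versus-array semantics of these operators can be sketched as follows. This is a toy stand-in (MiniSequence is a hypothetical name; the real Sequence wraps a NumPy array and also defines __div__ for Python 2):

```python
class MiniSequence:
    # Toy sketch of Sequence/ArrayParameter arithmetic (not the PyNN class).
    def __init__(self, value):
        self.value = list(value)

    def __mul__(self, val):
        if isinstance(val, (list, tuple)):
            # multiplying by an array yields an array of sequences,
            # one per element of val
            return [MiniSequence(x * v for x in self.value) for v in val]
        return MiniSequence(x * val for x in self.value)

    def __truediv__(self, val):
        if isinstance(val, (list, tuple)):
            return [MiniSequence(x / v for x in self.value) for v in val]
        return MiniSequence(x / val for x in self.value)

s = MiniSequence([1.0, 2.0, 3.0])
doubled = s * 2        # a single scaled sequence
scaled = s * [2, 10]   # a list of two sequences, one per scale factor
```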

19.1.10 Spatial structure

Structure classes

Structure classes all inherit from the following base class, and inherit its methods:

class BaseStructure
    Bases: object

get_parameters()
    Return a dict containing the parameters of the Structure.

describe(template='structure_default.txt', engine='default')
    Returns a human-readable description of the network structure.

    The output may be customized by specifying a different template together with an associated template engine (see pyNN.descriptions).

    If template is None, then a dictionary containing the template context will be returned.

generate_positions(n)
    Calculate and return the positions of n neurons positioned according to this structure.

class Line(dx=1.0, x0=0.0, y=0.0, z=0.0)
    Bases: pyNN.space.BaseStructure

Represents a structure with neurons distributed evenly on a straight line.

Arguments:

dx: distance between points in the line.

y, z: y- and z-coordinates of all points in the line.

x0: x-coordinate of the first point in the line.

parameter_names = ('dx', 'x0', 'y', 'z')

generate_positions(n)
    Calculate and return the positions of n neurons positioned according to this structure.

class Grid2D(aspect_ratio=1.0, dx=1.0, dy=1.0, x0=0.0, y0=0.0, z=0, fill_order='sequential', rng=None)
    Bases: pyNN.space.BaseStructure

Represents a structure with neurons distributed on a 2D grid.

Arguments:

dx, dy: distances between points in the x, y directions.


x0, y0: coordinates of the starting corner of the grid.

z: the z-coordinate of all points in the grid.

aspect_ratio: ratio of the number of grid points per side (not the ratio of the side lengths, unless dx == dy)

fill_order: may be ‘sequential’ or ‘random’

parameter_names = ('aspect_ratio', 'dx', 'dy', 'x0', 'y0', 'z', 'fill_order')

calculate_size(n)
    docstring goes here

generate_positions(n)
    Calculate and return the positions of n neurons positioned according to this structure.

class Grid3D(aspect_ratioXY=1.0, aspect_ratioXZ=1.0, dx=1.0, dy=1.0, dz=1.0, x0=0.0, y0=0.0, z0=0, fill_order='sequential', rng=None)
    Bases: pyNN.space.BaseStructure

Represents a structure with neurons distributed on a 3D grid.

Arguments:

dx, dy, dz: distances between points in the x, y, z directions.

x0, y0, z0: coordinates of the starting corner of the grid.

aspect_ratioXY, aspect_ratioXZ: ratios of the number of grid points per side (not the ratio of the side lengths, unless dx == dy == dz)

fill_order: may be ‘sequential’ or ‘random’.

If fill_order is ‘sequential’, the z-index will be filled first, then y, then x, i.e. the first cell will be at (0,0,0) (given default values for the other arguments), the second at (0,0,1), etc.
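The ‘sequential’ fill order described above can be sketched as nested loops with the z-index varying fastest. This is illustrative only (the real Grid3D derives the per-side counts from n and the aspect ratios; here they are given explicitly as nx, ny, nz):

```python
def sequential_grid_positions(nx, ny, nz, dx=1.0, dy=1.0, dz=1.0,
                              x0=0.0, y0=0.0, z0=0.0):
    # Sketch of Grid3D's 'sequential' fill order: the z-index is filled
    # first, then y, then x.
    positions = []
    for i in range(nx):
        for j in range(ny):
            for k in range(nz):
                positions.append((x0 + i * dx, y0 + j * dy, z0 + k * dz))
    return positions

pos = sequential_grid_positions(2, 2, 2)
# first cell at (0, 0, 0), second at (0, 0, 1), as in the description above
```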

parameter_names = ('aspect_ratios', 'dx', 'dy', 'dz', 'x0', 'y0', 'z0', 'fill_order')

calculate_size(n)
    docstring goes here

generate_positions(n)
    Calculate and return the positions of n neurons positioned according to this structure.

class RandomStructure(boundary, origin=(0.0, 0.0, 0.0), rng=None)
    Bases: pyNN.space.BaseStructure

Represents a structure with neurons distributed randomly within a given volume.

Arguments:

    boundary: a subclass of Shape.

    origin: the coordinates (x, y, z) of the centre of the volume.

parameter_names = ('boundary', 'origin', 'rng')

generate_positions(n)
    Calculate and return the positions of n neurons positioned according to this structure.

Shape classes

class Cuboid(width, height, depth)
    Bases: pyNN.space.Shape

Represents a cuboidal volume within which neurons may be distributed.

Arguments:


height: extent in y direction

width: extent in x direction

depth: extent in z direction

sample(n, rng)
    Return n points distributed randomly with uniform density within the cuboid.

class Sphere(radius)
    Bases: pyNN.space.Shape

Represents a spherical volume within which neurons may be distributed.

sample(n, rng)
    Return n points distributed randomly with uniform density within the sphere.
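Uniform-density sampling within a sphere can be sketched by rejection sampling from the bounding cube. This is one standard approach, shown for illustration; it is not necessarily how pyNN.space implements Sphere.sample:

```python
import random

def sample_sphere(n, radius, rng):
    # Draw points uniformly in the bounding cube and keep those inside
    # the sphere; accepted points are uniformly distributed in the volume.
    points = []
    while len(points) < n:
        x, y, z = (rng.uniform(-radius, radius) for _ in range(3))
        if x*x + y*y + z*z <= radius*radius:
            points.append((x, y, z))
    return points

pts = sample_sphere(100, 5.0, random.Random(1))
```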

The Space class

class Space(axes=None, scale_factor=1.0, offset=0.0, periodic_boundaries=None)
    Class representing a space within which distances can be calculated. The space is Cartesian, may be 1-, 2- or 3-dimensional, and may have periodic boundaries in any of the dimensions.

Arguments:

axes: if not supplied, then the 3D distance is calculated. If supplied, axes should be a string containing the axes to be used, e.g. ‘x’, or ‘yz’. axes=’xyz’ is the same as axes=None.

scale_factor: it may be that the pre and post populations use different units for position, e.g. degrees and µm. In this case, scale_factor can be specified, which is applied to the positions in the post-synaptic population.

offset: if the origins of the coordinate systems of the pre- and post-synaptic populations are different, offset can be used to adjust for this difference. The offset is applied before any scaling.

periodic_boundaries: either None, or a tuple giving the boundaries for each dimension, e.g. ((x_min, x_max), None, (z_min, z_max)).

AXES = {None: [0, 1, 2], 'x': [0], 'xy': [0, 1], 'xyz': [0, 1, 2], 'xz': [0, 2], 'y': [1], 'yz': [1, 2], 'z': [2]}

distances(A, B, expand=False)
    Calculate the distance matrix between two sets of coordinates, given the topology of the current space. From http://projects.scipy.org/pipermail/numpy-discussion/2007-April/027203.html

distance_generator(f, g)
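How the constructor arguments combine can be sketched for a single pair of points. This is a simplified illustration (the function name distance and the exact order of offset/scale application follow the argument descriptions above, not the pyNN.space source):

```python
import math

AXES = {None: [0, 1, 2], 'x': [0], 'xy': [0, 1], 'yz': [1, 2], 'xyz': [0, 1, 2]}

def distance(p, q, axes=None, scale_factor=1.0, offset=0.0,
             periodic_boundaries=None):
    # Sketch of a Space-style distance for one pair of points: offset and
    # scale_factor are applied to the second (post-synaptic) point, the
    # offset before scaling, and periodic dimensions wrap around.
    d2 = 0.0
    for dim in AXES[axes]:
        delta = abs(p[dim] - scale_factor * (q[dim] + offset))
        bounds = periodic_boundaries[dim] if periodic_boundaries else None
        if bounds is not None:
            lo, hi = bounds
            extent = hi - lo
            delta = min(delta, extent - delta)   # wrap around the boundary
        d2 += delta ** 2
    return math.sqrt(d2)

# with a periodic x dimension of extent 10, points at x=0 and x=9 are
# only 1 unit apart
d = distance((0.0, 0.0, 0.0), (9.0, 0.0, 0.0),
             periodic_boundaries=((0.0, 10.0), None, None))
```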

Implementing your own Shape

Todo: write this

Implementing your own Structure

Todo: write this


19.1.11 Utility classes and functions

init_logging(logfile, debug=False, num_processes=1, rank=0, level=None)
    Simple configuration of logging.

get_simulator(*arguments)
    Import and return a PyNN simulator backend module based on command-line arguments.

    The simulator name should be the first positional argument. If your script needs additional arguments, you can specify them as (name, help_text) tuples. If you need more complex argument handling, you should use argparse directly.

    Returns (simulator, command-line arguments)
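The argument handling can be sketched with argparse, leaving out the import step. This is illustrative only (parse_simulator_args is a hypothetical name; the real get_simulator also imports the chosen pyNN backend module and reads sys.argv itself):

```python
import argparse

def parse_simulator_args(argv, *extra_args):
    # Sketch of get_simulator's argument handling: the simulator name is
    # the first positional argument; extra_args are (name, help_text)
    # tuples, as described above.
    parser = argparse.ArgumentParser()
    parser.add_argument("simulator", help="neuron, nest, brian, ...")
    for name, help_text in extra_args:
        parser.add_argument(name, help=help_text)
    return parser.parse_args(argv)

args = parse_simulator_args(["nest", "simple.param"],
                            ("parameter_file", "Parameter file"))
```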

class Timer
    For timing script execution.

Timing starts on creation of the timer.

start()
    Start/restart timing.

elapsed_time(format=None)
    Return the elapsed time in seconds but keep the clock running.

    If called with format="long", return a text representation of the time. Examples:

>>> timer.elapsed_time()
987
>>> timer.elapsed_time(format='long')
16 minutes, 27 seconds

elapsedTime(**kwargs)
    Deprecated. Use elapsed_time() instead.

reset()
    Reset the time to zero, and start the clock.

diff(format=None)
    Return the time since the last time elapsed_time() or diff() was called.

    If called with format='long', return a text representation of the time.

static time_in_words(s)
    Formats a time in seconds as a string containing the time in days, hours, minutes, seconds. Examples:

>>> Timer.time_in_words(1)
1 second
>>> Timer.time_in_words(123)
2 minutes, 3 seconds
>>> Timer.time_in_words(24*3600)
1 day
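A formatting function matching these documented examples can be sketched as follows. This is a stand-alone reimplementation for illustration (the zero-seconds fallback string is an assumption, not documented behaviour):

```python
def time_in_words(s):
    # Express a number of seconds in days, hours, minutes, seconds,
    # omitting zero-valued units and pluralising where needed.
    units = [("day", 86400), ("hour", 3600), ("minute", 60), ("second", 1)]
    parts = []
    for name, size in units:
        value, s = divmod(s, size)
        if value:
            parts.append("%d %s%s" % (value, name, "s" if value > 1 else ""))
    return ", ".join(parts) or "0 seconds"   # fallback string is assumed

print(time_in_words(123))  # 2 minutes, 3 seconds
```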

mark(label)
    Store the time since the last time elapsed_time(), diff() or mark() was called, together with the provided label, in the attribute ‘marks’.

class ProgressBar(width=77, char='#', mode='fixed')
    Create a progress bar in the shell.

set_level(level)
    Rebuild the bar string based on level, which should be a number between 0 and 1.


notify(msg='Simulation finished.', subject='Simulation finished.', smtphost=None, address=None)
    Send an e-mail stating that the simulation has finished.

save_population(population, filename, variables=None)
    Saves the spike_times of a population and the size, structure, labels such that one can load it back into a SpikeSourceArray population using the load_population() function.

load_population(filename, sim)
    Loads a population that was saved with the save_population() function into SpikeSourceArray.

Basic plotting

class Figure(*panels, **options)
    Provide simple, declarative specification of multi-panel figures.

Example:

Figure(
    Panel(segment.filter(name="v")[0], ylabel="Membrane potential (mV)"),
    Panel(segment.spiketrains, xlabel="Time (ms)"),
    title="Network activity",
).save("figure3.png")

Valid options are:

settings: for figure settings, e.g. {‘font.size’: 9}

annotations: a (multi-line) string to be printed at the bottom of the figure.

title: a string to be printed at the top of the figure.

save(filename)
    Save the figure to file. The format is taken from the file extension.

class Panel(*data, **options)
    Represents a single panel in a multi-panel figure.

A panel is a Matplotlib Axes or Subplot instance. A data item may be an AnalogSignal or a list of SpikeTrains. The Panel will automatically choose an appropriate representation. Multiple data items may be plotted in the same panel.

Valid options are any valid Matplotlib formatting options that should be applied to the Axes/Subplot, plus in addition:

data_labels: a list of strings of the same length as the number of data items.

line_properties: a list of dicts containing Matplotlib formatting options, of the same length as the number of data items.

plot(axes)
    Plot the Panel's data in the provided Axes/Subplot instance.

comparison_plot(segments, labels, title='', annotations=None, fig_settings=None, with_spikes=True)
    Given a list of segments, plot all the data they contain so as to be able to compare them.

Return a Figure instance.


CHAPTER 20

Old documents

20.1 Standard models

Standard models are neuron models that are available in at least two of the simulation engines supported by PyNN. PyNN performs automatic translation of parameter names, types and units. Only a handful of models are currently available, but the list will be expanded in future releases. To obtain a list of all the standard models available in a given simulator, use the list_standard_models() function, e.g.:

>>> from pyNN import neuron
>>> neuron.list_standard_models()
['IF_cond_alpha', 'IF_curr_exp', 'IF_cond_exp', 'EIF_cond_exp_isfa_ista',
 'SpikeSourceArray', 'HH_cond_exp', 'IF_cond_exp_gsfa_grr',
 'IF_facets_hardware1', 'SpikeSourcePoisson', 'EIF_cond_alpha_isfa_ista',
 'IF_curr_alpha']

20.1.1 Neurons

IF_curr_alpha

Leaky integrate and fire model with fixed threshold and alpha-function-shaped post-synaptic current.

Availability: NEST, NEURON, Brian


Name        Default value   Units   Description
v_rest      -65.0           mV      Resting membrane potential
cm          1.0             nF      Capacity of the membrane
tau_m       20.0            ms      Membrane time constant
tau_refrac  0.0             ms      Duration of refractory period
tau_syn_E   5.0             ms      Rise time of the excitatory synaptic alpha function
tau_syn_I   5.0             ms      Rise time of the inhibitory synaptic alpha function
i_offset    0.0             nA      Offset current
v_reset     -65.0           mV      Reset potential after a spike
v_thresh    -50.0           mV      Spike threshold

IF_curr_exp

Leaky integrate and fire model with fixed threshold and decaying-exponential post-synaptic current. (Separate synaptic currents for excitatory and inhibitory synapses.)

Availability: NEST, NEURON, Brian

Name        Default value   Units   Description
v_rest      -65.0           mV      Resting membrane potential
cm          1.0             nF      Capacity of the membrane
tau_m       20.0            ms      Membrane time constant
tau_refrac  0.0             ms      Duration of refractory period
tau_syn_E   5.0             ms      Decay time of excitatory synaptic current
tau_syn_I   5.0             ms      Decay time of inhibitory synaptic current
i_offset    0.0             nA      Offset current
v_reset     -65.0           mV      Reset potential after a spike
v_thresh    -50.0           mV      Spike threshold
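For reference, the subthreshold dynamics conventionally associated with this model can be written as follows. This is the standard textbook formulation, not quoted from the PyNN source; implementation details may differ between backends:

```latex
\tau_m \frac{dv}{dt} = (v_\text{rest} - v)
    + \frac{\tau_m}{c_m}\,\bigl(i_\text{syn}(t) + i_\text{offset}\bigr),
\qquad
\tau_\text{syn}\,\frac{di_\text{syn}}{dt} = -i_\text{syn}
```

When v reaches v_thresh, a spike is emitted and v is reset to v_reset, where it is held for tau_refrac. With cm in nF, currents in nA and times in ms, the units are consistent with the mV values in the table above.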

IF_cond_alpha

Leaky integrate and fire model with fixed threshold and alpha-function-shaped post-synaptic conductance.

Availability: NEST, NEURON, Brian

Name        Default value   Units   Description
v_rest      -65.0           mV      Resting membrane potential
cm          1.0             nF      Capacity of the membrane
tau_m       20.0            ms      Membrane time constant
tau_refrac  0.0             ms      Duration of refractory period
tau_syn_E   5.0             ms      Rise time of the excitatory synaptic alpha function
tau_syn_I   5.0             ms      Rise time of the inhibitory synaptic alpha function
e_rev_E     0.0             mV      Reversal potential for excitatory input
e_rev_I     -70.0           mV      Reversal potential for inhibitory input
v_thresh    -50.0           mV      Spike threshold
v_reset     -65.0           mV      Reset potential after a spike
i_offset    0.0             nA      Offset current

188 Chapter 20. Old documents

Page 193: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

IF_cond_exp

Leaky integrate and fire model with fixed threshold and decaying-exponential post-synaptic conductance.

Availability: NEST, NEURON, Brian

Name        Default value   Units   Description
v_rest      -65.0           mV      Resting membrane potential
cm          1.0             nF      Capacity of the membrane
tau_m       20.0            ms      Membrane time constant
tau_refrac  0.0             ms      Duration of refractory period
tau_syn_E   5.0             ms      Decay time of the excitatory synaptic conductance
tau_syn_I   5.0             ms      Decay time of the inhibitory synaptic conductance
e_rev_E     0.0             mV      Reversal potential for excitatory input
e_rev_I     -70.0           mV      Reversal potential for inhibitory input
v_thresh    -50.0           mV      Spike threshold
v_reset     -65.0           mV      Reset potential after a spike
i_offset    0.0             nA      Offset current

HH_cond_exp

Single-compartment Hodgkin-Huxley-type neuron with transient sodium and delayed-rectifier potassium currents using the ion channel models from Traub.

Availability: NEST, NEURON, Brian

Name        Default value   Units
gbar_Na     20.0            uS
gbar_K      6.0             uS
g_leak      0.01            uS
cm          0.2             nF
v_offset    -63.0           mV
e_rev_Na    50.0            mV
e_rev_K     -90.0           mV
e_rev_leak  -65.0           mV
e_rev_E     0.0             mV
e_rev_I     -80.0           mV
tau_syn_E   0.2             ms
tau_syn_I   2.0             ms
i_offset    0.0             nA

EIF_cond_alpha_isfa_ista

Adaptive exponential integrate and fire neuron according to Brette R and Gerstner W (2005) Adaptive Exponential Integrate-and-Fire Model as an Effective Description of Neuronal Activity. J Neurophysiol 94:3637-3642

Availability: NEST, NEURON, Brian


Name        Default value   Units   Description
cm          0.281           nF      Capacity of the membrane
tau_refrac  0.0             ms      Duration of refractory period
v_spike     0.0             mV      Spike detection threshold
v_reset     -70.6           mV      Reset value for membrane potential after a spike
v_rest      -70.6           mV      Resting membrane potential (Leak reversal potential)
tau_m       9.3667          ms      Membrane time constant
i_offset    0.0             nA      Offset current
a           4.0             nS      Subthreshold adaptation conductance
b           0.0805          nA      Spike-triggered adaptation
delta_T     2.0             mV      Slope factor
tau_w       144.0           ms      Adaptation time constant
v_thresh    -50.4           mV      Spike initiation threshold
e_rev_E     0.0             mV      Excitatory reversal potential
tau_syn_E   5.0             ms      Rise time of excitatory synaptic conductance (alpha function)
e_rev_I     -80.0           mV      Inhibitory reversal potential
tau_syn_I   5.0             ms      Rise time of the inhibitory synaptic conductance (alpha function)

20.1.2 Spike sources

SpikeSourcePoisson

Spike source, generating spikes according to a Poisson process.

Availability: NEST, NEURON, Brian

Name      Default value   Units   Description
rate      0.0             s^-1    Mean spike frequency
start     0.0             ms      Start time
duration  10^9            ms      Duration of spike sequence
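The behaviour described by these parameters can be sketched by drawing exponentially distributed inter-spike intervals, one standard way to generate a Poisson process. This is illustrative only (poisson_spike_times is a hypothetical name; the real source is implemented by each backend simulator):

```python
import random

def poisson_spike_times(rate, start, duration, rng):
    # Poisson process: exponentially distributed inter-spike intervals
    # with mean 1/rate, restricted to [start, start + duration).
    # Times are in ms, rate in spikes per second, as in the table above.
    times = []
    if rate <= 0:
        return times
    mean_interval_ms = 1000.0 / rate
    t = start + rng.expovariate(1.0 / mean_interval_ms)
    while t < start + duration:
        times.append(t)
        t += rng.expovariate(1.0 / mean_interval_ms)
    return times

spikes = poisson_spike_times(rate=50.0, start=100.0, duration=1000.0,
                             rng=random.Random(7))
# roughly 50 spikes expected in a 1-second window at 50 spikes/s
```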

SpikeSourceArray

Spike source generating spikes at the times given in the spike_times array.

Availability: NEST, NEURON, Brian

Name         Default value   Units   Description
spike_times  []              ms      list or numpy array containing spike times


CHAPTER 21

Indices and tables

• genindex

• modindex

• search


Python Module Index

pyNN.connectors
pyNN.parameters
pyNN.random
pyNN.space

193

Page 198: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

PyNN Documentation, Release 0.9.3

194 Python Module Index

Page 199: PyNN Documentation · CHAPTER 1 Introduction PyNN(pronounced ‘pine’) is a simulator-independent language for building neuronal network models. In other words, you can write the

Index

Symbols__div__() (Sequence method), 181__getitem__() (LazyArray method), 177__getitem__() (ParameterSpace method), 179__iter__() (ParameterSpace method), 179__mul__() (Sequence method), 180

AAdditivePotentiationMultiplicativeDepression (class in

pyNN.standardmodels.synapses), 171AdditiveWeightDependence (class in

pyNN.standardmodels.synapses), 171
AllToAllConnector (class in pyNN.connectors), 160
apply() (LazyArray method), 177
ArrayConnector (class in pyNN.connectors), 161
as_dict() (ParameterSpace method), 180
AXES (Space attribute), 183

B
BaseStructure (class in pyNN.space), 181
by_column() (LazyArray method), 177

C
calculate_size() (Grid2D method), 182
calculate_size() (Grid3D method), 182
check_bounds() (LazyArray method), 178
CloneConnector (class in pyNN.connectors), 163
columns() (ParameterSpace method), 180
comparison_plot() (in module pyNN.utility.plotting), 185
computed_parameters() (StandardCellType method), 164
computed_parameters() (StandardSynapseType method), 168
conductance_based (EIF_cond_alpha_isfa_ista attribute), 166
conductance_based (EIF_cond_exp_isfa_ista attribute), 166
conductance_based (IF_cond_alpha attribute), 165
conductance_based (IF_cond_exp attribute), 164
conductance_based (IF_cond_exp_gsfa_grr attribute), 167
conductance_based (IF_curr_alpha attribute), 165
conductance_based (IF_curr_exp attribute), 165
conductance_based (Izhikevich attribute), 166
conductance_based (SpikeSourceArray attribute), 167
conductance_based (SpikeSourceInhGamma attribute), 167
conductance_based (SpikeSourcePoisson attribute), 167
connect() (Connector method), 159
connection_type (StandardSynapseType attribute), 168
Connector (class in pyNN.connectors), 159
CSAConnector (class in pyNN.connectors), 162
Cuboid (class in pyNN.space), 182

D
default_initial_values (EIF_cond_alpha_isfa_ista attribute), 166
default_initial_values (EIF_cond_exp_isfa_ista attribute), 166
default_initial_values (IF_cond_alpha attribute), 165
default_initial_values (IF_cond_exp attribute), 164
default_initial_values (IF_cond_exp_gsfa_grr attribute), 167
default_initial_values (IF_curr_alpha attribute), 165
default_initial_values (IF_curr_exp attribute), 165
default_initial_values (Izhikevich attribute), 166
default_initial_values (StandardSynapseType attribute), 168
default_initial_values (TsodyksMarkramSynapse attribute), 170
default_parameters (AdditivePotentiationMultiplicativeDepression attribute), 171
default_parameters (AdditiveWeightDependence attribute), 171
default_parameters (EIF_cond_alpha_isfa_ista attribute), 166
default_parameters (EIF_cond_exp_isfa_ista attribute), 166


default_parameters (GutigWeightDependence attribute), 171
default_parameters (IF_cond_alpha attribute), 165
default_parameters (IF_cond_exp attribute), 164
default_parameters (IF_cond_exp_gsfa_grr attribute), 167
default_parameters (IF_curr_alpha attribute), 165
default_parameters (IF_curr_exp attribute), 165
default_parameters (Izhikevich attribute), 166
default_parameters (MultiplicativeWeightDependence attribute), 171
default_parameters (SpikePairRule attribute), 172
default_parameters (SpikeSourceArray attribute), 167
default_parameters (SpikeSourceInhGamma attribute), 168
default_parameters (SpikeSourcePoisson attribute), 167
default_parameters (StandardSynapseType attribute), 168
default_parameters (StaticSynapse attribute), 169
default_parameters (TsodyksMarkramSynapse attribute), 169
describe() (BaseStructure method), 181
describe() (Connector method), 159
describe() (GSLRNG method), 174
describe() (NumpyRNG method), 173
describe() (SpikePairRule method), 172
describe() (StandardCellType method), 164
describe() (StandardSynapseType method), 168
describe() (STDPMechanism method), 170
describe() (STDPWeightDependence method), 170
diff() (Timer method), 184
DisplacementDependentProbabilityConnector (class in pyNN.connectors), 162
distance_generator() (Space method), 183
DistanceDependentProbabilityConnector (class in pyNN.connectors), 162
distances() (Space method), 183

E
EIF_cond_alpha_isfa_ista (class in pyNN.standardmodels.cells), 166
EIF_cond_exp_isfa_ista (class in pyNN.standardmodels.cells), 166
elapsed_time() (Timer method), 184
elapsedTime() (Timer method), 184
environment variable
    PYTHONPATH, 152
evaluate() (LazyArray method), 178
evaluate() (ParameterSpace method), 180
expand() (ParameterSpace method), 180
extra_parameters (StandardSynapseType attribute), 169

F
Figure (class in pyNN.utility.plotting), 185
FixedNumberPostConnector (class in pyNN.connectors), 161
FixedNumberPreConnector (class in pyNN.connectors), 161
FixedProbabilityConnector (class in pyNN.connectors), 160
FixedTotalNumberConnector (class in pyNN.connectors), 162
FromFileConnector (class in pyNN.connectors), 160
FromListConnector (class in pyNN.connectors), 160

G
gamma() (GSLRNG method), 174
generate_positions() (BaseStructure method), 181
generate_positions() (Grid2D method), 182
generate_positions() (Grid3D method), 182
generate_positions() (Line method), 181
generate_positions() (RandomStructure method), 182
get_native_names() (StandardCellType method), 164
get_native_names() (StandardSynapseType method), 169
get_parameter_names() (pyNN.standardmodels.StandardCellType class method), 164
get_parameter_names() (pyNN.standardmodels.StandardSynapseType class method), 169
get_parameter_names() (STDPMechanism method), 170
get_parameters() (BaseStructure method), 181
get_parameters() (Connector method), 159
get_schema() (StandardCellType method), 164
get_schema() (StandardSynapseType method), 168
get_schema() (STDPMechanism method), 170
get_simulator() (in module pyNN.utility), 184
Grid2D (class in pyNN.space), 181
Grid3D (class in pyNN.space), 182
GSLRNG (class in pyNN.random), 174
GutigWeightDependence (class in pyNN.standardmodels.synapses), 171

H
has_native_rngs (ParameterSpace attribute), 180
has_parameter() (pyNN.standardmodels.StandardCellType class method), 164
has_parameter() (pyNN.standardmodels.StandardSynapseType class method), 169
has_parameter() (STDPMechanism method), 170
has_presynaptic_components (StandardSynapseType attribute), 169

I
IF_cond_alpha (class in pyNN.standardmodels.cells), 164
IF_cond_exp (class in pyNN.standardmodels.cells), 164
IF_cond_exp_gsfa_grr (class in pyNN.standardmodels.cells), 166
IF_curr_alpha (class in pyNN.standardmodels.cells), 165
IF_curr_exp (class in pyNN.standardmodels.cells), 165


IndexBasedProbabilityConnector (class in pyNN.connectors), 162
init_logging() (in module pyNN.utility), 184
injectable (EIF_cond_alpha_isfa_ista attribute), 166
injectable (EIF_cond_exp_isfa_ista attribute), 166
injectable (IF_cond_alpha attribute), 164
injectable (IF_cond_exp attribute), 164
injectable (IF_cond_exp_gsfa_grr attribute), 167
injectable (IF_curr_alpha attribute), 165
injectable (IF_curr_exp attribute), 165
injectable (Izhikevich attribute), 165
injectable (SpikeSourceArray attribute), 167
injectable (SpikeSourceInhGamma attribute), 167, 168
injectable (SpikeSourcePoisson attribute), 167
is_homogeneous (LazyArray attribute), 178
is_homogeneous (ParameterSpace attribute), 180
items() (ParameterSpace method), 180
Izhikevich (class in pyNN.standardmodels.cells), 165

K
keys() (ParameterSpace method), 180

L
lazily_evaluate() (RandomDistribution method), 173
LazyArray (class in pyNN.parameters), 177
Line (class in pyNN.space), 181
load_population() (in module pyNN.utility), 185

M
mark() (Timer method), 184
model (STDPMechanism attribute), 170
MultiplicativeWeightDependence (class in pyNN.standardmodels.synapses), 171

N
native_parameters (StandardSynapseType attribute), 169
native_parameters (STDPMechanism attribute), 170
NativeRNG (class in pyNN.random), 175
ncols (LazyArray attribute), 178
next() (GSLRNG method), 174
next() (NativeRNG method), 175
next() (NumpyRNG method), 173
next() (RandomDistribution method), 173
normal() (GSLRNG method), 174
normal_clipped() (GSLRNG method), 174
normal_clipped() (NumpyRNG method), 173
normal_clipped_to_boundary() (NumpyRNG method), 173
notify() (in module pyNN.utility), 184
nrows (LazyArray attribute), 178
NumpyRNG (class in pyNN.random), 173

O
OneToOneConnector (class in pyNN.connectors), 160

P
Panel (class in pyNN.utility.plotting), 185
parallel_safe (ParameterSpace attribute), 180
parameter_checks (StandardSynapseType attribute), 168
parameter_names (Grid2D attribute), 182
parameter_names (Grid3D attribute), 182
parameter_names (Line attribute), 181
parameter_names (RandomStructure attribute), 182
parameter_space (STDPMechanism attribute), 170
ParameterSpace (class in pyNN.parameters), 179
plot() (Panel method), 185
pop() (ParameterSpace method), 180
possible_models (STDPMechanism attribute), 170
ProgressBar (class in pyNN.utility), 184
pyNN.connectors (module), 159
pyNN.parameters (module), 175
pyNN.random (module), 172
pyNN.space (module), 181
PYTHONPATH, 152

R
RandomDistribution (class in pyNN.random), 172
RandomStructure (class in pyNN.space), 182
receptor_types (SpikeSourceArray attribute), 167
receptor_types (SpikeSourceInhGamma attribute), 168
receptor_types (SpikeSourcePoisson attribute), 167
recordable (EIF_cond_alpha_isfa_ista attribute), 166
recordable (EIF_cond_exp_isfa_ista attribute), 166
recordable (IF_cond_alpha attribute), 165
recordable (IF_cond_exp attribute), 164
recordable (IF_cond_exp_gsfa_grr attribute), 167
recordable (IF_curr_alpha attribute), 165
recordable (IF_curr_exp attribute), 165
recordable (Izhikevich attribute), 166
recordable (SpikeSourceArray attribute), 167
recordable (SpikeSourceInhGamma attribute), 168
recordable (SpikeSourcePoisson attribute), 167
reset() (Timer method), 184
reverse_translate() (StandardCellType method), 164
reverse_translate() (StandardSynapseType method), 169

S
sample() (Cuboid method), 183
sample() (Sphere method), 183
save() (Figure method), 185
save_population() (in module pyNN.utility), 185
scaled_parameters() (StandardCellType method), 164
scaled_parameters() (StandardSynapseType method), 169
Sequence (class in pyNN.parameters), 180
set_level() (ProgressBar method), 184
shape (LazyArray attribute), 178
shape (ParameterSpace attribute), 180
simple_parameters() (StandardCellType method), 164


simple_parameters() (StandardSynapseType method), 169
size (LazyArray attribute), 178
SmallWorldConnector (class in pyNN.connectors), 162
Space (class in pyNN.space), 183
Sphere (class in pyNN.space), 183
SpikePairRule (class in pyNN.standardmodels.synapses), 172
SpikeSourceArray (class in pyNN.standardmodels.cells), 167
SpikeSourceInhGamma (class in pyNN.standardmodels.cells), 167
SpikeSourcePoisson (class in pyNN.standardmodels.cells), 167
StandardCellType (class in pyNN.standardmodels), 163
StandardSynapseType (class in pyNN.standardmodels), 168
start() (Timer method), 184
StaticSynapse (class in pyNN.standardmodels.synapses), 169
STDPMechanism (class in pyNN.standardmodels.synapses), 170
STDPTimingDependence (class in pyNN.standardmodels), 172
STDPWeightDependence (class in pyNN.standardmodels), 170

T
time_in_words() (Timer static method), 184
Timer (class in pyNN.utility), 184
translate() (StandardCellType method), 164
translate() (StandardSynapseType method), 169
translations (GSLRNG attribute), 174
translations (NumpyRNG attribute), 173
translations (StandardSynapseType attribute), 169
TsodyksMarkramSynapse (class in pyNN.standardmodels.synapses), 169

U
uniform_int() (GSLRNG method), 174
units (EIF_cond_alpha_isfa_ista attribute), 166
units (EIF_cond_exp_isfa_ista attribute), 166
units (IF_cond_alpha attribute), 165
units (IF_cond_exp attribute), 164
units (IF_cond_exp_gsfa_grr attribute), 167
units (IF_curr_alpha attribute), 165
units (IF_curr_exp attribute), 165
units (Izhikevich attribute), 166
units (SpikeSourceArray attribute), 167
units (SpikeSourceInhGamma attribute), 168
units (SpikeSourcePoisson attribute), 167
update() (ParameterSpace method), 180

V
voltage_based_synapses (Izhikevich attribute), 166
