4. Examples

This section describes the sample models available in the examples/ directory of the source code.

4.1. Rate-coded networks

  • Neural Field: a simple model using neural field recurrent networks. This is a very simple rate-coded model without learning.
  • Bar Learning problem: an implementation of the bar learning problem, illustrating synaptic plasticity in rate-coded networks.
  • Image Processing and Webcam: shows how to use the ImagePopulation and VideoPopulation classes of the image extension to clamp images and video streams directly into a rate-coded network. It also demonstrates the weightsharing extension.
  • Structural plasticity: a minimal example demonstrating structural plasticity.
  • Multiple networks: shows how to define multiple networks and use parallel_run to simulate them in parallel.

4.2. Spiking networks

Simple networks

  • Izhikevich’s pulse-coupled network: an implementation of the simple pulse-coupled network described in (Izhikevich, 2003). It shows how to build a simple spiking network without synaptic plasticity.
  • Gap Junctions: an example using gap junctions.
  • Hodgkin Huxley neuron: a single Hodgkin-Huxley neuron.
  • A collection of Brian/PyNN/NEST model reproductions in the folder examples/pyNN.
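To give an idea of the dynamics underlying the pulse-coupled example, here is a minimal plain-Python sketch of the Izhikevich (2003) neuron model it is built on. This is an illustrative Euler integration of the published equations, not the ANNarchy implementation; the function name and default parameters (regular-spiking: a=0.02, b=0.2, c=-65, d=8) are chosen for the sketch.

```python
def simulate_izhikevich(I=10.0, T=200.0, dt=0.25,
                        a=0.02, b=0.2, c=-65.0, d=8.0):
    """Integrate dv/dt = 0.04 v^2 + 5 v + 140 - u + I and
    du/dt = a (b v - u); when v reaches 30 mV, reset v to c
    and add d to u. Returns the spike times (in ms) for a
    constant input current I."""
    v, u = c, b * c
    spikes = []
    for step in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:            # spike: reset membrane and recovery variable
            spikes.append(step * dt)
            v = c
            u += d
    return spikes

# With regular-spiking parameters and I = 10, the neuron fires tonically.
print(simulate_izhikevich()[:3])
```

In the full pulse-coupled network, 1000 such neurons with randomized (a, b, c, d) parameters are connected all-to-all, each spike injecting current into the postsynaptic neurons at the next time step.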

Complex networks

  • COBA and CUBA networks: an implementation of the balanced network described in (Vogels and Abbott, 2005). It shows how to build a simple spiking network using integrate-and-fire neurons and sparse connectivity.
  • Short-Term Plasticity and Synchrony: an example of short-term plasticity based on the model of Tsodyks, Uziel and Markram (2000), "Synchrony Generation in Recurrent Networks with Frequency-Dependent Synapses", The Journal of Neuroscience.

With synaptic plasticity

  • Simple STDP: a simple example using spike-timing dependent plasticity (STDP).
  • Homeostatic STDP: an example of homeostatic STDP based on the model of Carlson, Richert, Dutt and Krichmar (2013), "Biologically plausible models of homeostasis and STDP: Stability and learning in spiking neural networks", IJCNN.
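For readers unfamiliar with the learning rule used in these examples, the following is a hedged plain-Python sketch of the standard pair-based STDP weight update (it does not reproduce the ANNarchy synapse definition; the function name and the amplitude/time-constant values are illustrative assumptions).

```python
import math

def stdp_dw(delta_t, A_plus=0.01, A_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Weight change for a single pre/post spike pair.
    delta_t = t_post - t_pre (ms):
      delta_t > 0 (pre fires before post) -> potentiation,
      delta_t < 0 (post fires before pre) -> depression,
    both decaying exponentially with the timing difference."""
    if delta_t > 0:
        return A_plus * math.exp(-delta_t / tau_plus)
    elif delta_t < 0:
        return -A_minus * math.exp(delta_t / tau_minus)
    return 0.0

# A pre-spike 5 ms before the post-spike strengthens the synapse,
# the reverse ordering weakens it.
print(stdp_dw(5.0) > 0, stdp_dw(-5.0) < 0)   # → True True
```

The homeostatic variant cited above additionally scales the weights to stabilize the postsynaptic firing rate around a target value.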