- Poster presentation
Influence of different types of downscaling on a cortical microcircuit model
BMC Neuroscience, volume 14, Article number P112 (2013)
Neural network models are routinely downscaled in terms of the numbers of neurons or synapses because of a lack of computational resources or the limited capacity of a given neuromorphic hardware system. Unfortunately, the downscaling is often performed without explicit mention of the limitations it entails [1]. This matters because downscaling can substantially affect the dynamics. For instance, reducing the number of neurons N while preserving the in-degrees K increases the fraction of shared inputs and hence the correlations.
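The growth of shared input when N is reduced at fixed K can be illustrated with a small Monte Carlo sketch (parameter values and function names here are illustrative, not taken from the model): two neurons that each draw K presynaptic partners uniformly from N candidates share K²/N of them on average, i.e. a fraction K/N of their inputs, which grows as N shrinks.

```python
import random

def mean_shared_inputs(n_neurons, in_degree, n_pairs=500, seed=0):
    """Monte Carlo estimate of the expected number of common presynaptic
    partners for a random pair of neurons, each drawing `in_degree`
    sources uniformly from `n_neurons` candidates.
    The analytic expectation is in_degree**2 / n_neurons."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_pairs):
        a = set(rng.sample(range(n_neurons), in_degree))
        b = set(rng.sample(range(n_neurons), in_degree))
        total += len(a & b)
    return total / n_pairs

K = 1000
for N in (80_000, 20_000, 5_000):
    shared = mean_shared_inputs(N, K)
    # fraction of a neuron's inputs shared with a given partner ~ K / N
    print(N, round(shared), round(shared / K, 3))
```

Halving N at fixed K thus doubles the expected shared-input fraction, which is the mechanism behind the increased correlations mentioned above.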
Theoretical results on scaling are derived under simplifying assumptions. We therefore use simulations to systematically investigate the effects of downscaling on the dynamics of a layered microcircuit model of early sensory cortex [2]. The model consists of eight excitatory and inhibitory populations of leaky integrate-and-fire point neurons with current-based synapses and homogeneous Poisson input. The full-scale network, comprising approximately 80,000 neurons and 0.3 billion synapses, is in the balanced state [3, 4], displaying asynchronous irregular activity. For networks of binary or integrate-and-fire neurons in this state, the total current-based synaptic input to a given neuron is well approximated by Gaussian noise, whose mean and variance determine the firing rate. One way of preserving this mean and variance has already been touched upon: reducing N while keeping K constant. For reduced K, firing rates can be preserved by an appropriate choice of the synaptic weights, the ratio between excitatory and inhibitory weights, and the mean external input [3, 4].
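The bookkeeping behind this rate-preserving choice can be sketched as follows, using the standard diffusion-approximation expressions for the recurrent input (mean proportional to J·K·ν, variance proportional to J²·K·ν); the function and parameter names are our own illustration, not the model's notation.

```python
import math

def downscale_indegree(J, K, nu, mu_ext, kappa):
    """Sketch of in-degree downscaling K -> kappa*K that preserves the
    mean and variance of the Gaussian input.

    Recurrent mean ~ J*K*nu, recurrent variance ~ J**2*K*nu.
    Choosing J' = J/sqrt(kappa) keeps J'**2 * K' = J**2 * K, i.e. the
    variance; the external mean is then shifted to restore the total mean.
    """
    K_new = int(round(kappa * K))
    J_new = J / math.sqrt(kappa)                      # preserves J^2 K
    # total mean before: J*K*nu + mu_ext; after: J'*K'*nu + mu_ext'
    mu_ext_new = mu_ext + (J * K - J_new * K_new) * nu
    return J_new, K_new, mu_ext_new

# e.g. reduce in-degrees to a quarter (illustrative numbers):
J_new, K_new, mu_new = downscale_indegree(J=0.1, K=1000, nu=5.0,
                                          mu_ext=2.0, kappa=0.25)
```

With kappa = 0.25 the weights double and the external mean absorbs the lost recurrent drive, leaving both input moments, and hence the single-neuron firing rate, unchanged.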
However, these scaling methods are based on single-neuron dynamics and ignore network effects. For instance, scaling the synaptic weights J and the mean external input to preserve rates implies constant J²K, which changes the feedback on the population level and hence qualitatively changes the network state [5]. An alternative that takes both single-neuron and network effects into account is to choose J such that JK is constant and to adjust the external drive to maintain the input variance. This preserves the shape of the cross-correlation functions in balanced excitatory-inhibitory networks [5]. However, the magnitude of the correlations increases as K decreases, and the restriction to a non-negative external input variance limits the compensation for the increased intrinsic variance.
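The positivity constraint on the external variance can be made concrete with a small sketch (again with illustrative names and numbers): holding JK fixed makes the intrinsic variance J'²K'ν grow by a factor 1/kappa, so the external noise variance must shrink by the same amount, which is only possible while the result stays non-negative.

```python
def downscale_jk_constant(J, K, nu, sigma2_ext, kappa):
    """Sketch of network-aware downscaling K -> kappa*K with J*K fixed.

    J' = J*K/K' preserves the population-level feedback and hence the
    shape of the cross-correlations; the intrinsic variance J'**2*K'*nu
    then increases, and the external noise variance is reduced to keep
    the total input variance constant, if that is still feasible.
    """
    K_new = int(round(kappa * K))
    J_new = J * K / K_new                              # preserves J K
    dv = (J_new**2 * K_new - J**2 * K) * nu            # extra intrinsic variance
    sigma2_ext_new = sigma2_ext - dv
    if sigma2_ext_new < 0:
        raise ValueError("required external variance is negative: "
                         "kappa is too small to compensate")
    return J_new, K_new, sigma2_ext_new
```

For example, with J = 0.1, K = 1000, ν = 5 and kappa = 0.25, the intrinsic variance rises from 50 to 200 (in these arbitrary units); an external variance of 200 can absorb the increase of 150, but an external variance of 60 cannot, and the downscaling fails.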
We consider the effects of the above scaling options on the correlations, firing rates, synchrony, and irregularity of the spiking activity. Although each method achieves near-constancy of one or more dynamical features, none preserves all of them. These results can be used to downscale networks of neurons with current-based synapses in the asynchronous irregular regime in a controlled manner. However, since it is not yet possible to analytically predict all effects of scaling, it remains essential to compare downscaled simulations with their full-scale counterparts.
1. Crook SM, Bednar JA, Berger S, Cannon R, Davison AP, Djurfeldt M, Eppler JM, Kriener B, Furber S, Graham B, et al: Creating, documenting and sharing network models. Network: Comput Neural Syst. 2012, 23: 131-149.
2. Potjans TC, Diesmann M: The cell-type specific cortical microcircuit: relating structure and activity in a full-scale spiking network model. Cereb Cortex. 2012, doi: 10.1093/cercor/bhs358.
3. van Vreeswijk C, Sompolinsky H: Chaotic balanced state in a model of cortical circuits. Neural Comput. 1998, 10: 1321-1371. doi: 10.1162/089976698300017214.
4. Brunel N: Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. J Comp Neurosci. 2000, 8: 183-208. doi: 10.1023/A:1008925309027.
5. Helias M, Tetzlaff T, Diesmann M: Echoes in correlated neural systems. New J Phys. 2013, 15: 023002. doi: 10.1088/1367-2630/15/2/023002.
We acknowledge funding by the Helmholtz Association (HASB and the portfolio theme SMHB), the Next-Generation Supercomputer Project of MEXT, and EU Grant 269921 (BrainScaleS). All network simulations were carried out using PyNN (http://neuralensemble.org/PyNN) with the NEST back-end (http://www.nest-initiative.org).