  • Poster presentation
  • Open access

Reservoir computing methods for functional identification of biological networks

The complexity of biological neural networks (BNN) necessitates automated methods for investigating their stimulus-response and structure-dynamics relations. In the present work, we aim to build a network that is functionally equivalent to a reference BNN. The response signal of the BNN to various input streams is regarded as a characterization of its function. We therefore train an artificial system that imitates the input-output relation of the reference BNN over the applied stimulus range; in other words, we take a system identification approach to biological neural networks. Generic network models with fixed random connectivity, recurrent dynamics, and fading memory, so-called reservoirs, have been shown to possess a strong separation property on various input streams. Equipped with additional simple readout units, such systems have been successfully applied to several nonlinear modeling and engineering tasks [1].
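The reservoir principle described above (fixed random recurrent weights, only a simple readout trained) can be sketched as follows. All dimensions, the spectral-radius scaling, the toy target signal, and the ridge-regression readout are illustrative assumptions, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
n_in, n_res, T = 1, 100, 500

# Fixed random input and recurrent weights; only the readout is trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
# Scale the spectral radius below 1 to obtain fading memory.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

u = rng.uniform(-1, 1, (T, n_in))          # input stream
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):                         # drive the reservoir with the input
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Toy target: a smoothed nonlinear function of the input history.
y = np.convolve(u[:, 0] ** 2, np.ones(5) / 5, mode="same")

# Linear readout trained by ridge regression on the collected states.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res), states.T @ y)
y_hat = states @ W_out
```

Because the recurrent weights stay fixed, training reduces to a single linear regression on the recorded reservoir states.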

Here we take a reservoir computing approach to the functional identification of simulated random BNNs and neuronal cell cultures [2]. More specifically, we utilize an Echo State Network (ESN) of leaky integrator (non-spiking) neurons with sigmoid activation functions to identify a BNN. We propose algorithms that adapt the ESN parameters to model the relation between continuous input streams and multi-unit recordings in BNNs. Our findings indicate that trained ESNs can imitate the response signal of a reference biological network on several tasks. For instance, we trained an ESN to estimate the instantaneous firing rate (conditional intensity) of a randomly selected neuron in a simulated BNN. Receiver Operating Characteristic (ROC) curve analysis showed that the ESN's estimate of the conditional intensity predicts the spiking of this neuron (see Figure 1).
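A leaky-integrator ESN state update of the kind mentioned above can be sketched as below. The exact update form, the leak rate, and all dimensions are assumptions for illustration, not the authors' published equations:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def esn_step(x, u, W, W_in, leak):
    """One update of a leaky-integrator ESN state.

    x: reservoir state, u: input sample, leak: leak rate in (0, 1].
    The state is a convex combination of its previous value and the
    sigmoid drive, so it remains bounded in [0, 1].
    """
    return (1.0 - leak) * x + leak * sigmoid(W @ x + W_in @ u)

rng = np.random.default_rng(1)
n_res, n_in = 50, 2
W = 0.1 * rng.normal(0.0, 1.0, (n_res, n_res))   # fixed recurrent weights
W_in = rng.uniform(-1, 1, (n_res, n_in))         # fixed input weights
x = np.zeros(n_res)
for u in rng.uniform(-1, 1, (100, n_in)):        # run on a random input stream
    x = esn_step(x, u, W, W_in, leak=0.3)
```

The leak rate sets the time scale of the reservoir: small values make the state change slowly, which helps match the slower dynamics of recorded population activity.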

Figure 1

Estimated conditional intensity for a selected neuron in a biological neural network. Conditional intensity estimates, λ, for all time steps in the testing period, shown in decreasing order; a bar marks each time step in which a spike was actually observed (top). Distributions of the conditional intensity for time steps with and without observed spikes (middle). Varying a threshold on λ yields pairs of true positive rate vs. false positive rate (bottom).
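The threshold sweep described in the caption can be sketched as follows. The function name and the toy λ and spike values are illustrative assumptions, not the authors' data (ties in λ are not handled):

```python
import numpy as np

def roc_points(lam, spikes):
    """TPR/FPR pairs from sweeping a threshold over the estimated
    conditional intensity lam (one value per time step).

    spikes is a 0/1 array marking the time steps with observed spikes.
    Sorting by decreasing lam and taking cumulative sums evaluates
    every possible threshold in one pass.
    """
    order = np.argsort(-np.asarray(lam))   # decreasing lambda, as in Figure 1
    s = np.asarray(spikes)[order]
    tp = np.cumsum(s)                      # spike steps above each threshold
    fp = np.cumsum(1 - s)                  # non-spike steps above each threshold
    tpr = tp / s.sum()
    fpr = fp / (len(s) - s.sum())
    return fpr, tpr

# Toy example: five time steps, three of which contain a spike.
lam = np.array([0.9, 0.8, 0.4, 0.3, 0.1])
spk = np.array([1, 1, 0, 1, 0])
fpr, tpr = roc_points(lam, spk)
```

A curve that rises toward the upper-left corner (high TPR at low FPR) indicates that large λ values coincide with observed spikes, i.e., the estimate is informative.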


  1. Jaeger H: The "echo state" approach to analysing and training recurrent neural networks. GMD Report 148. 2001, GMD – German National Research Institute for Computer Science


  2. Marom S, Shahaf G: Development, learning and memory in large random networks of cortical neurons: Lessons beyond anatomy. Q Rev Biophys. 2002, 35: 63-87.




Acknowledgements

This work was supported by the German BMBF (BCCN Freiburg, 01GQ0420) and the European Community (NEURO, no. 12788).

Author information



Corresponding author

Correspondence to Tayfun Gürelu.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution 2.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Gürelu, T., Rotter, S. & Egert, U. Reservoir computing methods for functional identification of biological networks. BMC Neurosci 10 (Suppl 1), P293 (2009).

