  • Poster presentation
  • Open access

Neural network model for sequence learning based on phase precession

Introduction

We have developed a network structure, inspired by experimental data on the neural correlates of navigation in the hippocampus [1], that is able to learn and recognize multiple overlapping sequences of symbols. The network relies on an analog of phase precession, the phenomenon in which place cells fire at progressively earlier phases with respect to a local field potential oscillation as the animal moves further into the place field. Following [1], we suppose phase precession serves to compress temporal sequences, ensuring both the appropriate timescale for synaptic plasticity and robustness to variation in symbol presentation rate. One model proposes that phase precession results from the interaction of rhythmic inhibition with excitatory input [2]. We investigate how different assumptions about the nature of the excitatory input and variation in other model parameters affect the robustness of phase precession and sequence learning.

Model

The network consists of pools of leaky integrate-and-fire neurons, one pool per symbol. All pools receive identical periodic inhibition. When a symbol occurs, it is represented by excitatory input to the corresponding pool; this input is initially weak and then ramps up in strength before ceasing. The periodic inhibition interacts with the ramping input to produce phase precession and sequence compression: during each cycle of the inhibition, neurons corresponding to recent symbols fire in the correct order.
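The abstract gives no equations for this mechanism, so the following is only an illustrative sketch, not the authors' implementation: a single leaky integrate-and-fire neuron (standing in for one pool) driven by sinusoidal inhibition at a theta-like 8 Hz plus a linearly ramping excitatory input, in the spirit of the oscillatory-interference account of [2]. All parameter values (membrane time constant, ramp slope, inhibition amplitude) are invented for illustration.

```python
import numpy as np

def simulate_precession(t_stop=1.0, dt=1e-4, tau=0.02, f_theta=8.0,
                        inh_peak=2.0, exc_start=0.5, exc_slope=2.0,
                        v_thresh=1.0, v_reset=0.0):
    """Single LIF neuron driven by theta-frequency inhibition plus a
    linearly ramping excitatory input.

    Returns the phase (radians; 0 = inhibition peak) of the first spike
    in each theta cycle that contains a spike.
    """
    v = 0.0
    first_phase = {}  # theta-cycle index -> phase of first spike
    for i in range(int(t_stop / dt)):
        t = i * dt
        inh = inh_peak * 0.5 * (1.0 + np.cos(2 * np.pi * f_theta * t))
        exc = exc_start + exc_slope * t
        v += dt / tau * (-v + exc - inh)   # leaky integration of net drive
        if v >= v_thresh:
            v = v_reset
            first_phase.setdefault(int(t * f_theta),
                                   (2 * np.pi * f_theta * t) % (2 * np.pi))
    return [first_phase[c] for c in sorted(first_phase)]

phases = simulate_precession()
# as the excitatory ramp grows, the first spike of each theta cycle
# drifts to progressively earlier phases of the inhibitory oscillation
```

In this toy version the net drive first exceeds threshold only near the trough of the inhibition; as the ramp grows, threshold is crossed earlier and earlier in each cycle, which is the precession the model exploits.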

Sequence learning occurs because synapses between pools corresponding to consecutive symbols are strengthened by spike-timing-dependent plasticity. Competitive heterosynaptic plasticity causes neurons to specialize to particular sequences: neurons that receive strengthened connections from the previously active pool fire with reduced latency, and recurrent inhibition then prevents the other neurons of the pool from firing at all.

Results

The rate, regularity and synaptic time courses of excitatory inputs are shown to affect the reliability and precision of phase precession. The ability of the network to learn and store multiple sequences is demonstrated and its sensitivity to variability in phase precession is characterized; in particular, the ability to learn infrequent sequence branches is shown to depend strongly on phase variability. Robustness of network behavior with respect to variation in other parameters is also described.

References

  1. Skaggs WE, McNaughton BL, Wilson MA, Barnes CA: Theta phase precession in hippocampal neuronal populations and the compression of temporal sequences. Hippocampus. 1996, 6: 149-172. 10.1002/(SICI)1098-1063(1996)6:2<149::AID-HIPO6>3.0.CO;2-K.


  2. Mehta MR, Lee AK, Wilson MA: Role of experience and oscillations in transforming a rate code into a temporal code. Nature. 2002, 417: 741-746. 10.1038/nature00807.



Author information


Correspondence to Sean Byrnes.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution 2.0 License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Byrnes, S., Burkitt, A.N., Meffin, H. et al. Neural network model for sequence learning based on phase precession. BMC Neurosci 10 (Suppl 1), P259 (2009). https://doi.org/10.1186/1471-2202-10-S1-P259
