Volume 11 Supplement 1

Nineteenth Annual Computational Neuroscience Meeting: CNS*2010

Open Access

Top-down connections as abstract and temporal context for self-organization and categorization

BMC Neuroscience 2010, 11(Suppl 1):P83

DOI: 10.1186/1471-2202-11-S1-P83

Published: 20 July 2010

There are many bottom-up connections from inferior temporal cortex (ITC) to prefrontal cortex (PFC) and many top-down connections from PFC to ITC. Neurons in PFC are hypothesized to perform behaviorally relevant category binding, while neurons in ITC respond to high-level visual features [1]. The bottom-up connections are therefore thought to convey information about detected high-level features to PFC, which binds them together for categorization. However, the computational roles of the top-down connections are not fully understood, particularly in development and learning.

We seek to verify two general ideas about top-down connections via simulation: that they could drive category-specific self-organization, as seen in the fusiform face area (FFA) and parahippocampal place area (PPA), and that they can act as a memory store that biases ITC features [2]. We built networks with three interconnected neuronal layers: a sensory area (layer one), a feature-representation area (layer two: ITC), and a category-behavior area (layer three: PFC). Each layer-two neuron receives excitatory bottom-up, lateral, and top-down inputs, and each layer-three neuron receives bottom-up inputs. Neurons compete with others on the same layer through lateral inhibition. Neurons whose firing is not inhibited learn through a Hebbian algorithm [3], in which the strength of synaptic learning depends on the presynaptic and postsynaptic potentials.
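The layer-two dynamics described above can be sketched roughly as follows. This is a minimal illustration, not the authors' lobe component analysis implementation [3]: the layer sizes, learning rate, and top-k competition strength are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 30 sensory inputs, 16 layer-two (ITC) neurons,
# 26 layer-three (PFC) category neurons; top-K neurons survive inhibition.
N_IN, N_L2, N_L3, K = 30, 16, 26, 3
LEARN_RATE = 0.05

W_bu = rng.random((N_L2, N_IN))   # bottom-up weights (sensory -> ITC)
W_td = rng.random((N_L2, N_L3))   # top-down weights (PFC -> ITC)

def l2_update(x, pfc, W_bu, W_td):
    """One layer-two step: sum excitatory bottom-up and top-down input,
    apply lateral inhibition (only the top-K responses stay active),
    then Hebbian-update the uninhibited neurons' weights toward the
    presynaptic activity, scaled by their postsynaptic response."""
    pre = W_bu @ x + W_td @ pfc           # combined excitatory potential
    winners = np.argsort(pre)[-K:]        # lateral inhibition: top-K survive
    y = np.zeros(N_L2)
    y[winners] = pre[winners]
    for i in winners:                     # Hebbian learning for winners only
        W_bu[i] += LEARN_RATE * y[i] * (x - W_bu[i]) / pre.max()
        W_td[i] += LEARN_RATE * y[i] * (pfc - W_td[i]) / pre.max()
    return y

x = rng.random(N_IN)                      # a sensory input frame
pfc = np.zeros(N_L3)
pfc[4] = 1.0                              # supervised category signal from PFC
y = l2_update(x, pfc, W_bu, W_td)
print(np.count_nonzero(y))                # exactly K neurons remain active
```

Because the supervised PFC signal enters the same excitatory sum as the sensory input, neurons that win for one category are repeatedly biased to win for it again, which is the mechanism proposed here for category-specific self-organization.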

We used visual stimuli (images) of 26 classes of objects that rotate sequentially in depth, presented the objects in random order, and trained in a supervised fashion. We observed two results: developed ITC neurons became more specialized in representing a particular class, and grouped class areas emerged on the ITC neuronal plane. In contrast to slow feature analysis methods [4], our network's development does not depend on slowly changing inputs, but instead on the correlations between the category information from PFC and the true class of the stimulus. This might explain how PPA could develop, since "places" is a very abstract category whose members are experienced at different times. Networks with top-down connections showed an average error-rate reduction of 63% over networks with top-down connections disabled. With a 40 x 40 layer two, the network achieved 95% recognition when operating in a purely feedforward manner. When top-down connections were also used, performance reached nearly 100%, although errors were introduced during the periods when one object transitioned to another (the network successfully recovered). This is consistent with the idea that top-down connections can bias lower-level features based on currently detected categories.
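The biasing effect can be illustrated with a toy recurrence in which each frame's category belief is fed back as a top-down bias on the next frame's feedforward scores. This is a hypothetical sketch, not the network described above; the gain `TD_GAIN` and the additive score model are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
N_CLASSES = 26
TD_GAIN = 0.6  # assumed strength of the top-down bias

def classify_sequence(score_frames, td_gain=TD_GAIN):
    """Classify each frame, feeding the previous frame's normalized
    category belief back as a top-down bias on the next frame's
    feedforward category scores."""
    belief = np.zeros(N_CLASSES)
    labels = []
    for scores in score_frames:
        biased = scores + td_gain * belief   # top-down bias of the evidence
        belief = biased / biased.sum()       # normalized category belief
        labels.append(int(np.argmax(biased)))
    return labels

# Noisy feedforward score frames for a rotating object of class 7
true_class = 7
frames = []
for _ in range(20):
    s = rng.random(N_CLASSES) * 0.8          # per-class noise
    s[true_class] += 1.0                     # signal for the true class
    frames.append(s)

labels = classify_sequence(frames)
print(labels[-1])  # -> 7
```

Note that a belief lingering after an object transition would briefly bias classification toward the previous class before the feedforward evidence overrides it, consistent with the transition errors and subsequent recovery reported above.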

Authors’ Affiliations

Computer Science and Engineering, Michigan State University
Neuroscience Program, Michigan State University


  1. Freedman DJ, Riesenhuber M, Poggio T, Miller EK: A comparison of primate prefrontal and inferior temporal cortices during visual categorization. J Neurosci. 2003, 23(12): 5235-5246.
  2. Tomita H, Ohbayashi M, Nakahara K, Hasegawa I, Miyashita Y: Top-down signal from prefrontal cortex in executive control of memory retrieval. Nature. 1999, 401(6754): 699-703. doi:10.1038/44372.
  3. Weng J, Luciw M: Dually optimal neuronal layers: lobe component analysis. IEEE Trans Autonomous Mental Development. 2009, 1(1): 68-85. doi:10.1109/TAMD.2009.2021698.
  4. Franzius M, Wilbert N, Wiskott L: Invariant object recognition with slow feature analysis. In Proc. Int'l Conf. on Artificial Neural Networks. 2008, 3-6.


© Luciw and Weng; licensee BioMed Central Ltd. 2010

This article is published under license to BioMed Central Ltd.