  • Poster presentation
  • Open access

Motion-based predictive coding is sufficient to solve the aperture problem

It is still unclear how information collected locally by low-level sensory neurons gives rise to a coherent global percept. This is well demonstrated by the aperture problem, in both the visual and haptic senses. Experimental findings on its biological solution in area MT show that local motion measures are integrated, revealing the dynamical emergence of global motion information [1]. We develop a theory of spatio-temporal integration defined as implementing motion-based predictive coding, which takes the form of an anisotropic, context-dependent diffusion of local information [2]. Here, we test this functional model on the aperture problem in low-level visual and haptic sensory areas.
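As a minimal illustration of the aperture problem itself (not the authors' code): a local detector viewing a translating edge through an aperture can only measure the velocity component normal to the edge, so infinitely many true velocities are consistent with a single local measurement.

```python
import numpy as np

# A vertical edge translating with true velocity (1.0, 0.5).
true_v = np.array([1.0, 0.5])
n = np.array([1.0, 0.0])           # unit normal of the edge

# A local motion sensor only recovers the normal component v . n.
measured = true_v @ n              # the tangential component 0.5 is lost

# Every velocity of the form measured*n + t*tangent is consistent
# with this one measurement -- the aperture ambiguity.
tangent = np.array([-n[1], n[0]])
candidates = [measured * n + t * tangent for t in (-1.0, 0.0, 2.0)]
assert all(np.isclose(v @ n, measured) for v in candidates)
```

Resolving this ambiguity requires integrating measurements across space and time, which is what the model below is designed to do.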

In our model, spatial and motion information is represented in a probabilistic framework. Information is pooled using a Markov chain formulation that merges the current information with the measurement likelihood through a prior on motion transitions. This prior is adapted to the smooth trajectories observed in natural environments, so the dynamical system favors temporally coherent features. Unlike neural approximations [3], we use a particle filtering method to implement this functional model. This generalizes the Kalman filtering approaches used previously by allowing non-Gaussian and multimodal distributions to be represented.
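A minimal particle-filter sketch of the kind of estimator described above (illustrative only, not the authors' implementation; the smooth-trajectory prior is modeled here as a small Gaussian diffusion on velocity, and the aperture-limited likelihood constrains only the normal velocity component):

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(obs_normal, n, n_particles=2000,
                    sigma_v=0.02, sigma_obs=0.1):
    """Track a 2D velocity from aperture-limited measurements.

    obs_normal : sequence of observed normal-component speeds
    n          : unit normal of the local edge
    """
    # Initialize velocity particles from a broad prior.
    v = rng.normal(0.0, 1.0, size=(n_particles, 2))
    for z in obs_normal:
        # Prediction: diffuse velocities under the smoothness prior.
        v = v + rng.normal(0.0, sigma_v, size=v.shape)
        # Likelihood: Gaussian on the measured normal component only.
        w = np.exp(-0.5 * ((v @ n - z) / sigma_obs) ** 2)
        w /= w.sum()
        # Resample to avoid weight degeneracy.
        v = v[rng.choice(n_particles, size=n_particles, p=w)]
    return v.mean(axis=0)

# A vertical edge (normal (1, 0)): only v_x = 1.0 is ever observed.
n = np.array([1.0, 0.0])
v_hat = particle_filter(np.full(50, 1.0), n)
# The normal component converges to the measurement; the tangential
# component stays unconstrained by a single aperture.
```

Because the particles carry a full sample-based posterior, this scheme can hold multimodal velocity beliefs that a single Kalman filter cannot.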

We observe the emergence of mechanisms that reflect observations made at the psychophysical and behavioral levels. First, the dynamical system exhibits the emergence of the solution to the aperture problem and shows a dependence on line length [4]. Then, when presented with an object undergoing a regular translation, the dynamical system captures its motion independently of its shape and exhibits motion extrapolation. This shows that prediction is sufficient for the dynamical build-up of information from a local to a global scale. More generally, it may give insight into the role of spatio-temporal integration in neural dynamics and in the emergence of properties attributed to low-level sensory computations.


  1. Smith MA, Majaj N, Movshon JA: Dynamics of pattern motion computation. Dynamics of Visual Motion Processing: Neuronal, Behavioral and Computational Approaches. Edited by: G. S. Masson and U. J. Ilg. 2010, Springer, 55-72.


  2. Watamaniuk S, McKee S, Grzywacz N: Detecting a trajectory embedded in random-direction motion noise. Vision research. 1995, 35 (1): 65-77. 10.1016/0042-6989(94)E0047-O.


  3. Burgi PY, Yuille AL, Grzywacz N: Probabilistic motion estimation based on temporal coherence. Neural Computation. 2000, 12 (8): 1839-1867.


  4. Castet E, Lorenceau J, Bonnet C: The inverse intensity effect is not lost with stimuli in apparent motion. Vision Research. 1993, 33 (12): 1697-1708. 10.1016/0042-6989(93)90035-U.



Author information



Corresponding author

Correspondence to Mina A Khoei.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Khoei, M.A., Perrinet, L.U. & Masson, G.S. Motion-based predictive coding is sufficient to solve the aperture problem. BMC Neurosci 12 (Suppl 1), P279 (2011).

