Volume 13 Supplement 1

Twenty First Annual Computational Neuroscience Meeting: CNS*2012

Open Access

A visual code book: structured probability distributions in natural scenes

Weibing Wan (1) and Zhiyong Yong (1, 2, 3)
BMC Neuroscience 2012, 13(Suppl 1):P9

DOI: 10.1186/1471-2202-13-S1-P9

Published: 16 July 2012

Natural visual scenes consist of objects with various physical properties arranged in three-dimensional (3D) space in a variety of ways. When projected onto the retina, these scenes give rise to highly structured statistics that span the full range of natural variation in the world. To deal efficiently with this range of variation, the visual system may have to generate percepts according to the probability distributions (PDs) of the visual variables underlying any stimulus [1-5]. What, then, are these PDs in natural scenes? In this work, we propose that these PDs are the components of the grand joint PD of the physical world and the images on the retina (GPDWI). Our approach is therefore to decompose GPDWI into a large set of PDs without loss of information. We call this set of PDs a visual code book.
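One way to write this proposal down (in our own illustrative notation, not taken from the abstract) is to let S denote the 3D variables of the physical world, I the retinal image, and C_k the k-th 2D-3D scene structure. Decomposing GPDWI into a code book of component PDs can then be read as

p(S, I) \;=\; \sum_{k} p(C_k)\, p(S, I \mid C_k), \qquad \sum_{k} p(C_k) = 1,

so that each code-book entry is the joint PD p(S, I | C_k) of scene and image variables for one scene structure, and no information is lost provided the structures C_k jointly cover all scene patches.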

To derive this visual code book, we first sampled a large number of scene patches (~2 degrees of visual angle) from a database of high-resolution 3D natural scenes and fitted a concatenation of 8th-order polynomial functions to the 2D (luminance) and 3D (range) data in each patch. Using the fitted polynomial functions, we classified all the natural scene patches into a large set of 2D-3D natural scene structures with distinctive distributions of range and/or luminance. Finally, we developed a PD for each of these 2D-3D natural scene structures. Since any 2D-3D natural scene patch is a combination of samples from these PDs, the PDs form a faithful representation of GPDWI and can be used as a visual code book.
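The following is a minimal sketch, not the authors' code, of how such a patch-classification step could look. It fits a low-order 2D polynomial (order 3 here, instead of the concatenation of 8th-order polynomials used in the abstract) to the luminance and range values of each patch, uses the fitted coefficients as a descriptor, and groups the descriptors into candidate scene structures with k-means clustering, which stands in for the abstract's classification by distinctive range/luminance distributions. The patch size, polynomial order, number of structures, and all function names are assumptions made for illustration.

import numpy as np
from sklearn.cluster import KMeans

def poly_design(px, py, order=3):
    """Design matrix of 2D monomials x**i * y**j with i + j <= order."""
    cols = [px**i * py**j
            for i in range(order + 1)
            for j in range(order + 1 - i)]
    return np.stack(cols, axis=1)

def patch_descriptor(luminance, range_map, order=3):
    """Concatenate least-squares polynomial coefficients fitted to the
    2D (luminance) and 3D (range) data of one patch."""
    h, w = luminance.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = poly_design(xs.ravel() / w, ys.ravel() / h, order)
    c_lum, *_ = np.linalg.lstsq(A, luminance.ravel(), rcond=None)
    c_rng, *_ = np.linalg.lstsq(A, range_map.ravel(), rcond=None)
    return np.concatenate([c_lum, c_rng])

def build_code_book(patches, n_structures=64):
    """Group patch descriptors into candidate 2D-3D scene structures;
    each group would then get its own empirical PD of range and luminance."""
    X = np.array([patch_descriptor(lum, rng_map) for lum, rng_map in patches])
    return KMeans(n_clusters=n_structures, n_init=10, random_state=0).fit(X)

if __name__ == "__main__":
    gen = np.random.default_rng(0)
    # 200 synthetic 16x16 luminance/range patch pairs as stand-ins for
    # patches sampled from a 3D natural scene database.
    toy = [(gen.random((16, 16)), gen.random((16, 16))) for _ in range(200)]
    km = build_code_book(toy, n_structures=8)
    print("code-book descriptor matrix:", km.cluster_centers_.shape)

In this sketch a code-book entry is just a cluster of descriptors; the code book described in the abstract attaches a full PD of range and luminance to each structure rather than a single cluster centre.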

We used these PDs to estimate 3D scenes from 2D images and to categorize 2D-3D natural scenes. Our results showed that accurate 3D vision from a single monocular view is achievable in many situations and that near-human performance can be reached in categorizing 2D-3D natural scenes. We thus conclude that the visual code book obtained here faithfully captures the extraordinarily complex statistics of 2D-3D natural scenes and supports a range of natural vision tasks.
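As a further hypothetical sketch under the same assumptions as above, 3D-from-2D estimation with such a code book could proceed by matching the luminance part of a patch descriptor to its nearest code-book entry and reading out that entry's stored range statistics. The entry format, the nearest-neighbour matching rule, and the use of a single mean range map (rather than the full PDs described in the abstract) are all simplifications for illustration, not the authors' algorithm.

import numpy as np

def estimate_range(lum_coeffs, code_book):
    """code_book: list of (lum_coeffs_k, mean_range_map_k), one per structure.
    Returns the range estimate and the index of the best-matching structure."""
    dists = [np.linalg.norm(lum_coeffs - lum_k) for lum_k, _ in code_book]
    k = int(np.argmin(dists))
    return code_book[k][1], k

if __name__ == "__main__":
    gen = np.random.default_rng(1)
    # Toy code book: 8 structures, each with 10 luminance coefficients and
    # a 16x16 mean range map (synthetic stand-ins for real statistics).
    book = [(gen.random(10), gen.random((16, 16))) for _ in range(8)]
    query = book[3][0] + 0.01 * gen.standard_normal(10)  # patch near entry 3
    range_map, k = estimate_range(query, book)
    print("matched structure:", k, "range map shape:", range_map.shape)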



Acknowledgements

This material is based upon work supported by, or in part by, the U.S. Army Research Laboratory and the U.S. Army Research Office under contract/grant number W911NF-10-1-0303. This work was also supported by a VDI/GHSU pilot award and the Knights Templar Education Foundation.

Authors’ Affiliations

1. Brain and Behavior Discovery Institute, Georgia Health Sciences University
2. Department of Ophthalmology, Georgia Health Sciences University
3. Vision Discovery Institute, Georgia Health Sciences University


References

  1. Doya K, Ishii S, Pouget A, Rao RPN: Bayesian Brain: Probabilistic Approaches to Neural Coding. 2007, Cambridge: MIT Press.
  2. Geisler WS: Visual perception and the statistical properties of natural scenes. Annu Rev Psychol 2008, 59:167-192. doi:10.1146/annurev.psych.58.110405.085632.
  3. Purves D: Visual perception. In Handbook of Neuroscience for the Behavioral Sciences. Edited by: Berntson GG, Cacioppo JT. 2010, New York: John Wiley and Sons, 1:224-250.
  4. Mumford D, Desolneux A: Pattern Theory: The Stochastic Analysis of Real-World Signals. 2010, Natick: A K Peters, Ltd.
  5. Trommershäuser J, Körding K, Landy MS: Sensory Cue Integration. 2011, New York: Oxford University Press.


© Wan and Yong; licensee BioMed Central Ltd. 2012

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.