- Poster presentation
- Open Access
High-performance classification of contour percepts from EEG recordings
BMC Neuroscience volume 12, Article number: P94 (2011)
Contour integration is a fundamental process for visual scene segmentation and object recognition. Consequently, human observers are very efficient at detecting configurations of aligned edge elements embedded in a background of randomly oriented distractors. Neural signatures of contour integration have been found in electrophysiological recordings from the early visual areas of primates, and in EEG signals from occipital areas in human subjects. However, the corresponding differences between signals evoked by stimuli with and without contours are typically small, and only emerge after extensive averaging over trials. In this contribution, we investigate neural signatures of contour integration in EEG recordings by classifying the presence or absence of contours on a trial-by-trial basis from the recorded data. Stimuli consisted of fields of oriented Gabor elements positioned randomly on the screen. Half of the stimuli contained an elliptic contour formed by 13 collinearly aligned edge elements. In a two-alternative forced-choice task, 20 observers indicated the presence or absence of a contour by pressing a corresponding response button. To our surprise, classification performance on the EEG data reached up to 78% in single observers, averaging about 64% across our 20 observers. Given that all stimuli contain the same number (~350) of Gabor elements and differ only in the alignment of a small subset of 13 edges, differences in the EEG and in its classification reflect differences between perceptual states rather than differences between physical stimuli. In the context of constructing EEG-based brain-computer interfaces, these perceptual differences may serve as an additional channel of information for paradigms such as SSVEPs, which otherwise use different visual stimuli to evoke maximally distinct brain activity patterns.
Figure 1.
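The abstract does not specify the classifier used for the trial-by-trial decoding. As a minimal illustrative sketch (not the authors' method), the scheme can be mimicked with a cross-validated nearest-centroid decoder on simulated single-trial EEG feature vectors; the feature dimensionality, noise levels, and the "contour" signal pattern below are all invented for the example.

```python
# Hypothetical sketch of trial-by-trial decoding of contour vs. no-contour
# trials from EEG feature vectors. All parameters and the simulated data
# are assumptions for illustration, not taken from the study.
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_features = 200, 32           # e.g. channel x time-window features
signal = rng.normal(0, 1, n_features)    # assumed contour-related EEG pattern

# Simulated trials: half contain the "contour" pattern plus noise.
X = rng.normal(0, 3, (n_trials, n_features))
y = np.repeat([0, 1], n_trials // 2)     # 0 = no contour, 1 = contour
X[y == 1] += signal

def nearest_centroid_accuracy(X, y, n_folds=5):
    """Cross-validated accuracy of a nearest-centroid classifier."""
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, n_folds)
    correct = 0
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        c0 = X[train][y[train] == 0].mean(axis=0)   # class centroids
        c1 = X[train][y[train] == 1].mean(axis=0)
        d0 = np.linalg.norm(X[fold] - c0, axis=1)   # distances to centroids
        d1 = np.linalg.norm(X[fold] - c1, axis=1)
        pred = (d1 < d0).astype(int)                # pick nearer centroid
        correct += (pred == y[fold]).sum()
    return correct / len(y)

acc = nearest_centroid_accuracy(X, y)
print(f"cross-validated accuracy: {acc:.2f}")
```

In this simulated setting the decoder performs well above the 50% chance level, illustrating how even a simple linear decision rule can expose small but systematic single-trial differences of the kind the study reports.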
This work has been supported by the BMBF (Bernstein Group for Computational Neuroscience, Grant 1GQ0705, and ‘Innovationswettbewerb Medizintechnik’ 01 EZ 0867).