Sound-contingent visual motion aftereffect
© Hidaka et al; licensee BioMed Central Ltd. 2011
Received: 24 December 2010
Accepted: 15 May 2011
Published: 15 May 2011
After prolonged exposure to paired presentations of different types of signals (e.g., color and motion), one signal (color) becomes a driver for the other (motion). This phenomenon, known as the contingent motion aftereffect, indicates that new neural representations can be established even in the adult brain. However, contingent motion aftereffects have been reported only in the visual or auditory domain. Here, we demonstrate that a visual motion aftereffect can be contingent on a specific sound.
Dynamic random dots moving alternately rightward or leftward were presented to the participants. Each direction of motion was accompanied by an auditory tone of a unique and specific frequency. After a 3-minute exposure, the tones began to exert a marked influence on visual motion perception: the percentage of dots required to trigger motion perception changed systematically depending on the tones. Furthermore, this effect lasted for at least 2 days.
These results indicate that a new neural representation can be rapidly established between auditory and visual modalities.
New neural representations can be established even in the adult brain: after exposure to repeated alternations of a red contracting spiral and a green expanding spiral, a red stationary spiral appears to expand, while a green stationary spiral appears to contract. This phenomenon is called the contingent motion aftereffect and has been reported only in the visual [1, 2] and auditory domains. However, perceptual events can also involve multiple sensory modalities simultaneously; for instance, visual movements are often accompanied by a corresponding sound in the real world. Perceptual systems integrate such diverse information from different sensory modalities to create a robust percept. It is therefore possible that contingent motion aftereffects also occur across sensory modalities. Here, we demonstrate visual motion aftereffects contingent on arbitrary sounds.
Participants and apparatus
Nine participants, including the authors, took part in the experiments. All had normal or corrected-to-normal vision and normal hearing. Apart from the authors, the participants were naïve to the purpose of the experiments. Informed consent was obtained from each participant before the experiments, and all procedures were approved by the local ethics committee of Tohoku University.
Visual stimuli were presented on a 24-inch CRT display (refresh rate: 60 Hz) with a viewing distance of 1 m. Auditory stimuli were generated digitally (sampling frequency: 44.1 kHz) and delivered via headphones. The synchronization of the visual and auditory stimuli was confirmed using a digital oscilloscope. The participants were instructed to place their heads on a chin rest, and the experiments were conducted in a dark room.
For visual fixation, a red circle (diameter: 0.4 deg; luminance: 17.47 cd/m2) was presented on a black background. A global motion display containing 300 white dots (5.12 cd/m2) served as the visual stimulus and was presented to the right of the fixation circle at an eccentricity of 5 deg. Each dot was 0.25 deg in diameter and was randomly located within an invisible circular window 5 deg in diameter. The target motion signal was presented for 500 ms, and the dot coherence was manipulated: 3.75%, 7.5%, 15%, or 30% of the dots moved either leftward or rightward as the target direction, while the remaining dots moved in random directions excluding the target direction; a 0% coherence condition was also included. The lifetime and velocity of each dot were 12 frames and 2.0 deg/s, respectively. The auditory stimulus (85 dB SPL, 500 ms in duration, 5 ms rise and fall times) was either a high (2 kHz) or low (500 Hz) frequency tone.
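As an illustrative sketch (not the authors' actual stimulus code), the coherence manipulation described above can be expressed as assigning each dot a motion direction on every frame; the function and parameter names here are hypothetical:

```python
import numpy as np

def assign_dot_directions(n_dots=300, coherence=0.15, target_deg=0.0, rng=None):
    """Assign a motion direction (in degrees) to each dot of a global-motion frame.

    A `coherence` fraction of dots moves in the target direction; the remaining
    dots move in random directions excluding the target direction, as described
    in the paper. Dot lifetime (12 frames) and speed (2.0 deg/s) are handled
    elsewhere in a full implementation.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_signal = int(round(n_dots * coherence))
    directions = np.empty(n_dots)
    directions[:n_signal] = target_deg
    # Noise dots: any direction except the target direction.
    noise = rng.uniform(0.0, 360.0, size=n_dots - n_signal)
    noise[np.isclose(noise, target_deg)] += 1.0  # nudge away from the target
    directions[n_signal:] = noise
    rng.shuffle(directions)
    return directions

dirs = assign_dot_directions(n_dots=300, coherence=0.15)
print((dirs == 0.0).sum())  # 45 of the 300 dots carry the 15% coherent signal
```

At 0% coherence every dot moves in a random direction, so any perceived direction reflects the observer's bias rather than the stimulus.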
The experiment consisted of three sessions: pre-test, exposure, and post-test. In the exposure session, the global motion display was presented at 100% coherence for 500 ms. For 5 participants, the onset of the leftward motion was synchronized with a tone burst (500 ms in duration) of the high (2 kHz) frequency (leftward-sound condition), while the rightward motion was synchronized with the low (500 Hz) frequency tone (rightward-sound condition). The opposite pairing was used for the remaining 4 participants. The participants were instructed to maintain fixation. The paired visual and auditory stimuli were presented 360 times, alternating between the two motion directions, so that the exposure lasted 3 minutes.
In the pre- and post-test sessions, discrimination thresholds for motion direction were measured using the method of constant stimuli. In each trial, the coherence of the global motion display was randomly assigned, and the onset of the display was synchronized with a tone burst of the high or low frequency; a no-sound condition was also tested. The participants judged whether the visual stimulus moved leftward or rightward. Each pre- and post-test session consisted of 270 trials: 9 dot coherences × 3 auditory conditions (2 sound frequencies and 1 no-sound condition) × 10 repetitions. The conditions were randomly presented and counterbalanced across trials. Each test session took approximately 10 minutes to complete.
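The counterbalanced trial structure of a test session can be sketched as follows; the condition labels are hypothetical and the signed-coherence coding is our own convention, not taken from the authors' code:

```python
import itertools
import random

# Signed coherence: negative values denote leftward target motion,
# positive values rightward, and 0% is directionally ambiguous.
coherences = [-30.0, -15.0, -7.5, -3.75, 0.0, 3.75, 7.5, 15.0, 30.0]
sounds = ["high_2kHz", "low_500Hz", "no_sound"]  # hypothetical labels

# 9 coherences x 3 auditory conditions x 10 repetitions = 270 trials,
# presented in a random order within the session.
trials = list(itertools.product(coherences, sounds)) * 10
random.shuffle(trials)
print(len(trials))  # 270
```

Each of the 27 coherence-by-sound combinations appears exactly 10 times, which is what "counterbalanced across trials" requires.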
Effects of the exposure
We plotted the proportion of rightward motion perception against dot coherence as psychometric functions. Before the prolonged exposure to the tones and visual motion, the psychometric functions in each condition were almost identical (Figure 1B). To determine subjective motion nulling points (SMNPs), we estimated the 50% point of rightward motion perception by fitting a cumulative normal distribution function to each participant's psychometric function (Figure 1C). This confirmed that, before exposure, the tones did not affect visual motion perception at all. After the exposure, however, the tones markedly affected visual motion perception: in the post-exposure test session, the psychometric function shifted toward rightward visual motion in the leftward-sound condition and toward leftward visual motion in the rightward-sound condition (Figure 1B, C). These patterns show that the tone paired with rightward/leftward visual motion perceptually suppressed the opposite global visual motion and enhanced motion perception consistent with the paired direction. A two-way repeated measures analysis of variance (ANOVA) with test (2; pre/post) × auditory condition (3) as factors showed a significant interaction between the factors (F(2, 16) = 10.25, p < .005). For the significant simple main effect of auditory condition in the post-test (F(2, 32) = 24.18, p < .001), a post hoc test (Tukey's HSD) revealed that the SMNPs differed among the auditory conditions (p < .05). In contrast, the simple main effect of auditory condition in the pre-test was not significant (F(2, 32) = .20, p = .82). These results indicate a robust sound-contingent visual motion aftereffect: sounds can induce visual motion perception for global motion stimuli in the same direction as the exposed stimuli.
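The SMNP estimation can be illustrated with a minimal cumulative-normal fit via a probit transform and ordinary least squares. This is a sketch under our own assumptions (the paper does not detail the fitting algorithm, and the data below are invented for illustration):

```python
from statistics import NormalDist

def fit_smnp(coherences, p_rightward, eps=0.01):
    """Estimate the subjective motion nulling point (SMNP): the signed
    coherence at which P(rightward) = 0.5, from a cumulative-normal fit.

    Proportions are clipped to (eps, 1 - eps), probit-transformed, and fit
    with a straight line; the SMNP is where the line crosses z = 0.
    """
    z = [NormalDist().inv_cdf(min(max(p, eps), 1 - eps)) for p in p_rightward]
    n = len(coherences)
    mx = sum(coherences) / n
    mz = sum(z) / n
    slope = (sum((x - mx) * (y - mz) for x, y in zip(coherences, z))
             / sum((x - mx) ** 2 for x in coherences))
    intercept = mz - slope * mx
    return -intercept / slope

# Invented post-exposure data: the function is shifted so that some leftward
# physical motion is needed to null a sound-induced rightward bias.
coh = [-30, -15, -7.5, -3.75, 0, 3.75, 7.5, 15, 30]
p_r = [0.0, 0.1, 0.2, 0.4, 0.7, 0.8, 0.9, 1.0, 1.0]
print(round(fit_smnp(coh, p_r), 2))  # negative SMNP = rightward perceptual bias
```

A shift of the SMNP between the two sound conditions after exposure, with no shift before, is exactly the interaction the ANOVA tested.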
Long-lasting effect of the exposure
It is well known that contingent aftereffects persist for a long time [1, 3, 5]. To estimate the persistence of the audiovisual association in the sound-contingent visual motion aftereffect, we conducted the post-test session 2 days after the exposure (Figure 1C). Again, the ANOVA showed a significant interaction between the factors (F(2, 16) = 8.18, p < .005). For the simple main effect of auditory condition in the post-test (F(2, 32) = 21.55, p < .001), the post hoc test revealed that the SMNPs differed among the auditory conditions (p < .05). The simple main effect of auditory condition in the pre-test did not reach significance (F(2, 32) = .53, p = .59). These results indicate that the effect of the exposure lasted for at least 2 days.
Discussion
The present study demonstrated a sound-contingent visual motion aftereffect: after short-term exposure to paired sound and visual motion information, an arbitrary sound could induce visual motion perception in the previously paired direction. An arbitrary sound frequency and a visual motion direction thus appear to become associated rather easily through prolonged exposure to these stimuli. We also found that the sound-contingent visual motion aftereffect persists for at least 2 days, indicating that a short-term presentation of paired sound and visual motion information was enough to establish a long-term contingent motion aftereffect. Further, the sound-contingent motion aftereffect did not transfer between the visual fields, implying that sound and visual motion information are associated at relatively early stages of perceptual processing.
Previous studies have reported audiovisual interactions in motion perception. While the effects of visual information on auditory motion perception were reported first [8–10], recent studies have shown both modulatory [11–13] and driving/inducing [14, 15] effects of auditory information on visual motion perception, even for visual global motion displays [6, 7, 16]. Notably, a transient sound can disambiguate ambiguous visual motion perception by capturing the temporal positional information of a moving visual stimulus, and sounds containing motion information can trigger [14–16] or alter [6, 7, 16] visual motion perception. Audiovisual interactions in motion aftereffects have also been reported. For instance, adaptation to visual stimuli moving in depth induced an auditory motion aftereffect in terms of changes in perceived sound intensity. It was also reported that visual motion information modulated the auditory motion aftereffect. Adaptation to auditory motion likewise induced a visual motion aftereffect, although the effect was limited to the vertical plane. It is worth noting that in all of these findings the auditory or visual adapter was itself in motion. In contrast, the sounds used in this study carried no spatiotemporal or motion information: tones containing only arbitrary frequency information could induce visual motion perception after short-term exposure to paired tones and visual motion. On this basis, we regard our findings as demonstrating an audiovisual contingent aftereffect.
A previous study showed that tones could induce visual apparent motion perception of a static blinking visual stimulus after prolonged exposure to alternating left-right visual stimuli paired with high- or low-frequency tones, wherein the onset of each tone was synchronized with that of the visual stimulus. However, it remained unclear whether the tones were associated with visual apparent motion or with the positional information (left or right) of the visual stimuli. In our study, motion perception was derived from the integrated visual motion signal of the dots in the global motion display, not from the positional information of each dot. Therefore, our results clearly demonstrate an audiovisual contingent motion aftereffect: a single tone can be directly associated with directional motion information (leftward or rightward) and can act as a driver for visual motion perception.
It should be noted that some phenomenal aspects of our findings differ from the unimodal contingent motion aftereffects. In our study, the sound-contingent visual motion aftereffect was positive (i.e., a tone associated with leftward motion induces leftward motion perception), whereas the unimodal contingent aftereffects are negative (i.e., a stimulus associated with leftward motion induces rightward motion perception). Studies on audiovisual association learning have also reported a positive effect: after the presentation of paired auditory and visual moving stimuli, auditory motion information improved discrimination performance for visual motion. However, such audiovisual association learning in motion perception was observed only when spatiotemporal or situational consistency was maintained between the stimuli, whereas in the present study the arbitrary sound contained no explicit spatiotemporal or motion information. Moreover, a few minutes' observation of the stimuli without any task could induce a motion aftereffect in our study, while association learning usually requires explicit training in which the participants are engaged in required tasks [21–23]. These points suggest that the findings of the present study cannot be fully explained by association learning.
In the contingent motion aftereffect, new cortical units or representations are established by perceptual learning [2, 3]. In line with a previous study showing positive audiovisual temporal aftereffects, our findings indicate that perceptual systems can rapidly form associations between a single sound and visual motion information, establishing a new neural representation between the auditory and visual modalities with respect to motion perception. The negative aftereffect seen in the unimodal contingent aftereffects suggests that prolonged exposure to paired stimuli (red contraction and green expansion) establishes cross-associations in new neural representations (red expansion and green contraction). In contrast, the positive aftereffect seen in the sound-contingent motion aftereffect indicates that the exposed audiovisual information is bound together straightforwardly to establish new neural representations. It is also noteworthy that the unimodal contingent motion aftereffects require more than 10 minutes of exposure, while a few minutes of exposure suffice to associate sound with visual motion in the sound-contingent motion aftereffect. Future research should address the differences in functional characteristics and underlying mechanisms between audiovisual and unimodal contingent aftereffects.
The current study demonstrated a sound-contingent motion aftereffect: the presentation of paired arbitrary sounds and directional motion information for a few minutes produced an auditory-induced effect on visual motion direction perception. This auditory effect was positive, in that it replicated the previous pairing, and it lasted for at least 2 days. Our findings indicate that perceptual systems can rapidly form a direct association between a sound carrying no explicit spatiotemporal or motion information and visual motion information, and that they can establish a new neural representation between the auditory and visual modalities.
We appreciate the helpful comments and suggestions by two anonymous reviewers. This research was supported by the Ministry of Education, Culture, Sports, Science and Technology, Grant-in-Aid for Specially Promoted Research (No. 19001004).
- Favreau OE, Emerson VF, Corballis MC: Motion perception: a color-contingent aftereffect. Science. 1972, 176: 78-79. 10.1126/science.176.4030.78.
- Mayhew JEW, Anstis SM: Movement aftereffects contingent on color, intensity, and pattern. Percept Psychophys. 1972, 12: 77-85. 10.3758/BF03212847.
- Dong CJ, Swindale NV, Cynader MS: A contingent aftereffect in the auditory system. Nat Neurosci. 1999, 2: 863-865. 10.1038/13161.
- Ernst MO, Bülthoff HH: Merging the senses into a robust percept. Trends Cogn Sci. 2004, 8: 162-169. 10.1016/j.tics.2004.02.002.
- Hepler N: Color: a motion-contingent aftereffect. Science. 1968, 162: 376-377. 10.1126/science.162.3851.376.
- Meyer GF, Wuerger SM: Cross-modal integration of auditory and visual motion signals. Neuroreport. 2001, 12: 2557-2560. 10.1097/00001756-200108080-00053.
- Alais D, Burr D: No direction-specific bimodal facilitation for audiovisual motion detection. Brain Res Cogn Brain Res. 2004, 19: 185-194. 10.1016/j.cogbrainres.2003.11.011.
- Soto-Faraco S, Lyons J, Gazzaniga M, Spence C, Kingstone A: The ventriloquist in motion: Illusory capture of dynamic information across sensory modalities. Cogn Brain Res. 2002, 14: 139-146. 10.1016/S0926-6410(02)00068-X.
- Soto-Faraco S, Spence C, Kingstone A: Multisensory contributions to the perception of motion. Neuropsychologia. 2003, 41: 1847-1862. 10.1016/S0028-3932(03)00185-4.
- Soto-Faraco S, Spence C, Kingstone A: Cross-modal dynamic capture: Congruency effects in the perception of motion across sensory modalities. J Exp Psychol Hum Percept Perform. 2004, 30: 330-345.
- Sekuler R, Sekuler AB, Lau R: Sound alters visual motion perception. Nature. 1997, 385: 308. 10.1038/385308a0.
- Watanabe K, Shimojo S: When sound affects vision: Effects of auditory grouping on visual motion perception. Psychol Sci. 2001, 12: 109-116. 10.1111/1467-9280.00319.
- Freeman E, Driver J: Direction of visual apparent motion driven solely by timing of a static sound. Curr Biol. 2008, 18: 1262-1266. 10.1016/j.cub.2008.07.066.
- Hidaka S, Manaka Y, Teramoto W, Sugita Y, Miyauchi R, Gyoba J, Suzuki Y, Iwaya Y: The alternation of sound location induces visual motion perception of a static object. PLoS One. 2009, 4: e8188. 10.1371/journal.pone.0008188.
- Teramoto W, Manaka Y, Hidaka S, Sugita Y, Miyauchi R, Sakamoto S, Gyoba J, Iwaya Y, Suzuki Y: Visual motion perception induced by sounds in vertical plane. Neurosci Lett. 2010, 479: 221-225. 10.1016/j.neulet.2010.05.065.
- Hidaka S, Teramoto W, Sugita Y, Manaka Y, Sakamoto S, Suzuki Y: Auditory motion information drives visual motion perception. PLoS ONE. 2011, 6: e17499. 10.1371/journal.pone.0017499.
- Kitagawa N, Ichihara S: Hearing visual motion in depth. Nature. 2002, 416: 172-174. 10.1038/416172a.
- Vroomen J, de Gelder B: Visual motion influences the contingent auditory motion aftereffect. Psychol Sci. 2003, 14: 357-361. 10.1111/1467-9280.24431.
- Jain A, Sally SL, Papathomas TV: Audiovisual short-term influences and aftereffects in motion: Examination across three sets of directional pairings. J Vis. 2008, 8 (7): 1-13. 10.1167/8.7.1.
- Teramoto W, Hidaka S, Sugita Y: Sounds move a static visual object. PLoS One. 2010, 5: e12255. 10.1371/journal.pone.0012255.
- Seitz AR, Kim R, Shams L: Sound facilitates visual learning. Curr Biol. 2006, 16: 1422-1427. 10.1016/j.cub.2006.05.048.
- Kim RS, Seitz AR, Shams L: Benefits of stimulus congruency for multisensory facilitation of visual learning. PLoS One. 2008, 3: e1532. 10.1371/journal.pone.0001532.
- Michel MM, Jacobs RA: Parameter learning but not structure learning: a Bayesian network model of constraints on early perceptual learning. J Vis. 2007, 7 (4): 1-18.
- Fujisaki W, Shimojo S, Kashino M, Nishida S: Recalibration of audiovisual simultaneity. Nat Neurosci. 2004, 7: 773-778. 10.1038/nn1268.