
Analysis of coupled decision-making modules for multisensory integration

We were interested in how two coupled decision-making modules behave. This question arises, for example, in multisensory integration, in which auditory and visual percepts have to be integrated into one common percept. We used a biophysically realistic neural model consisting of integrate-and-fire neurons with detailed synaptic channels, and we studied the influence of the strength of the cross-connection between the two decision-making modules on the performance of the model. Performance was assessed by how often the system correctly extracts the stimulus even when only weak input is applied. We found that the optimal performance of the coupled modules is achieved at a certain cross-connection strength that is independent of the amplitude of the stimulus input. This means that once an optimal cross-connection has been learned, it can be used for all types of stimulus input to achieve optimal performance. We also present the mechanism responsible for this improvement: inconsistent constellations in the two modules converge to the correct response. We could also simulate the law of inverse effectiveness in our model: we related the strength of the input bias to the multisensory integration index and found an inverse correlation, as observed in experimental data [1]. We further investigated a three-module architecture, in which two primary sensory areas, such as the auditory and the visual one, are connected by a third, integrating module. This third module could correspond to higher processing areas such as the STS, which mediates between the two primary sensory areas. We show that the dynamics are similar to those of the two-module case, although the necessary coupling strength between the modules is increased, since the coupling is more indirect than in the two-module case. We conclude that decision-making modules can be coupled to increase performance compared to a single decision-making module.
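The setup described above can be illustrated with a simplified rate-model sketch (the original study uses spiking integrate-and-fire neurons with detailed synaptic channels, so everything below — the population-rate dynamics, the clipped gain function, and all parameter values such as `w_self`, `w_inh`, and `tau` — is an illustrative assumption, not the authors' model). Each module contains two competing populations (choice A vs. B) with self-excitation and within-module inhibition, and `w_cross` couples the corresponding populations of the two modules:

```python
import numpy as np

def simulate_trial(w_cross, stim=0.05, noise=0.02, steps=2000, dt=0.5, seed=0):
    """One trial of two coupled two-choice decision modules (rate sketch).

    r[m, p] is the rate of population p (0 = A, 1 = B) in module m.
    `w_cross` links the corresponding populations across the two modules.
    All parameters are illustrative placeholders, not the biophysical
    values of the original integrate-and-fire model.
    """
    rng = np.random.default_rng(seed)
    r = np.full((2, 2), 0.1)                  # initial rates
    w_self, w_inh, tau = 1.6, 1.2, 20.0       # assumed coupling constants
    inp = np.array([stim, 0.0])               # weak input biased toward A
    for _ in range(steps):
        drive = (w_self * r                   # recurrent self-excitation
                 - w_inh * r[:, ::-1]         # within-module competition
                 + w_cross * r[::-1, :]       # cross-module coupling
                 + inp
                 + noise * rng.standard_normal((2, 2)))
        # Euler step with a saturating (clipped) gain function
        r += dt / tau * (-r + np.clip(drive, 0.0, 1.0))
    return bool((r[:, 0] > r[:, 1]).all())    # both modules chose A?

def performance(w_cross, trials=100, **kw):
    """Fraction of trials in which both modules report the stimulus,
    mirroring how performance is assessed in the abstract."""
    return float(np.mean([simulate_trial(w_cross, seed=s, **kw)
                          for s in range(trials)]))
```

Sweeping `performance` over a range of `w_cross` values in such a sketch is the analogue of the study's search for the optimal cross-connection strength; the abstract's finding is that this optimum does not shift with stimulus amplitude.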

References

  1. Ghazanfar AA, Maier JX, Hoffman KL, Logothetis NK: Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex. J Neurosci. 2005, 25 (20): 5004-5012. 10.1523/JNEUROSCI.0799-05.2005.



Acknowledgements

The study has been supported by the Volkswagen Foundation.

Author information

Correspondence to Marco Loh.

Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Loh, M., Andrzejak, R.G. & Deco, G. Analysis of coupled decision-making modules for multisensory integration. BMC Neurosci 8 (Suppl 2), P19 (2007). https://doi.org/10.1186/1471-2202-8-S2-P19
