Linking neurons to behavior in multisensory perception: A computational review
Optimal cue integration
When a common source is assumed, a systematic strategy to quantify cue combination is to introduce a small discrepancy (also called conflict, disparity, or incongruency) between the cues. The conflict must be small so as not to violate the common-source assumption. In such a paradigm, the percept (estimate of the stimulus) inferred from both cues presented together will lie somewhere in between the percepts inferred from each cue individually. The intuition is that a higher weight will be given to the more reliable cue, so the combined estimate is pulled toward the cue with the smaller variance while being more precise than either single-cue estimate.
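The reliability-weighted (maximum-likelihood) combination rule described above can be sketched in a few lines. The stimulus values and noise levels below are illustrative choices, not taken from any experiment in the review:

```python
import numpy as np

def combine_cues(s1, sigma1, s2, sigma2):
    """Combine two conflicting cue estimates with inverse-variance weights."""
    w1 = (1 / sigma1**2) / (1 / sigma1**2 + 1 / sigma2**2)
    w2 = 1.0 - w1
    s_hat = w1 * s1 + w2 * s2                      # weighted average of the cues
    sigma_comb = np.sqrt((sigma1**2 * sigma2**2) /
                         (sigma1**2 + sigma2**2))  # smaller than either sigma
    return s_hat, sigma_comb

# Small conflict: a reliable visual cue at 0 deg, a noisier auditory cue at 4 deg.
s_hat, sigma_comb = combine_cues(0.0, 1.0, 4.0, 3.0)
# The combined estimate (0.4 deg) lies between the cues, much closer to the
# reliable visual cue, and its uncertainty is below the best single-cue sigma.
```

With these illustrative sigmas the visual cue gets weight 0.9, which is the "higher weight to the more reliable cue" intuition made quantitative.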
Optimal cue integration with neural populations
When studying how neuronal circuits implement near-optimal cue integration, an important fact to take into account is that the responses of cortical neurons are typically very variable (Compte et al., 2003, Dean, 1981, Holt et al., 1996, Tolhurst et al., 1982). Presenting the same stimulus repeatedly will give rise to many different population responses. At first sight, such variability is a nuisance that could compromise optimality. Recent work, however, has argued that the presence of variability of the right, Poisson-like form in fact simplifies matters: with such noise, optimal cue integration can be implemented by simple operations on population activity.
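A minimal simulation, with assumed Gaussian tuning curves and illustrative parameters, shows both the trial-to-trial Poisson variability and the fact that, for independent Poisson noise, the log likelihood (and hence a maximum-likelihood readout) is linear in the spike counts:

```python
import numpy as np

rng = np.random.default_rng(0)
prefs = np.linspace(-40.0, 40.0, 41)        # preferred stimuli of 41 neurons

def tuning(s, gain=20.0, width=10.0):
    """Assumed mean spike count of each neuron for stimulus s."""
    return gain * np.exp(-0.5 * ((s - prefs) / width) ** 2)

s_true = 5.0
# Repeating the same stimulus yields a different population response each time:
trials = rng.poisson(tuning(s_true), size=(200, prefs.size))

# For independent Poisson noise, log p(r|s) = sum_i r_i log f_i(s) - sum_i f_i(s)
# up to an s-independent term, i.e. linear in the spike counts r.
s_grid = np.linspace(-20.0, 20.0, 401)
F = np.stack([tuning(s) for s in s_grid])   # (401 stimuli, 41 neurons)
loglik = trials @ np.log(F).T - F.sum(axis=1)
s_ml = s_grid[np.argmax(loglik, axis=1)]    # ML estimate on each trial
```

Despite the variability, the distribution of per-trial ML estimates is centered on the true stimulus; the noise limits precision but does not bias the readout.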
Multisensory integration in relation to other probabilistic computations
The appeal of probabilistic population codes, in which a population pattern of activity encodes the certainty about a stimulus, is that they are not limited to multisensory perception. Ecologically important tasks often require combining pieces of uncertain sensory information with each other or with prior information. In multisensory perception, cues from different modalities get combined. Examples in other domains include perceptual decision-making (accumulating uncertain information over time) and the combination of sensory measurements with prior expectations.
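The same precision-weighted arithmetic extends beyond cue combination. Here is a minimal sketch of one such computation, combining a noisy Gaussian sensory likelihood with a Gaussian prior; all numbers are illustrative:

```python
import numpy as np

def posterior(mu_like, sigma_like, mu_prior, sigma_prior):
    """Gaussian likelihood x Gaussian prior: precisions add, means are
    precision-weighted, exactly as in two-cue integration."""
    J_like, J_prior = 1 / sigma_like**2, 1 / sigma_prior**2   # precisions
    J_post = J_like + J_prior
    mu_post = (J_like * mu_like + J_prior * mu_prior) / J_post
    return mu_post, np.sqrt(1 / J_post)

# A prior centred on 0 pulls a noisy observation at 10 halfway toward 0
# when the two carry equal precision.
mu_post, sigma_post = posterior(10.0, 2.0, 0.0, 2.0)
```

Formally the prior plays the role of a second "cue", which is why the same neural machinery proposed for cue integration is a candidate for these computations as well.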
Comparison with physiology
Using this theoretical framework, it is now possible to link optimal behavior to neural population activity. Imagine recording with a multi-electrode array from a population of multisensory neurons in an awake, behaving animal engaged in optimal cue combination (as tested behaviorally). Then the theory predicts that the response of multisensory neurons when two cues are presented is equal to the sum of their responses when each cue is presented separately (this follows from the fact that, for Poisson-like variability, multiplying the likelihoods encoded by the two single-cue responses amounts to adding the corresponding population activities).
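A toy version of this additivity prediction, with assumed tuning curves and gains: summing two unimodal Poisson population responses yields a pattern of the same Poisson form with higher total gain, which is what multiplying the two single-cue likelihoods requires:

```python
import numpy as np

rng = np.random.default_rng(1)
prefs = np.linspace(-40.0, 40.0, 41)        # shared preferred stimuli

def f(s, gain):
    """Assumed Gaussian tuning; gain plays the role of cue reliability."""
    return gain * np.exp(-0.5 * ((s - prefs) / 10.0) ** 2)

r_vis = rng.poisson(f(0.0, 15.0))   # visual cue alone (high gain, reliable)
r_aud = rng.poisson(f(4.0, 5.0))    # auditory cue alone (low gain, unreliable)
r_multi = r_vis + r_aud             # predicted bimodal response: the plain sum

# Sums of independent Poisson counts are again Poisson, so r_multi has the
# same form as a single-cue response with higher gain; its peak sits between
# the single-cue estimates, pulled toward the high-gain visual cue.
```

This is the experimentally testable link: a recorded bimodal response that deviates systematically from the sum of the unimodal responses would argue against this coding scheme.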
Cue combination without forced integration
The second question concerns the number of sources, or multiplicity, of multisensory cues. When an auditory and a visual stimulus are observed, they could have either the same source or different sources. In cue conflict experiments, the disparity between the cues is usually kept small, so that the observer has no difficulty imagining that they originate from the same source (forced integration). However, in natural circumstances, large disparities in space, time, or feature space occur frequently, and the observer must then also infer whether the cues share a common source before deciding how strongly to integrate them.
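A sketch of inference over the number of sources, in the spirit of causal-inference mixture models such as Kording et al. (2007); the zero-centred Gaussian prior over source location, the noise levels, and the 50/50 prior on a common cause are all illustrative assumptions:

```python
import numpy as np

def p_common(x1, x2, sigma1, sigma2, sigma_p, prior_c=0.5):
    """Posterior probability that two noisy measurements share one source."""
    v1, v2, vp = sigma1**2, sigma2**2, sigma_p**2
    # Likelihood of the pair under one source s ~ N(0, vp), s integrated out:
    var_c = v1 * v2 + v1 * vp + v2 * vp
    like_c = (np.exp(-0.5 * ((x1 - x2)**2 * vp + x1**2 * v2 + x2**2 * v1) / var_c)
              / (2 * np.pi * np.sqrt(var_c)))
    # Likelihood under two independent sources, each drawn from the prior:
    like_s = (np.exp(-0.5 * x1**2 / (v1 + vp)) / np.sqrt(2 * np.pi * (v1 + vp)) *
              np.exp(-0.5 * x2**2 / (v2 + vp)) / np.sqrt(2 * np.pi * (v2 + vp)))
    return prior_c * like_c / (prior_c * like_c + (1 - prior_c) * like_s)

p_small = p_common(0.0, 1.0, 1.0, 1.0, 10.0)    # small disparity: likely one source
p_large = p_common(0.0, 20.0, 1.0, 1.0, 10.0)   # large disparity: likely two sources
```

Full integration and full segregation then emerge as the two limits of a single model, with the disparity controlling how much the cues are allowed to interact.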
Towards a complete theory of multisensory integration
Three decades ago, the question was posed whether there is a unified explanation for multisensory localization judgments under conflict (Warren, 1979). Behavioral theories of Bayes-optimal cue combination have brought us closer to this goal. Not only do they explain a wide range of existing data, they are also firmly rooted in a principled, probabilistic description of the purpose of multisensory perception, which is to increase precision if two cues have a common origin, but to keep cues with different origins apart.
References (89)
- et al. The ventriloquist effect results from near-optimal bimodal integration. Curr Biol (2004)
- et al. Temporal ventriloquism: crossmodal interaction on the time dimension. 1. Evidence from auditory–visual temporal order judgment. Int J Psychophysiol (2003)
- et al. Multisensory integration, perception and ecological validity. Trends Cogn Sci (2003)
- et al. Edge co-occurrence in natural images predicts contour grouping performance. Vision Res (2001)
- et al. Coordinate transformations and sensory integration in the detection of spatial orientation and self-motion: from models to experiments. Prog Brain Res (2007)
- Optimal integration of texture and motion cues to depth. Vision Res (1999)
- Mixture models and the probabilistic structure of depth cues. Vision Res (2003)
- et al. Do humans optimally integrate stereo and texture information for judgments of surface slant? Vision Res (2003)
- et al. Optimal computation with attractor networks. J Physiol Paris (2003)
- Bayesian inference of form and shape. Prog Brain Res (2006)
- Neurons and behavior: the same rules of multisensory integration. Brain Res
- Using Bayes' rule to model multisensory enhancement in the superior colliculus. Neural Comput
- Neural correlations, population coding, and computation. Nat Rev Neurosci
- What you see and hear is what you get. Curr Biol
- Bayesian integration of visual and auditory signals for spatial localization. J Opt Soc Am A Opt Image Sci Vis
- Statistical criteria in fMRI studies of multisensory integration. Neuroinformatics
- Crossmodal integration in the primate superior colliculus underlying the preparation and initiation of saccadic eye movements. J Neurophysiol
- Automatic visual bias of perceived auditory location. Psychon Bull Rev
- Ventriloquism: a case of crossmodal perceptual grouping
- Auditory Scene Analysis: The Perceptual Organization of Sound
- Feeling what you hear: auditory signals can modulate tactile tap perception. Exp Brain Res
- Vision and touch are automatically integrated for the perception of sequences of events. J Vis
- The "ventriloquist effect": visual dominance or response bias? Percept Psychophys
- Data Fusion for Sensory Information Processing Systems
- A two-stage model for visual–auditory interaction in saccadic latencies. Percept Psychophys
- Multisensory interaction in saccadic reaction time: a time-window-of-integration model. J Cogn Neurosci
- Temporally irregular mnemonic persistent activity in prefrontal neurons of monkeys during a delayed response task. J Neurophysiol
- Auditory–visual interactions subserving goal-directed saccades in a complex scene. J Neurophysiol
- The perception of emotion by ear and by eye. Cogn Emot
- The variability of discharge of simple cells in the cat striate cortex. Exp Brain Res
- Reading population codes: a neural implementation of ideal observers. Nat Neurosci
- Efficient computation and cue integration with noisy population codes. Nat Neurosci
- Bimodal and trimodal multisensory enhancement: effects of stimulus onset and intensity on reaction time. Percept Psychophys
- Modeling spatial effects in visual-tactile saccadic reaction time. Percept Psychophys
- Ecological statistics of Gestalt laws for the perceptual organization of contours. J Vis
- Humans integrate visual and haptic information in a statistically optimal fashion. Nature
- Learning to integrate arbitrary signals from vision and touch. J Vis
- Bayesian contour integration. Percept Psychophys
- Neural basis of hearing in real-world situations. Annu Rev Psychol
- The 'ideal homunculus': statistical inference from neural population responses
- Response variability of neurons in primary visual cortex (V1) of alert monkeys. J Neurosci
- Comparison of discharge variability in vitro and in vivo in cat visual cortex neurons. J Neurophysiol
Cited by (66)
- A simple vector-like law for perceptual information combination is also followed by a class of cortical multisensory bimodal neurons. iScience (2021). Citation excerpt: "Because we have much data on neural variability and no data on neural Bayesian priors, the variant of Bayesian theory which is best suited to modeling spike count data is maximum likelihood estimation (MLE), which has widely been applied to sensory data (see Ernst, 2012 and Ma and Pouget, 2008 for reviews) and which is equivalent to a full Bayesian model for relatively flat priors. In the past, it has been suggested that bimodal neurons, especially superadditive bimodal neurons (which make up a substantial minority of our data), may be poor candidates for MLE (Ma and Pouget, 2008; Beck et al., 2008), but prior sensible results with MLE for visual-vestibular bimodal neurons (Morgan et al., 2008; Gu et al., 2008; Fetsch et al., 2010; Angelaki et al., 2012) suggest that it might be interesting to try. MLE makes two strong predictions about auditory-visual responses and their variability."
- Multisensory neural processing: from cue integration to causal inference. Current Opinion in Physiology (2020)
- 6.30 Multisensory Integration for Self-Motion Perception. The Senses: A Comprehensive Reference, Second Edition (2020)
- Fitting predictive coding to the neurophysiological data. Brain Research (2019)
- Causal Inference in the Multisensory Brain. Neuron (2019)
- Computational principles and models of multisensory integration. Current Opinion in Neurobiology (2017)