Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding

Neuron. 2018 Feb 7;97(3):640-655.e4. doi: 10.1016/j.neuron.2017.12.034. Epub 2018 Jan 26.

Abstract

How and where in the brain audio-visual signals are bound to create multimodal objects remains unknown. One hypothesis is that temporal coherence between dynamic multisensory signals provides a mechanism for binding stimulus features across sensory modalities. Here, we report that when the luminance of a visual stimulus is temporally coherent with the amplitude fluctuations of one sound in a mixture, the representation of that sound is enhanced in auditory cortex. Critically, this enhancement extends to include both binding and non-binding features of the sound. We demonstrate that visual information conveyed from visual cortex via the phase of the local field potential is combined with auditory information within auditory cortex. These data provide evidence that early cross-sensory binding offers a bottom-up mechanism for the formation of cross-sensory objects, and that one role for multisensory binding in auditory cortex is to support auditory scene analysis.
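
To make the stimulus logic concrete, the sketch below (not the authors' code; sample rates, durations, envelope cutoff, and carrier choices are assumed for illustration) builds a two-stream sound mixture and a visual luminance trace that follows the amplitude envelope of one stream, so that luminance is temporally coherent with that stream but independent of the other.

    # Illustrative Python sketch of a temporally coherent audio-visual stimulus.
    # All parameters (rates, duration, 7 Hz envelope cutoff, noise carriers) are
    # assumptions for demonstration, not the stimuli used in the study.
    import numpy as np

    rng = np.random.default_rng(0)
    fs_audio = 24_000        # audio sample rate (Hz), assumed
    fs_video = 60            # luminance update rate (Hz), assumed
    dur = 2.0                # stimulus duration (s), assumed
    n_audio = int(fs_audio * dur)

    def slow_envelope(n, fs, cutoff_hz=7.0):
        """Random amplitude envelope low-pass limited to ~cutoff_hz, scaled to [0, 1]."""
        freqs = np.fft.rfftfreq(n, 1 / fs)
        spec = rng.standard_normal(freqs.size) + 1j * rng.standard_normal(freqs.size)
        spec[freqs > cutoff_hz] = 0.0
        env = np.fft.irfft(spec, n)
        env -= env.min()
        return env / env.max()

    # Two independent slow envelopes define a "coherent" and an "independent" stream.
    env_coherent = slow_envelope(n_audio, fs_audio)
    env_independent = slow_envelope(n_audio, fs_audio)

    # Noise carriers for each stream (assumed), amplitude-modulated and summed.
    mixture = (env_coherent * rng.standard_normal(n_audio)
               + env_independent * rng.standard_normal(n_audio))

    # Luminance trace: the coherent stream's envelope resampled to the frame rate,
    # so visual luminance and that stream's amplitude fluctuate together in time.
    frame_idx = np.linspace(0, n_audio - 1, int(fs_video * dur)).astype(int)
    luminance = env_coherent[frame_idx]

    # Sanity check: luminance tracks the coherent envelope but not the other one.
    r_coh = np.corrcoef(luminance, env_coherent[frame_idx])[0, 1]
    r_ind = np.corrcoef(luminance, env_independent[frame_idx])[0, 1]
    print(f"coherent stream r = {r_coh:.2f}, independent stream r = {r_ind:.2f}")

In this construction the luminance carries no information beyond the temporal envelope of one sound, which is what makes any enhancement of that sound's cortical representation attributable to temporal coherence rather than to other visual cues.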

Keywords: attention; auditory cortex; auditory-visual; binding; cross-modal; ferret; multisensory; sensory cortex; visual cortex.

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation
  • Action Potentials
  • Animals
  • Auditory Perception / physiology*
  • Female
  • Ferrets
  • Neurons / physiology*
  • Photic Stimulation
  • Visual Cortex / physiology*
  • Visual Perception / physiology*