
Neuropsychologia

Volume 53, January 2014, Pages 84–93

Oscillatory mechanisms underlying the enhancement of visual motion perception by multisensory congruency

https://doi.org/10.1016/j.neuropsychologia.2013.11.005

Highlights

  • Auxiliary sounds can enhance visual motion perception.

  • Occipital alpha band oscillations are modulated by audio–visual congruency.

  • Alpha modulation predicts multisensory perceptual benefits.

  • Slow oscillations are critical for multisensory processes and perception.

Abstract

Multisensory interactions shape everyday perception, and stimuli in one modality can enhance perception in another even when they are not directly task relevant. While the underlying neural principles are slowly becoming evident, most work has focused on transient stimuli and little is known about the mechanisms underlying audio–visual motion processing. We studied the facilitation of visual motion perception by auxiliary sounds, i.e. sounds that by themselves do not provide the specific evidence required for the perceptual task at hand. In our experiment, human observers became significantly better at detecting visual random dot motion when it was accompanied by auxiliary acoustic motion rather than stationary sounds. EEG measurements revealed that both auditory and visual motion modulated low frequency oscillations over the respective sensory cortices. Using single trial decoding, we quantified the oscillatory signatures that permit the discrimination of visual motion in a manner analogous to the subjects' task. This revealed visual motion-related signatures in low (1–4 Hz) and alpha (8–12 Hz) bands that were significantly enhanced during congruent compared to disparate audio–visual conditions. Importantly, the auditory enhancement of these oscillatory signatures was predictive of the perceptual multisensory facilitation. These findings emphasise the importance of slow and alpha rhythms for perception in a multisensory context and suggest that acoustic motion can enhance visual perception through attention- or priming-related mechanisms reflected in rhythmic activity over parieto-occipital regions.

Introduction

Stimuli presented to one modality can enhance our perceptual performance in detecting or discriminating stimuli in another, even without providing information about the task-relevant feature. For example, a tone can improve the detection of dim lights (Chen et al., 2011, Lippert et al., 2007, McDonald et al., 2000, Noesselt et al., 2010) or facilitate the search for dynamic visual targets (Van der Burg, Olivers, Bronkhorst, & Theeuwes, 2008). Similar effects occur for other combinations of sensory modalities (Jaekl and Soto-Faraco, 2010, Lovelace et al., 2003, Thorne and Debener, 2008), can be specific to task instructions or stimulus eccentricity (Chen et al., 2011, Gleiss and Kayser, 2013, Jaekl and Soto-Faraco, 2010, Leo et al., 2008), and may relate to mechanisms based on attentional capture or stimulus-triggered changes in sensory gain (Chen et al., 2011, Lippert et al., 2007). While many studies on auxiliary multisensory benefits have focused on brief or stationary stimuli, considerably less is known about the interaction of auditory and visual motion.

Previous work has shown that both moving and stationary sounds can affect the precision or quality of a visual motion percept, even when the sound is not of primary relevance for the task at hand. For example, brief sounds can alter bi-stable visual motion percepts (Sekuler, Sekuler, & Lau, 1997) and task-irrelevant acoustic motion can enhance visual motion detection (Kim, Peters, & Shams, 2012). Other work has shown that audio–visual motion features can combine for a general perceptual enhancement (Alais and Burr, 2004, Senkowski et al., 2007, Wuerger et al., 2003) or more specific feature integration (Harrison et al., 2010, Lopez-Moliner and Soto-Faraco, 2007, Meyer et al., 2005). This suggests that auditory and visual motion evidence interact in different ways across experimental contexts (see also Cappe, Thelen, Romei, Thut, & Murray, 2012; Senkowski et al., 2007). In search of an underlying neural substrate, functional imaging studies have localised brain areas mediating the integration of acoustic and visual motion evidence (Alink et al., 2008, Guipponi et al., 2013, Lewis and Noppeney, 2010, Scheef et al., 2009). However, the specific neural mechanisms underlying a sound-driven enhancement of visual motion detection remain poorly understood.

To link perceptual multisensory benefits with the neural processing of visual motion, we studied a paradigm in which sounds enhance the subjects' accuracy in identifying coherent visual motion without providing evidence that would allow the perceptual task to be solved from the acoustic input alone (Kim et al., 2012). Specifically, we used a two-interval forced-choice paradigm in which subjects had to identify which of two intervals contained coherent visual motion. The auditory stimulus was the same during both intervals and by itself did not identify the stimulus interval containing coherent visual motion. However, on half the trials the sound provided congruent motion information that could enhance visual motion processing and hence the perceptual performance.
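To make the trial logic concrete, the following is a minimal sketch of such a two-interval forced-choice design with an auxiliary sound, as we understand it from the description above. All parameters (trial counts, accuracy levels) are illustrative assumptions, not the paper's stimulus code or data.

```python
# Minimal sketch of the two-interval forced-choice (2IFC) trial logic
# described above; values are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200

# Which interval (0 or 1) contains the coherent visual motion, and whether
# the auxiliary sound moves congruently or stays stationary on each trial.
target_interval = rng.integers(0, 2, n_trials)
sound_moving = rng.integers(0, 2, n_trials).astype(bool)

# Hypothetical observer: congruent acoustic motion raises the probability
# of identifying the correct interval (the multisensory benefit).
p_correct = np.where(sound_moving, 0.80, 0.70)
chose_target = rng.random(n_trials) < p_correct
response = np.where(chose_target, target_interval, 1 - target_interval)
correct = response == target_interval

print("accuracy, moving sound:     %.2f" % correct[sound_moving].mean())
print("accuracy, stationary sound: %.2f" % correct[~sound_moving].mean())
```

Note that the sound carries no information about which interval holds the target; in this sketch it only modulates the observer's probability of a correct choice, mirroring the auxiliary role described above.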

Our hypothesis about the neural mechanisms underlying the perceptual multisensory benefit in this task was based on recent suggestions for a prominent role of oscillatory brain activity in mediating multisensory interactions (Schroeder and Lakatos, 2009, Schroeder et al., 2008). Experimental support for this has been found in different brain areas and paradigms (Bauer et al., 2009, Fiebelkorn et al., 2011, Gomez-Ramirez et al., 2011, Kayser et al., 2008, Lakatos et al., 2009, Mercier et al., 2013, Thorne et al., 2011). For example, several studies found that brief and transient sounds can shape the amplitude and phase of oscillatory activity over occipital areas (Naue et al., 2011) and that this modulation of low frequency oscillations directly relates to perceptual performance in visual tasks (Romei et al., 2012, Romei et al., 2009, Thut et al., 2012). We therefore hypothesised that similar oscillatory mechanisms may underlie an auditory-driven enhancement of visual motion perception. We tested this hypothesis directly, using single trial decoding to link stimulus-related oscillatory signatures with multisensory perceptual benefits.
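As a rough illustration of what single-trial decoding of band-limited oscillatory activity can look like, here is a hedged sketch in Python (NumPy/SciPy/scikit-learn). The band edges follow the 1–4 Hz and 8–12 Hz ranges named in the abstract; the synthetic data, filter order, envelope features and linear discriminant classifier are our own assumptions for illustration, not the authors' pipeline.

```python
# Sketch: decode a binary trial label from delta- and alpha-band amplitude
# on single EEG trials, with cross-validated accuracy. Synthetic data only.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 250                                        # sampling rate (Hz)
n_trials, n_channels, n_samples = 120, 32, fs   # 1-s epochs
rng = np.random.default_rng(1)
eeg = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, n_trials)           # e.g. motion vs. no motion

def band_amplitude(x, lo, hi, fs):
    """Mean analytic amplitude per trial and channel in a frequency band."""
    b, a = butter(3, [lo, hi], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, x, axis=-1)
    return np.abs(hilbert(filtered, axis=-1)).mean(axis=-1)

# Features: per-channel delta (1-4 Hz) and alpha (8-12 Hz) amplitudes.
features = np.hstack([band_amplitude(eeg, 1, 4, fs),
                      band_amplitude(eeg, 8, 12, fs)])

# Cross-validated decoding accuracy; chance is 0.5 for this random data.
acc = cross_val_score(LinearDiscriminantAnalysis(), features, labels, cv=5)
print("decoding accuracy: %.2f +/- %.2f" % (acc.mean(), acc.std()))
```

In the logic of the study, such a per-trial decoder quantifies how well oscillatory signatures separate the stimulus conditions, and the congruency-dependent change in that separability can then be related to the behavioural benefit.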


General experimental procedures

Adult volunteer subjects (aged 18–35, both sexes) were paid to participate in the experiments. All reported normal hearing and normal or corrected-to-normal vision, and gave written informed consent prior to participation. The experiments were conducted according to the Declaration of Helsinki and were approved by the joint ethics committee of the University Clinic and the Max Planck Institute for Biological Cybernetics, Tübingen. Experiments were performed in a sound-attenuated, darkened room.

Results

The goal of our experiment was to study the neural underpinnings of the behavioural facilitation of visual motion identification by auxiliary sounds. Subjects (n=17) discriminated which of two random dot patterns contained coherent left- or rightward motion, while either a congruently moving or a stationary acoustic context was presented (Fig. 1A). On each trial the same sound was presented during both visual stimuli, so that the sound by itself did not provide sufficient information to perform the task.
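The behavioural contrast implied here, per-subject accuracy with a congruently moving versus a stationary sound, could be quantified with a paired test along the following lines; the numbers below are fabricated placeholders, not the study's data.

```python
# Sketch of a paired comparison of per-subject accuracies across the two
# sound conditions. All values are fabricated placeholders.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(2)
n_subjects = 17
acc_stationary = rng.uniform(0.60, 0.75, n_subjects)    # placeholder values
acc_moving = acc_stationary + rng.uniform(0.02, 0.10, n_subjects)

t, p = ttest_rel(acc_moving, acc_stationary)
print("mean benefit: %.3f, t(%d) = %.2f, p = %.3g"
      % ((acc_moving - acc_stationary).mean(), n_subjects - 1, t, p))
```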

Multisensory enhancement of visual motion perception

We studied the enhancement of visual motion perception by auxiliary sounds. While on half of the trials the sound provided unambiguous information about motion direction, this evidence by itself could not be used to perform the perceptual task. Rather, the moving sound served to enhance the processing of visual information and thereby helped to determine which of the two intervals contained the moving visual stimulus. Previous work has suggested that pre-decisional and low-level sensory interactions can…

Acknowledgements

This work was supported by the Max Planck Society and benefited from the support of the Bernstein Center for Computational Neuroscience, Tübingen, funded by the German Federal Ministry of Education and Research (BMBF; FKZ: 01GQ1002). The funders had no role in the design of the study. We are grateful to Joachim Gross for helpful advice on source localisation and to Gregor Thut for comments on the manuscript.

References (96)

  • M. Lippert et al.

    Improvement of visual contrast detection by a simultaneous sound

    Brain Research

    (2007)
  • C.T. Lovelace et al.

    An irrelevant light enhances auditory detection in humans: A psychophysical analysis of multisensory integration in stimulus detection

    Cognitive Brain Research

    (2003)
  • E. Maris et al.

    Nonparametric statistical testing of EEG- and MEG-data

    Journal of Neuroscience Methods

    (2007)
  • M.R. Mercier et al.

    Auditory-driven phase reset in visual cortex: Human electrocorticography reveals mechanisms of early multisensory integration

    Neuroimage

    (2013)
  • S. Molholm et al.

    Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study

    Brain Research. Cognitive Brain Research

    (2002)
  • V. Romei et al.

    Rhythmic TMS over parietal cortex links distinct brain frequencies to global versus local visual processing

    Current Biology

    (2011)
  • V. Romei et al.

    Sounds reset rhythms of visual cortex and corresponding human visual perception

    Current Biology

    (2012)
  • V. Romei et al.

    Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds

    Current Biology

    (2009)
  • O.W. Sakowitz et al.

    Spatio-temporal frequency characteristics of intersensory components in audiovisually evoked potentials

    Brain Research. Cognitive Brain Research

    (2005)
  • D. Sanabria et al.

    Perceptual and decisional contributions to audiovisual interactions in the perception of apparent motion: a signal detection study

    Cognition

    (2007)
  • P. Sauseng et al.

    Brain oscillatory substrates of visual short-term memory capacity

    Current Biology

    (2009)
  • L. Scheef et al.

    Multimodal motion processing in area V5/MT: evidence from an artificial class of audio–visual events

    Brain Research

    (2009)
  • I.M. Schepers et al.

    Noise alters beta-band activity in superior temporal cortex during audiovisual speech processing

    Neuroimage

    (2013)
  • C.E. Schroeder et al.

    Multisensory contributions to low-level, 'unisensory' processing

    Current Opinion in Neurobiology

    (2005)
  • C.E. Schroeder et al.

    Low-frequency neuronal oscillations as instruments of sensory selection

    Trends in Neurosciences

    (2009)
  • C.E. Schroeder et al.

    Neuronal oscillations and visual amplification of speech

    Trends in Cognitive Sciences

    (2008)
  • D. Senkowski et al.

    Multisensory processing of naturalistic objects in motion: a high-density electrical mapping and source estimation study

    Neuroimage

    (2007)
  • A. Alink et al.

    Capture of auditory motion by vision is represented by an activation shift from auditory to visual motion cortex

    Journal of Neuroscience

    (2008)
  • S. Banerjee et al.

    Oscillatory alpha-band mechanisms and the deployment of spatial attention to anticipated auditory and visual target locations: Supramodal or sensory-specific control mechanisms?

    Journal of Neuroscience

    (2011)
  • A.L. Beer et al.

    Attending to visual or auditory motion affects perception within and across modalities: an event-related potential study

    European Journal of Neuroscience

    (2005)
  • Y. Benjamini et al.

    Controlling the false discovery rate: a practical and powerful approach to multiple testing

    Journal of the Royal Statistical Society Series B

    (1995)
  • D.H. Brainard

    The Psychophysics Toolbox

    Spatial Vision

    (1997)
  • A. Capilla et al.

    Dissociated alpha-band modulations in the...

    (2012)
  • C. Cappe et al.

    Looming signals reveal synergistic principles of multisensory integration

    Journal of Neuroscience

    (2012)
  • C. Cappe et al.

    Auditory–visual multisensory interactions in humans: timing, topography, directionality, and sources

    Journal of Neuroscience

    (2010)
  • Y.C. Chen et al.

    Synchronous sounds enhance visual sensitivity without reducing target uncertainty

    Seeing and Perceiving

    (2011)
  • Y.C. Chen et al.

    Crossmodal semantic priming by naturalistic sounds and spoken words enhances visual sensitivity

    Journal of Experimental Psychology: Human Perception and Performance

    (2011)
  • C. Dahl et al.

    Spatial organization of multisensory responses in temporal association cortex

    Journal of Neuroscience

    (2009)
  • A.J. Ecker et al.

    Auditory–visual interactions in the perception of a ball's path

    Perception

    (2005)
  • I.C. Fiebelkorn et al.

    Ready, set, reset: Stimulus-locked periodicity in behavioral performance demonstrates the consequences of cross-sensory phase reset

    Journal of Neuroscience

    (2011)
  • J.J. Foxe et al.

    Parieto-occipital ~10 Hz activity reflects anticipatory state of visual attention mechanisms

    Neuroreport

    (1998)
  • J.J. Foxe et al.

    The role of alpha-band brain oscillations as a sensory suppression mechanism during selective attention

    Frontiers in Psychology

    (2011)
  • S. Gleiss et al.

    Eccentricity dependent auditory enhancement of visual stimulus detection but not discrimination

    Frontiers in Integrative Neuroscience

    (2013)
  • M. Gomez-Ramirez et al.

    Oscillatory sensory selection mechanisms during intersensory attention to rhythmic auditory and visual inputs: A human electrocorticographic investigation

    Journal of Neuroscience

    (2011)
  • D.M. Green et al.

    Signal detection theory and psychophysics

    (1966)
  • O. Guipponi et al.

    Multimodal convergence within the intraparietal sulcus of the macaque monkey

    Journal of Neuroscience

    (2013)
  • N.R. Harrison et al.

    Reaction time facilitation for horizontally moving auditory–visual stimuli

    Journal of Vision

    (2010)
  • M. Hofbauer et al.

    Catching audiovisual mice: Predicting the arrival time of auditory–visual motion signals

    Cognitive, Affective, & Behavioral Neuroscience

    (2004)