Oscillatory mechanisms underlying the enhancement of visual motion perception by multisensory congruency
Introduction
Stimuli presented to one modality can enhance our perceptual performance in detecting or discriminating stimuli in another, even without providing information about the task-relevant feature. For example, a tone can improve the detection of dim lights (Chen et al., 2011, Lippert et al., 2007, McDonald et al., 2000, Noesselt et al., 2010) or facilitate the search for dynamic visual targets (Van der Burg, Olivers, Bronkhorst, & Theeuwes, 2008). Similar effects occur for other combinations of sensory modalities (Jaekl and Soto-Faraco, 2010, Lovelace et al., 2003, Thorne and Debener, 2008), can be specific to task instructions or stimulus eccentricity (Chen et al., 2011, Gleiss and Kayser, 2013, Jaekl and Soto-Faraco, 2010, Leo et al., 2008), and may relate to mechanisms based on attentional capture or stimulus-triggered changes in sensory gain (Chen et al., 2011, Lippert et al., 2007). However, while many studies on auxiliary multisensory benefits focus on brief or stationary stimuli, considerably less is known about the interaction of auditory and visual motion.
Previous work has shown that both moving and stationary sounds can affect the precision or quality of a visual motion percept, even when the sound is not of primary relevance for the task at hand. For example, brief sounds can alter bi-stable visual motion percepts (Sekuler, Sekuler, & Lau, 1997) and task-irrelevant acoustic motion can enhance visual motion detection (Kim, Peters, & Shams, 2012). Other work has shown that audio–visual motion features can combine for a general perceptual enhancement (Alais and Burr, 2004, Senkowski et al., 2007, Wuerger et al., 2003) or more specific feature integration (Harrison et al., 2010, Lopez-Moliner and Soto-Faraco, 2007, Meyer et al., 2005). This suggests that auditory and visual motion evidence interact in different ways and experimental contexts (see also (Cappe, Thelen, Romei, Thut, & Murray, 2012; Senkowski et al., 2007)). In search of an underlying neural substrate, functional imaging studies have localised brain areas mediating the integration of acoustic and visual motion evidence (Alink et al., 2008, Guipponi et al., 2013, Lewis and Noppeney, 2010, Scheef et al., 2009). However, the specific neural mechanisms underlying a sound-driven enhancement of visual motion detection remain poorly understood.
To link perceptual multisensory benefits with the neural processing of visual motion, we studied a paradigm in which sounds enhance the subject's accuracy in identifying coherent visual motion without providing direct evidence that would allow solving the perceptual task based on the acoustic evidence alone (Kim et al., 2012). Specifically, we used a two-interval forced-choice paradigm in which subjects had to identify which of two intervals contained coherent visual motion. The auditory stimulus was the same during both intervals and by itself did not identify the stimulus interval containing coherent visual motion. However, on half the trials the sound provided congruent motion information that could enhance visual motion processing and hence perceptual performance.
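The trial logic described above can be sketched in a few lines. The snippet below is an illustrative reconstruction of the design, not the authors' actual experiment code; the function name, field names, and trial count are assumptions. It makes the key constraint explicit: the sound is identical in both intervals of a trial (so it cannot identify the target interval) and is direction-congruent on exactly half the trials.

```python
import random

def make_trials(n_trials=100, seed=1):
    """Generate a two-interval forced-choice trial list (illustrative sketch).

    Exactly one interval per trial contains coherent visual motion, and the
    sound, which plays identically in both intervals, is direction-congruent
    on exactly half of the trials.
    """
    rng = random.Random(seed)
    sounds = (["congruent-motion"] * (n_trials // 2)
              + ["stationary"] * (n_trials - n_trials // 2))
    rng.shuffle(sounds)
    trials = []
    for sound in sounds:
        trials.append({
            "direction": rng.choice(["left", "right"]),  # direction of coherent motion
            "target_interval": rng.choice([1, 2]),       # interval holding coherent motion
            "sound": sound,                              # same sound in both intervals
        })
    return trials

trials = make_trials()
# The sound field carries no information about target_interval,
# so acoustic evidence alone cannot solve the task.
```

Because `sound` is assigned independently of `target_interval`, a decoder given only the acoustic condition would perform at chance on interval identification, which is the property that makes any sound-related benefit attributable to enhanced visual processing.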
Our hypothesis about the neural mechanisms underlying the perceptual multisensory benefit in this task was based on recent suggestions for a prominent role of oscillatory brain activity in mediating multisensory interactions (Schroeder and Lakatos, 2009, Schroeder et al., 2008). Experimental support for this has been found in different brain areas and paradigms (Bauer et al., 2009, Fiebelkorn et al., 2011, Gomez-Ramirez et al., 2011, Kayser et al., 2008, Lakatos et al., 2009, Mercier et al., 2013, Thorne et al., 2011). For example, several studies found that brief and transient sounds can shape the amplitude and phase of oscillatory activity over occipital areas (Naue et al., 2011) and that this modulation of low-frequency oscillations directly relates to perceptual performance in visual tasks (Romei et al., 2012, Romei et al., 2009, Thut et al., 2012). We hence hypothesised that similar oscillatory mechanisms may underlie an auditory-driven enhancement of visual motion perception. We directly tested this hypothesis using methods of single-trial decoding to link stimulus-related oscillatory signatures with multisensory perceptual benefits.
Section snippets
General experimental procedures
Adult volunteer subjects (aged 18–35, both sexes) were paid to participate in the experiments. All reported normal hearing, normal or corrected to normal vision and gave written informed consent prior to participation. The experiments were conducted according to the Declaration of Helsinki and were approved by the joint ethics committee of the University Clinic and the Max Planck Institute for Biological Cybernetics Tübingen. Experiments were performed in a sound-attenuated and dark room.
Results
The goal of our experiment was to study the neural underpinnings of the behavioural facilitation in visual motion identification by auxiliary sounds. Subjects (n=17) discriminated which of two random dot patterns contained coherent left- or rightward motion, while either a congruently moving or stationary acoustic context was presented (Fig. 1A). On each trial the same sound was presented during both visual stimuli, so that the sound by itself did not provide sufficient information to perform the task.
Multisensory enhancement of visual motion perception
We studied the enhancement of visual motion perception by auxiliary sounds. While on half of the trials the sound provided unambiguous information about motion direction, this evidence by itself could not be used to perform the perceptual task. Rather, the moving sound served to enhance the processing of the visual information and thereby helped determine which of two intervals contained the moving visual stimulus. Previous work has suggested that pre-decisional and low-level sensory interactions can
Acknowledgements
This work was supported by the Max Planck Society and benefited from the support of the Bernstein Center for Computational Neuroscience, Tübingen, funded by the German Federal Ministry of Education and Research (BMBF; FKZ: 01GQ1002). The funders had no role in the design of the study. We are grateful to Joachim Gross for helpful advice on source localisation and to Gregor Thut for comments on the manuscript.
References (96)
- et al. (2004). No direction-specific bimodal facilitation for audiovisual motion detection. Cognitive Brain Research.
- et al. (2009). Tactile stimulation accelerates behavioral responses to visual stimuli through enhancement of occipital gamma-band activity. Vision Research.
- et al. (2009). Selective integration of auditory–visual looming cues by humans. Neuropsychologia.
- et al. (2013). Cortical cross-frequency coupling predicts perceptual outcomes. Neuroimage.
- et al. (2011). Oscillatory synchronization in large-scale cortical networks predicts perception. Neuron.
- et al. (2010). Audiovisual contrast enhancement is articulated primarily via the M-pathway. Brain Research.
- et al. (2012). An oscillatory mechanism for prioritizing salient unattended stimuli. Trends in Cognitive Sciences.
- et al. (2007). EEG alpha oscillations: the inhibition-timing hypothesis. Brain Research Reviews.
- et al. (2007). Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron.
- et al. (2009). The leading sense: Supramodal control of neurophysiological context by attention. Neuron.