Interregional alpha-band synchrony supports temporal cross-modal integration
Introduction
When the sound coming from a television gets out of sync with what is visually displayed, you immediately feel that something is wrong. Multimodal processing is ubiquitous in perception: different senses provide us with complementary evidence about external events, which can aid our responses to these events (McDonald et al., 2000, Yang et al., 2013) or can result in perceptual illusions (Alais and Burr, 2004, McGurk and MacDonald, 1976). Over the past several decades, the neuroscience of multisensory processing has shifted from a strict hierarchical view of unisensory signals converging onto higher supramodal areas (Meredith and Stein, 1983, Stein and Stanford, 2008) to a growing consensus that cross-modal integration can occur even at early stages of sensory processing (Foxe et al., 2000, Ghazanfar and Chandrasekaran, 2007, Ghazanfar and Schroeder, 2006, Giard and Peronnet, 1999, Kayser and Logothetis, 2007, Martuzzi et al., 2007, Molholm et al., 2002). However, how these early-stage interactive processes are neurophysiologically organized remains a topic of active exploration (Klemen and Chambers, 2012, Sarko et al., 2013).
One proposed mechanism of neural interaction is “binding through coherence” (Fries, 2005, Varela et al., 2001, Ward, 2003, Womelsdorf et al., 2007): the idea that effective windows of cortico-cortical communication may arise when the electrophysiological oscillations of the involved neural populations transiently synchronize. Evidence is accumulating that this principle may apply to the integration of multisensory information as well (Doesburg et al., 2008, Hummel and Gerloff, 2005, Sarko et al., 2013, Senkowski et al., 2008, von Stein et al., 1999). For example, a stimulus of one modality can modulate the processing of a concurrently delivered stimulus of another modality, through the phase resetting of ongoing oscillatory activity in the corresponding primary sensory region (Diederich et al., 2012, Kayser et al., 2008, Lakatos et al., 2007). This results in increased cortical excitability, and thus improved behavioral performance towards the bimodal stimulus. Attention seems to determine which modality “controls” the phase resetting (Lakatos et al., 2009). Given the tight link between alpha-band (8–12 Hz) activity and attentional processing (Jensen and Mazaheri, 2010, Klimesch et al., 2007), we hypothesized that during cross-modal attention, alpha phase synchrony may be an important mediator of large-scale communication between distant sensory regions (Hummel and Gerloff, 2005, Palva and Palva, 2011). Although several frequency bands have been implicated in multisensory integration (Senkowski et al., 2008), it has been proposed that phase synchronization in the alpha-band may be especially important for coordinating functional integration between cortical regions separated by relatively long inter-areal distances (Doesburg et al., 2009, Palva and Palva, 2007, Palva et al., 2005).
This active role of alpha activity has been shown in a variety of cognitive and perceptual tasks, such as spatial attention (Doesburg et al., 2009), working memory (Palva et al., 2005), object recognition (Bar et al., 2006), and error-processing (van Driel et al., 2012), and may thus represent a general mechanism of coherent network functioning. Importantly, increases in interregional alpha-band phase synchrony can co-occur with local decreases in alpha-band power (Palva and Palva, 2007, Palva and Palva, 2011), where the latter may reflect attention-related “active inhibition” of task-irrelevant areas (Jensen and Mazaheri, 2010). In this study, we were particularly interested in the role of interregional alpha phase dynamics during cross-sensory integration.
Most studies on multisensory processing use brief, momentary stimuli, and focus on the spatial domain (Driver and Spence, 1998, Driver and Spence, 2000, Macaluso and Driver, 2005), or on judgments of successiveness versus simultaneity (Jaekl and Harris, 2007, Keetels and Vroomen, 2007). However, multisensory events in a continuous environment are more likely to be prolonged (Ghazanfar and Chandrasekaran, 2007), and are not necessarily linked to one spatial location; in these more naturalistic situations, correlated temporal durations can provide key evidence for integration. Moreover, it is especially interesting to study cross-modal integration of elapsed time, because the perception of auditory duration is superior to that of visual duration. This is in contrast to the more frequently investigated spatial domain, in which visual spatial localization is superior to auditory spatial localization (Burr et al., 2009, Fendrich and Corballis, 2001, Pick et al., 1969).
The purpose of the present study was to investigate the potential neural mechanisms of multimodal integration via duration perception. Here we report novel evidence that interregional phase synchrony in the alpha-band supports multimodal duration judgments in humans. Through time–frequency decomposition of high-density EEG activity, we found that alpha synchrony was modulated by cross-modal attention and correlated with subject-specific, Bayesian-estimated parameters of distractor interference and lapsing.
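To make the phase-synchrony measure concrete: interregional alpha synchrony is commonly quantified as inter-site phase clustering (ISPC), the consistency across trials of the alpha-band phase difference between two channels. The sketch below is a minimal illustration on synthetic data, not the authors' actual analysis pipeline (their methods are not excerpted here); the function names, filter settings, and simulated signals are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def alpha_phase(x, fs, band=(8.0, 12.0)):
    """Band-pass filter in the alpha range and extract instantaneous phase.
    Illustrative settings; the study's actual decomposition may differ."""
    sos = butter(4, band, btype="band", fs=fs, output="sos")
    return np.angle(hilbert(sosfiltfilt(sos, x, axis=-1), axis=-1))

def ispc(x1, x2, fs):
    """Inter-site phase clustering: |mean over trials of exp(i * phase
    difference)| at each time point. 0 = no consistency, 1 = perfect."""
    dphi = alpha_phase(x1, fs) - alpha_phase(x2, fs)
    return np.abs(np.mean(np.exp(1j * dphi), axis=0))

# Synthetic example: two "channels", 50 trials, 1 s at 500 Hz, sharing a
# 10 Hz component with a fixed phase lag plus noise.
rng = np.random.default_rng(0)
fs, n_trials = 500, 50
t = np.arange(fs) / fs
phases = rng.uniform(0, 2 * np.pi, n_trials)[:, None]  # random phase per trial
ch1 = np.cos(2 * np.pi * 10 * t + phases) + 0.5 * rng.standard_normal((n_trials, fs))
ch2 = np.cos(2 * np.pi * 10 * t + phases + 1.0) + 0.5 * rng.standard_normal((n_trials, fs))
sync = ispc(ch1, ch2, fs)
print(sync[100:400].mean())  # high, because the phase lag is consistent across trials
```

Because the 10 Hz phase lag between the channels is fixed across trials, ISPC approaches 1 despite the per-trial random starting phase; with independent channels it would hover near the small-sample bias floor.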
Section snippets
Subjects
Nineteen subjects (age range 18–29, M = 22.4; 13 females) from the University of Amsterdam community participated in this study in exchange for €14 or course credits. All subjects signed an informed consent form before participation and reported to have normal or corrected-to-normal vision, and normal hearing. The study was approved by the local ethics committee; all procedures complied with relevant laws and institutional guidelines. Data of one subject were excluded from further analyses due to
Behavioral results
Nineteen human subjects performed a duration comparison task in which they judged whether the duration of a target stimulus (between 333 and 750 ms) was shorter or longer than that of a preceding 500 ms standard stimulus (Fig. 1a and Materials and methods). Stimuli comprised (combinations of) an LED and a tone for the visual and auditory modality, respectively. Using hierarchical Bayesian inference (Kuss et al., 2005, Lee and Wagenmakers, 2013; see Materials and methods and Supplemental Methods for
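The psychometric model underlying such a duration comparison task can be sketched as a cumulative-Gaussian choice function with a lapse parameter. Note the study fitted this hierarchically with Bayesian inference across subjects; the snippet below is a simplified, single-subject maximum-likelihood version on synthetic data, with hypothetical parameter values and starting points chosen for illustration only.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def psychometric(d, pse, sigma, lapse):
    """P('longer') for target duration d (ms): cumulative Gaussian around the
    point of subjective equality (PSE), with a lapse rate mixing in guessing."""
    return lapse / 2 + (1 - lapse) * norm.cdf((d - pse) / sigma)

def fit_mle(durations, n_longer, n_total):
    """Maximum-likelihood fit of PSE, slope, and lapse from binomial counts.
    A non-hierarchical stand-in for the paper's hierarchical Bayesian model."""
    def nll(params):
        pse, sigma, lapse = params
        p = np.clip(psychometric(durations, pse, sigma, lapse), 1e-9, 1 - 1e-9)
        return -np.sum(n_longer * np.log(p) + (n_total - n_longer) * np.log(1 - p))
    res = minimize(nll, x0=[500.0, 80.0, 0.02],
                   bounds=[(350, 700), (10, 300), (0, 0.5)])
    return res.x

# Synthetic responses on the study's duration range (333-750 ms, 500 ms standard);
# the "true" parameters here are invented for the demo.
rng = np.random.default_rng(1)
durs = np.linspace(333, 750, 8)
true_p = psychometric(durs, pse=510, sigma=70, lapse=0.05)
counts = rng.binomial(40, true_p)
pse, sigma, lapse = fit_mle(durs, counts, 40)
```

The lapse parameter is what allows "lapsing" (stimulus-independent guesses) to be separated from genuine duration sensitivity, which is why it can be correlated with neural measures as in the Results.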
Discussion
The main aim of this study was to investigate the electrophysiological dynamics of cross-sensory communication during the perception of multisensory temporal information. Based on previous literature (Hummel and Gerloff, 2005, Klemen and Chambers, 2012, Senkowski et al., 2008), we hypothesized that phase synchronization may be a key mechanism by which early sensory regions could be transiently coupled, thus providing temporal “windows” of cross-modal integration.
Our results are in line with the
Acknowledgments
J. v. D. and M. X. C. are, and the present work was, supported by a Vidi grant from the Netherlands Organisation for Scientific Research (NWO), awarded to M. X. C. (452-09-003). The authors thank Eric-Jan Wagenmakers and Ruud Wetzels for assistance in developing the Bayesian graphical model, and the reviewers, Marlies Vissers, and Richard Ridderinkhof for their critical comments on an earlier version of this manuscript.
Conflict of interest
The authors declare no competing financial interests.
References (107)
- The ventriloquist effect results from near-optimal bimodal integration. Curr. Biol. (2004)
- The “when” pathway of the right parietal lobe. Trends Cogn. Sci. (2007)
- Frontal theta links prediction errors to behavioral adaptation in reinforcement learning. NeuroImage (2010)
- EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods (2004)
- From local inhibition to long-range integration: a functional dissociation of alpha-band synchronization across cortical scales in visuospatial attention. Brain Res. (2009)
- Multisensory perception: beyond modularity and convergence. Curr. Biol. (2000)
- Temporal binding, binocular rivalry, and consciousness. Conscious. Cogn. (1999)
- Multisensory auditory–somatosensory interactions in early cortical processing revealed by high-density electrical mapping. Cogn. Brain Res. (2000)
- A mechanism for cognitive dynamics: neuronal communication through neuronal coherence. Trends Cogn. Sci. (2005)
- Paving the way forward: integrating the senses through phase-resetting of cortical oscillations. Neuron (2007)
- Is neocortex essentially multisensory? Trends Cogn. Sci.
- The neural basis of event-time introspection. Conscious. Cogn.
- Thalamo-cortical mechanisms underlying changes in amplitude and frequency of human alpha oscillations. NeuroImage
- Auditory–visual temporal integration measured by shifts in perceived temporal location. Neurosci. Lett.
- Principal components analysis of Laplacian waveforms as a generic method for identifying ERP generator patterns: I. Evaluation with auditory oddball tasks. Clin. Neurophysiol.
- Current perspectives and methods in studying neural mechanisms of multisensory interactions. Neurosci. Biobehav. Rev.
- EEG alpha oscillations: the inhibition-timing hypothesis. Brain Res. Rev.
- Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron
- The leading sense: supramodal control of neurophysiological context by attention. Neuron
- Distinct systems for automatic and cognitively controlled time measurement: evidence from neuroimaging. Curr. Opin. Neurobiol.
- Evoked potentials recorded from the auditory cortex in man: evaluation and topography of the middle latency components. Electroencephalogr. Clin. Neurophysiol.
- Multisensory spatial interactions: a window onto functional integration in the human brain. Trends Neurosci.
- Nonparametric statistical testing of EEG- and MEG-data. J. Neurosci. Methods
- Region-specific modulations in oscillatory alpha activity serve to facilitate processing in the visual and auditory modalities. NeuroImage
- Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study. Cogn. Brain Res.
- A cautionary note on the interpretation of phase-locking estimates with concurrent changes in power. Clin. Neurophysiol.
- EEG coherency: I: statistics, reference electrode, volume conduction, Laplacians, cortical imaging, and interpretation at multiple scales. Electroencephalogr. Clin. Neurophysiol.
- New vistas for alpha-frequency band oscillations. Trends Neurosci.
- Spherical splines for scalp potential and current density mapping. Electroencephalogr. Clin. Neurophysiol.
- Sounds reset rhythms of visual cortex and corresponding human visual perception. Curr. Biol.
- Spatio-temporal frequency characteristics of intersensory components in audiovisually evoked potentials. Cogn. Brain Res.
- Crossmodal binding through neural coherence: implications for multisensory processing. Trends Neurosci.
- EEG and MEG coherence: measures of functional connectivity at distinct spatial scales of neocortical dynamics. J. Neurosci. Methods
- An improved index of phase-synchronization for electrophysiological data in the presence of volume-conduction, noise and sample-size bias. NeuroImage
- Audio-visual integration in temporal perception. Int. J. Psychophysiol.
- Electrophysiological low-frequency coherence and cross-frequency coupling contribute to BOLD connectivity. Neuron
- Synchronous neural oscillations and cognitive processes. Trends Cogn. Sci.
- Top-down facilitation of visual recognition. Proc. Natl. Acad. Sci. U. S. A.
- Neuronal mechanisms and attentional modulation of corticothalamic α oscillations. J. Neurosci.
- What makes us tick? Functional and neural mechanisms of interval timing. Nat. Rev. Neurosci.
- Auditory dominance over vision in the perception of interval duration. Exp. Brain Res.
- Spontaneous EEG oscillations reveal periodic sampling of visual attention. Proc. Natl. Acad. Sci. U. S. A.
- Analyzing Neural Time Series Data: Theory and Practice
- Dynamic interactions between large-scale brain networks predict behavioral adaptation after perceptual errors. Cereb. Cortex
- The duration of a co-occurring sound modulates visual detection performance in humans. PLoS ONE
- Saccadic reaction times to audiovisual stimuli show effects of oscillatory phase reset. PLoS ONE
- Asynchrony from synchrony: long-range gamma-band neural synchrony accompanies perception of audiovisual speech asynchrony. Exp. Brain Res.
- Cross-modal links in spatial attention. Philos. Trans. R. Soc. Lond. B Biol. Sci.
- The temporal cross-capture of audition and vision. Percept. Psychophys.
- Auditory–visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study. J. Cogn. Neurosci.