
NeuroImage

Volume 101, 1 November 2014, Pages 404-415

Interregional alpha-band synchrony supports temporal cross-modal integration

https://doi.org/10.1016/j.neuroimage.2014.07.022

Highlights

  • We investigated EEG dynamics of cross-sensory integration during time perception

  • Psychometric functions were fitted to behavior using Bayesian graphical modeling

  • There was strong auditory dominance when subjects judged audiovisual duration

  • This correlated with audiovisual interregional alpha (8–12 Hz) phase synchrony

  • Single-trial alpha phase synchrony predicted the degree of psychometric lapsing

Abstract

In a continuously changing environment, time is a key property that tells us whether information from the different senses belongs together. Yet, little is known about how the brain integrates temporal information across sensory modalities. Using high-density EEG combined with a novel psychometric timing task in which human subjects evaluated durations of audiovisual stimuli, we show that the strength of alpha-band (8–12 Hz) phase synchrony between localizer-defined auditory and visual regions depended on cross-modal attention: during encoding of a constant 500 ms standard interval, audiovisual alpha synchrony decreased when subjects attended audition while ignoring vision, compared to when they attended both modalities. In addition, alpha connectivity during a variable target interval predicted the degree to which auditory stimulus duration biased time estimation while attending vision. This cross-modal interference effect was estimated using a hierarchical Bayesian model of a psychometric function that also provided an estimate of each individual's tendency to exhibit attention lapses. This lapse rate, in turn, was predicted by single-trial estimates of the stability of interregional alpha synchrony: when attending to both modalities, trials with greater stability in patterns of connectivity were characterized by reduced contamination by lapses. Together, these results provide new insights into a functional role of the coupling of alpha phase dynamics between sensory cortices in integrating cross-modal information over time.

Introduction

When the sound coming from a television gets out-of-sync with what is visually displayed, you immediately feel that something is wrong. Multimodal processing is ubiquitous in perception: different senses provide us with complementary evidence about external events, which can aid our responses to these events (McDonald et al., 2000, Yang et al., 2013), or can result in perceptual illusions (Alais and Burr, 2004, McGurk and MacDonald, 1976). Over the past several decades, neuroscience of multisensory processing has shifted from a strict hierarchical view of unisensory signals converging onto higher supramodal areas (Meredith and Stein, 1983, Stein and Stanford, 2008), to a growing consensus that cross-modal integration can occur even at early stages of sensory processing (Foxe et al., 2000, Ghazanfar and Chandrasekaran, 2007, Ghazanfar and Schroeder, 2006, Giard and Peronnet, 1999, Kayser and Logothetis, 2007, Martuzzi et al., 2007, Molholm et al., 2002). However, how these early-stage interactive processes are neurophysiologically organized remains a topic of active exploration (Klemen and Chambers, 2012, Sarko et al., 2013).

One proposed mechanism of neural interaction is “binding through coherence” (Fries, 2005, Varela et al., 2001, Ward, 2003, Womelsdorf et al., 2007), or the idea that effective windows of cortico-cortical communication may arise through transiently synchronized electrophysiological oscillations of the involved neural populations. Evidence is accumulating that this principle may apply to the integration of multisensory information as well (Doesburg et al., 2008, Hummel and Gerloff, 2005, Sarko et al., 2013, Senkowski et al., 2008, von Stein et al., 1999). For example, a stimulus of one modality can modulate the processing of a concurrently delivered stimulus of another modality, through the phase resetting of ongoing oscillatory activity in the corresponding primary sensory region (Diederich et al., 2012, Kayser et al., 2008, Lakatos et al., 2007). This results in increased cortical excitability, and thus improved behavioral performance in response to the bimodal stimulus. Attention seems to determine which modality “controls” the phase resetting (Lakatos et al., 2009). Given the tight link between alpha-band (8–12 Hz) activity and attentional processing (Jensen and Mazaheri, 2010, Klimesch et al., 2007), we hypothesized that during cross-modal attention, alpha phase synchrony may be an important mediator of large-scale communication between distant sensory regions (Hummel and Gerloff, 2005, Palva and Palva, 2011). Although several frequency bands have been implicated in multisensory integration (Senkowski et al., 2008), it has been proposed that phase synchronization in the alpha-band may be especially important for coordinating functional integration between cortical regions (Doesburg et al., 2009) over relatively long inter-areal distances (Palva and Palva, 2007, Palva et al., 2005). This active role of alpha activity has been shown in a variety of cognitive and perceptual tasks, such as spatial attention (Doesburg et al., 2009), working memory (Palva et al., 2005), object recognition (Bar et al., 2006), and error-processing (van Driel et al., 2012), and may thus represent a general mechanism of coherent network functioning. Importantly, increases in interregional alpha-band phase synchrony can co-occur with local decreases in alpha-band power (Palva and Palva, 2007, Palva and Palva, 2011), where the latter may reflect attention-related “active inhibition” of task-irrelevant areas (Jensen and Mazaheri, 2010). In this study, we were particularly interested in the role of interregional alpha phase dynamics during cross-sensory integration.
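
To make the connectivity concept concrete, the sketch below computes an interregional phase-locking value in the alpha band: the consistency, across trials, of the phase difference between two channels. This is an illustration of the general technique rather than the exact pipeline of the present study; the array names, filter order, and simulated data are assumptions.

```python
# Illustrative sketch (not this study's exact pipeline): alpha-band phase-locking
# value (PLV) between two channels/regions, computed across trials per time point.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_plv(eeg_a, eeg_b, srate, band=(8.0, 12.0)):
    """eeg_a, eeg_b: (n_trials, n_samples) arrays for two channels or regions.
    Returns the PLV over trials for each time point (shape: n_samples)."""
    nyq = srate / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="bandpass")
    # Band-pass filter in the alpha range, then take the instantaneous phase
    # of the analytic signal obtained via the Hilbert transform.
    phase_a = np.angle(hilbert(filtfilt(b, a, eeg_a, axis=1), axis=1))
    phase_b = np.angle(hilbert(filtfilt(b, a, eeg_b, axis=1), axis=1))
    # PLV: length of the mean resultant vector of single-trial phase differences
    # (1 = identical phase lag on every trial, 0 = no consistency).
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b)), axis=0))

# Hypothetical usage with simulated noise: 100 trials of 1 s sampled at 512 Hz
rng = np.random.default_rng(0)
print(alpha_plv(rng.standard_normal((100, 512)),
                rng.standard_normal((100, 512)), srate=512).shape)  # -> (512,)
```

Note that for scalp EEG such a measure is sensitive to volume conduction (Nunez et al., 1997, Srinivasan et al., 2007), which is why spatial filters such as the surface Laplacian (Perrin et al., 1989) or indices that discount zero-lag coupling (Vinck et al., 2011) are typically applied before this step.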

Most studies on multisensory processing use brief, momentary stimuli, and focus on the spatial domain (Driver and Spence, 1998, Driver and Spence, 2000, Macaluso and Driver, 2005), or on judgments of successiveness versus simultaneity (Jaekl and Harris, 2007, Keetels and Vroomen, 2007). However, multisensory events in a continuous environment are more likely to be prolonged (Ghazanfar and Chandrasekaran, 2007), and are not necessarily linked to one spatial location; in these more naturalistic situations, correlated temporal durations can provide key evidence for integration. Moreover, it is especially interesting to study cross-modal integration of elapsed time, because the perception of auditory duration is superior to that of visual duration. This is in contrast to the more frequently investigated spatial domain, in which visual spatial localization is superior to auditory spatial localization (Burr et al., 2009, Fendrich and Corballis, 2001, Pick et al., 1969).

The purpose of the present study was to investigate the potential neural mechanisms of multimodal integration via duration perception. Here we report novel evidence that interregional phase synchrony in the alpha-band supports multimodal duration judgments in humans. Through time–frequency decomposition of high-density EEG activity, we found that alpha synchrony was modulated by cross-modal attention and correlated with subject-specific, Bayesian-estimated parameters of distractor interference and lapsing.
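
As a pointer to the decomposition step mentioned above, single-trial phase estimates in the alpha band are commonly obtained by convolving the EEG with a complex Morlet wavelet (Cohen, 2014). The sketch below illustrates this general approach; the center frequency, number of cycles, and wavelet length are assumed values, not the settings of this study.

```python
# Minimal sketch of complex Morlet wavelet convolution (cf. Cohen, 2014) to
# extract single-trial phase at one frequency; all parameters are assumptions.
import numpy as np

def morlet_phase(data, srate, freq=10.0, n_cycles=4.0):
    """data: (n_trials, n_samples) EEG from one channel.
    Returns the instantaneous phase at `freq` per trial and time point."""
    t = np.arange(-1.0, 1.0, 1.0 / srate)        # wavelet time axis in seconds
    s = n_cycles / (2.0 * np.pi * freq)          # width of the Gaussian taper
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2.0 * s**2))
    phase = np.empty(data.shape)
    for k, trial in enumerate(data):
        # The angle of the complex convolution result is the phase estimate
        # at this frequency and time point.
        phase[k] = np.angle(np.convolve(trial, wavelet, mode="same"))
    return phase
```

Phase differences between two such single-channel decompositions can then be fed into an across-trial synchrony measure like the one sketched earlier.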

Section snippets

Subjects

Nineteen subjects (age range 18–29, M = 22.4; 13 females) from the University of Amsterdam community participated in this study in exchange for €14 or course credits. All subjects signed an informed consent form before participation and reported having normal or corrected-to-normal vision and normal hearing. The study was approved by the local ethics committee; all procedures complied with relevant laws and institutional guidelines. Data from one subject were excluded from further analyses due to …

Behavioral results

Nineteen human subjects performed a duration comparison task in which they judged whether the duration of a target stimulus (between 333 and 750 ms) was shorter or longer than that of a preceding 500 ms standard stimulus (Fig. 1a and Materials and methods). Stimuli comprised (combinations of) an LED and a tone for the visual and auditory modality, respectively. Using hierarchical Bayesian inference (Kuss et al., 2005, Lee and Wagenmakers, 2013; see Materials and methods and Supplemental Methods for …
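
For illustration of the behavioral model, a psychometric function for this kind of task typically combines a sigmoid (here a cumulative Gaussian) with a lapse parameter that captures stimulus-independent errors. The sketch below is a simple maximum-likelihood version on toy data; the durations, starting values, and bounds are assumptions, and the actual analysis used a hierarchical Bayesian graphical model rather than this per-subject fit.

```python
# Hedged sketch of a lapse-rate psychometric function for 'longer vs. shorter'
# duration judgments; this is not the authors' hierarchical Bayesian model.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def p_longer(duration, pse, sigma, lapse):
    """Probability of responding 'target longer than the 500 ms standard'."""
    # With probability `lapse` the observer guesses; otherwise the response
    # follows a cumulative Gaussian centered on the point of subjective equality.
    return lapse / 2.0 + (1.0 - lapse) * norm.cdf(duration, loc=pse, scale=sigma)

def neg_log_lik(params, durations, responses):
    pse, sigma, lapse = params
    p = np.clip(p_longer(durations, pse, sigma, lapse), 1e-6, 1.0 - 1e-6)
    return -np.sum(responses * np.log(p) + (1.0 - responses) * np.log(1.0 - p))

# Toy single-subject data (hypothetical durations in ms and binary responses)
durations = np.linspace(333.0, 750.0, 12)
responses = (durations > 520.0).astype(float)
fit = minimize(neg_log_lik, x0=[500.0, 50.0, 0.02], args=(durations, responses),
               bounds=[(333.0, 750.0), (1.0, 300.0), (0.0, 0.5)])
print(fit.x)  # estimated PSE, spread (sigma), and lapse rate
```

In a hierarchical Bayesian approach such as the one referenced here, these subject-level parameters are instead given group-level priors and estimated jointly, which yields the per-subject interference and lapse estimates used in the brain–behavior analyses.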

Discussion

The main aim of this study was to investigate the electrophysiological dynamics of cross-sensory communication during the perception of multisensory temporal information. Based on previous literature (Hummel and Gerloff, 2005, Klemen and Chambers, 2012, Senkowski et al., 2008), we hypothesized that phase synchronization may be a key mechanism by which early sensory regions could be transiently coupled, thus providing temporal “windows” of cross-modal integration.

Our results are in line with the …

Acknowledgments

J. v. D. and M. X. C. are, and the present work was, supported by a Vidi grant from the Netherlands Organisation for Scientific Research (NWO), awarded to M. X. C. (452-09-003). The authors thank Eric-Jan Wagenmakers and Ruud Wetzels for assistance in developing the Bayesian graphical model, and the reviewers, Marlies Vissers, and Richard Ridderinkhof for their critical comments on an earlier version of this manuscript.

Conflict of interest

The authors declare no competing financial interests.

References (107)

  • A.A. Ghazanfar et al. (2006). Is neocortex essentially multisensory? Trends Cogn. Sci.
  • A.G. Guggisberg et al. (2011). The neural basis of event-time introspection. Conscious. Cogn.
  • R. Hindriks et al. (2013). Thalamo-cortical mechanisms underlying changes in amplitude and frequency of human alpha oscillations. NeuroImage.
  • P.M. Jaekl et al. (2007). Auditory–visual temporal integration measured by shifts in perceived temporal location. Neurosci. Lett.
  • J. Kayser et al. (2006). Principal components analysis of Laplacian waveforms as a generic method for identifying ERP generator patterns: I. Evaluation with auditory oddball tasks. Clin. Neurophysiol.
  • J. Klemen et al. (2012). Current perspectives and methods in studying neural mechanisms of multisensory interactions. Neurosci. Biobehav. Rev.
  • W. Klimesch et al. (2007). EEG alpha oscillations: the inhibition-timing hypothesis. Brain Res. Rev.
  • P. Lakatos et al. (2007). Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron.
  • P. Lakatos et al. (2009). The leading sense: supramodal control of neurophysiological context by attention. Neuron.
  • P.A. Lewis et al. (2003). Distinct systems for automatic and cognitively controlled time measurement: evidence from neuroimaging. Curr. Opin. Neurobiol.
  • C. Liégeois-Chauvel et al. (1994). Evoked potentials recorded from the auditory cortex in man: evaluation and topography of the middle latency components. Electroencephalogr. Clin. Neurophysiol.
  • E. Macaluso et al. (2005). Multisensory spatial interactions: a window onto functional integration in the human brain. Trends Neurosci.
  • E. Maris et al. (2007). Nonparametric statistical testing of EEG- and MEG-data. J. Neurosci. Methods.
  • A. Mazaheri et al. (2014). Region-specific modulations in oscillatory alpha activity serve to facilitate processing in the visual and auditory modalities. NeuroImage.
  • S. Molholm et al. (2002). Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study. Cogn. Brain Res.
  • S.D. Muthukumaraswamy et al. (2011). A cautionary note on the interpretation of phase-locking estimates with concurrent changes in power. Clin. Neurophysiol.
  • P.L. Nunez et al. (1997). EEG coherency: I: statistics, reference electrode, volume conduction, Laplacians, cortical imaging, and interpretation at multiple scales. Electroencephalogr. Clin. Neurophysiol.
  • S. Palva et al. (2007). New vistas for alpha-frequency band oscillations. Trends Neurosci.
  • F. Perrin et al. (1989). Spherical splines for scalp potential and current density mapping. Electroencephalogr. Clin. Neurophysiol.
  • V. Romei et al. (2012). Sounds reset rhythms of visual cortex and corresponding human visual perception. Curr. Biol.
  • O.W. Sakowitz et al. (2005). Spatio-temporal frequency characteristics of intersensory components in audiovisually evoked potentials. Cogn. Brain Res.
  • D. Senkowski et al. (2008). Crossmodal binding through neural coherence: implications for multisensory processing. Trends Neurosci.
  • R. Srinivasan et al. (2007). EEG and MEG coherence: measures of functional connectivity at distinct spatial scales of neocortical dynamics. J. Neurosci. Methods.
  • M. Vinck et al. (2011). An improved index of phase-synchronization for electrophysiological data in the presence of volume-conduction, noise and sample-size bias. NeuroImage.
  • Y. Wada et al. (2003). Audio-visual integration in temporal perception. Int. J. Psychophysiol.
  • L. Wang et al. (2012). Electrophysiological low-frequency coherence and cross-frequency coupling contribute to BOLD connectivity. Neuron.
  • L.M. Ward (2003). Synchronous neural oscillations and cognitive processes. Trends Cogn. Sci.
  • M. Bar et al. (2006). Top-down facilitation of visual recognition. Proc. Natl. Acad. Sci. U. S. A.
  • A. Bollimunta et al. (2011). Neuronal mechanisms and attentional modulation of corticothalamic α oscillations. J. Neurosci.
  • C.V. Buhusi et al. (2005). What makes us tick? Functional and neural mechanisms of interval timing. Nat. Rev. Neurosci.
  • D. Burr et al. (2009). Auditory dominance over vision in the perception of interval duration. Exp. Brain Res.
  • N.A. Busch et al. (2010). Spontaneous EEG oscillations reveal periodic sampling of visual attention. Proc. Natl. Acad. Sci. U. S. A.
  • M.X. Cohen (2014). Analyzing Neural Time Series Data: Theory and Practice.
  • M.X. Cohen et al. (2012). Dynamic interactions between large-scale brain networks predict behavioral adaptation after perceptual errors. Cereb. Cortex.
  • B. de Haas et al. (2013). The duration of a co-occurring sound modulates visual detection performance in humans. PLoS ONE.
  • A. Diederich et al. (2012). Saccadic reaction times to audiovisual stimuli show effects of oscillatory phase reset. PLoS ONE.
  • S.M. Doesburg et al. (2008). Asynchrony from synchrony: long-range gamma-band neural synchrony accompanies perception of audiovisual speech asynchrony. Exp. Brain Res.
  • J. Driver et al. (1998). Cross-modal links in spatial attention. Philos. Trans. R. Soc. Lond. B Biol. Sci.
  • R. Fendrich et al. (2001). The temporal cross-capture of audition and vision. Percept. Psychophys.
  • M.H. Giard et al. (1999). Auditory–visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study. J. Cogn. Neurosci.