Elsevier

NeuroImage

Volume 21, Issue 1, January 2004, Pages 125-135

Rapid discrimination of visual and multisensory memories revealed by electrical neuroimaging

https://doi.org/10.1016/j.neuroimage.2003.09.035

Abstract

Though it is commonly held that multisensory experiences enrich our memories and that memories influence ongoing sensory processes, their neural mechanisms remain unresolved. Here, electrical neuroimaging shows that auditory–visual multisensory experiences alter subsequent processing of unisensory visual stimuli during the same block of trials, at early stages poststimulus onset and within visual object recognition areas. We show this with a stepwise analysis of scalp-recorded event-related potentials (ERPs) that statistically tested (1) ERP morphology and amplitude, (2) global electric field power, (3) topographic stability of and changes in the electric field configuration, and (4) intracranial distributed linear source estimations. Subjects performed a continuous recognition task, discriminating repeated from initial image presentations. Corresponding, but task-irrelevant, sounds accompanied half of the initial presentations within a given block of trials. On repeated presentations within a block, only images appeared, yielding two situations: the image's prior presentation had been either purely visual or accompanied by a sound. Image repetitions that had been accompanied by sounds yielded more accurate memory performance (old/new discrimination) and were differentiated as early as ∼60–136 ms from images that had not been accompanied by sounds, through generator changes in areas of the right lateral–occipital complex (LOC). It thus appears that unisensory percepts trigger the multisensory representations associated with them. The collective data support the hypothesis that perceptual or memory traces for multisensory auditory–visual events involve a distinct cortical network that is rapidly activated by subsequent repetition of just the unisensory visual component.
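The global field power measure in step (2) of the analysis above has a standard definition: the spatial standard deviation of the potential across all electrodes at each time point (Lehmann and Skrandies, 1980). The sketch below is not the authors' analysis code, just a minimal illustration of that formula, assuming the ERP data are held in a NumPy array of shape (electrodes × time points):

```python
import numpy as np

def global_field_power(erp):
    """Global field power: the spatial standard deviation across
    electrodes at each time point (Lehmann and Skrandies, 1980).

    erp : array of shape (n_electrodes, n_timepoints), in microvolts.
    The data are re-referenced to the average reference first, so the
    result does not depend on the recording reference.
    """
    centered = erp - erp.mean(axis=0, keepdims=True)  # average reference
    return np.sqrt((centered ** 2).mean(axis=0))      # spatial SD per time point
```

Because GFP is reference-independent, it provides a single strength measure per time point that can be compared across conditions without confounds from the choice of reference electrode, which is why it appears as a separate step from the waveform analyses.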

Introduction

Neurophysiological investigation of when and where in the brain one's memories or past experiences first affect responses to incoming stimuli has predominantly focused on the influence of unisensory memories; that is, stimulation of one sensory modality later influencing responses to stimuli within the same modality. For example, studies of repetition priming have shown that behavior and brain responses change with repeated exposure to the same or similar stimuli (e.g., Doniger et al., 2001; see Wiggs and Martin, 1998 for a review). More recently, investigations have begun examining how experiences in one or multiple senses alter later processing of stimuli of another sensory modality. These studies provide evidence that brain regions involved in an experience's encoding are also engaged during its subsequent active retrieval (e.g., James et al., 2002, Nyberg et al., 2000, Wheeler et al., 2000). In two of these studies, subjects learned auditory–visual or visual–visual associations during a study session and later classified visual stimuli according to the sensory modality in which they initially appeared (Nyberg et al., 2000, Wheeler et al., 2000). That auditory regions were active in response to visual stimuli that had first been presented with sounds was taken as support for the theory of ‘redintegration’ (Hamilton, 1859), wherein a component part is sufficient to (re)activate the consolidated representation of the whole experience. Intracranial microelectrode recordings in monkeys provide similar evidence by demonstrating selective delay activity during delayed match-to-sample tasks with visual–visual, somatosensory–visual, and auditory–visual paired associates (e.g., Colombo and Gross, 1994, Gibson and Maunsell, 1997, Haenny et al., 1988, Maunsell et al., 1991). In these studies, responses were elicited in cortical areas involved in visual object recognition (i.e., areas V4 and IT) by nonvisual stimuli and were selective for specific associations among the learned set.
One implication of these collective data is that prior multisensory experiences can influence and be part of memory functions such that when an association is formed between stimuli of different modalities, presentation of one stimulus can alter the activity in regions typically implicated in the processing of the modality of the other stimulus. That is, responses to an incoming stimulus may vary, either in terms of their pattern within a region or overall activated network, according to whether it is part of a multisensory or unisensory memory. However, it is not clear where or when (either in terms of time poststimulus or in terms of levels of processing) such effects first occur. These previous studies either lacked adequate temporal resolution or had limited spatial sampling to provide such information. Identification of the earliest effects can be used to place critical limits on the mechanisms of multisensory memory retrieval. It is similarly unclear whether such effects depend either on extensive training with the stimulus associations or on active classification of stimuli according to past experiences.

The aim of the present study was to determine the time course and initial locus of incidental effects of past multisensory experiences on current unisensory processes when subjects neither studied (through prolonged or repeated exposure) multisensory image–sound pairs nor later classified images according to the sense(s) initially stimulated. To this end, we applied high-density electrical neuroimaging techniques. We show that visual stimuli are rapidly differentiated according to their multisensory or unisensory association as early as ∼60–136 ms poststimulus onset in regions of the right lateral–occipital complex (LOC), a system of areas of the ventral visual cortical pathway involved in object recognition (e.g., Malach et al., 1995, Murray et al., 2002). This observation indicates that multisensory memories first alter visual sensory responses at early processing stages and provides evidence of the functional efficacy of prior multisensory experiences.

Section snippets

Subjects

Eleven (three female) paid volunteers aged 18–28 years (mean ± SD = 23.6 ± 3.4 years) provided written, informed consent to participate in the experiment, the procedures of which were approved by the Ethical Committee of the University Hospital of Geneva. All subjects were right-handed (Edinburgh questionnaire; Oldfield, 1971), had no neurological or psychiatric illnesses, had normal or corrected-to-normal vision, and reported normal hearing.

Stimuli and procedure

Subjects performed a continuous recognition task

Results

We restricted our analyses to responses to the repeated presentations (i.e., the V+ and V− conditions), reasoning that differences between responses to these stimuli reveal brain mechanisms of incidental multisensory memory discrimination. Subjects readily discriminated between initial and repeated presentations of images (overall accuracy across all four conditions = 91 ± 1%). More precisely, the subjects' accuracy was significantly higher for stimuli from the V+ than the V− condition (88.5%

Discussion

The present study examined the time course and initial locus of incidental effects of past multisensory experiences on current unisensory responses when subjects neither explicitly studied multisensory associations nor later classified stimuli according to these associations. Both the behavioral and electrophysiological data provided evidence for such effects. Image repetitions during a continuous recognition task were more accurately discriminated if they had initially been presented with a

Acknowledgements

We thank Olaf Blanke for discussions and comments on the manuscript and Christine Ducommun for technical assistance with auditory stimuli. We are also extremely grateful for the detailed and helpful commentaries of two anonymous reviewers. The Swiss National Science Foundation (grants #3238-62769.00 to A.S. and 3234-069264.02 to S.G.A.) provided financial support.

References (78)

  • A. Lavric et al.

    A double-dissociation of English past-tense production revealed by event-related potentials and low-resolution electromagnetic tomography (LORETA)

    Clin. Neurophysiol.

    (2001)
  • D. Lehmann et al.

    Reference-free identification of components of checkerboard-evoked multichannel potential fields

    Electroencephalogr. Clin. Neurophysiol.

    (1980)
  • B. Lütkenhöner et al.

    Magnetoencephalographic correlates of audiotactile interaction

    NeuroImage

    (2002)
  • E. Macaluso et al.

    Crossmodal spatial influences of touch on extrastriate visual areas take current gaze direction into account

    Neuron

    (2002)
  • C.M. Michel et al.

    Electric source imaging of human brain functions

    Brain Res., Brain Res. Rev.

    (2001)
  • S. Molholm et al.

    Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study

    Brain Res., Cogn. Brain Res.

    (2002)
  • R.C. Oldfield

    The assessment and analysis of handedness: the Edinburgh Inventory

    Neuropsychologia

    (1971)
  • F. Perrin et al.

    Mapping of scalp potentials by surface spline interpolation

    Electroencephalogr. Clin. Neurophysiol.

    (1987)
  • D. Pizzagalli et al.

    Face-elicited ERPs and affective attitude: brain electric microstate and tomography analyses

    Clin. Neurophysiol.

    (2000)
  • D.A. Pizzagalli et al.

    Affective judgments of faces modulate early activity (approximately 160 ms) within the fusiform gyri

    NeuroImage

    (2002)
  • T. Raij et al.

    Audiovisual integration of letters in the human brain

    Neuron

    (2000)
  • K.S. Rockland et al.

    Multisensory convergence in calcarine visual areas in macaque monkey

    Int. J. Psychophysiol.

    (2003)
  • C.E. Schroeder et al.

    Anatomical mechanisms and functional implications of multisensory convergence in early cortical processing

    Int. J. Psychophysiol.

    (2003)
  • M.R. Stoesz et al.

    Neural networks active during tactile form perception: common and differential activity during macrospatial and microspatial tasks

    Int. J. Psychophysiol.

    (2003)
  • D. Tsivilis et al.

    Context effects on the neural correlates of recognition memory: an electrophysiological study

    Neuron

    (2001)
  • C.L. Wiggs et al.

    Properties and mechanisms of perceptual priming

    Curr. Opin. Neurobiol.

    (1998)
  • A. Amedi et al.

    Visuo-haptic object-related activation in the ventral visual pathway

    Nat. Neurosci.

    (2001)
  • A. Amedi et al.

    Convergence of visual and tactile shape processing in the human lateral occipital complex

    Cereb. Cortex

    (2002)
  • S. Braeutigam et al.

    Task-dependent early latency (30–60 ms) visual processing of human faces and other objects

    NeuroReport

    (2001)
  • D. Brandeis et al.

    Segments of event-related potential map series reveal landscape changes with visual attention and subjective contours

    Electroencephalogr. Clin. Neurophysiol.

    (1989)
  • G.A. Calvert et al.

    Response amplification in sensory-specific cortices during cross-modal binding

    NeuroReport

    (1999)
  • M. Colombo et al.

    Responses of inferior temporal cortex and hippocampal neurons during delayed matching to sample in monkeys (Macaca fascicularis)

    Behav. Neurosci.

    (1994)
  • J.B. Debruille et al.

    ERPs and chronometry of face recognition: following-up Seeck et al. and George et al

    NeuroReport

    (1998)
  • E. Deibert et al.

    Neural pathways in tactile object recognition

    Neurology

    (1999)
  • E. Donchin et al.

    Cognitive psychophysiology: the endogenous components of the ERP

  • G.M. Doniger et al.

    Activation timecourse of ventral visual stream object-recognition areas: high density electrical mapping of perceptual closure processes

    J. Cogn. Neurosci.

    (2000)
  • A. Falchier et al.

    Anatomical evidence of multimodal integration in primate striate cortex

    J. Neurosci.

    (2002)
  • D.H. Fender

    Source localisation of brain electrical activity

  • A. Fort et al.

    Dynamics of cortico-subcortical cross-modal operations involved in audio–visual object detection in humans

    Cereb. Cortex

    (2002)