Research Report

Improvement of visual contrast detection by a simultaneous sound
Introduction
During everyday experience, auditory and visual stimuli are not separated into independent modalities but usually appear in close coordination. A snake wriggling through the grass makes a characteristic rustling sound, and a thunderstorm announces itself with both lightning and thunder. In general, combining sensory information can enhance perceptual clarity and reduce ambiguity about the sensory environment (Ernst and Bülthoff, 2004, Stein and Meredith, 1993). For example, it has been demonstrated that combined sensory information can speed reaction times (Gielen et al., 1983, Hershenson, 1962, Posner et al., 1976), facilitate learning (Seitz et al., 2006) and change the qualitative sensory experience (Jousmäki and Hari, 1998, McGurk and MacDonald, 1976, Shams et al., 2000). Although many of these cross-modal phenomena are attributed to high-level cognitive processes, others are thought to arise from early and hard-wired sensory integration (Stein, 1998).
In particular, such early sensory integration is thought to mediate cross-modal improvement of low-level detection tasks. For example, a simultaneous tone improved detection of a dimly flashed light (Frassinetti et al., 2002a, Frassinetti et al., 2002b, McDonald et al., 2000, Teder-Sälejärvi et al., 2005), enhanced the discriminability of briefly flashed visual patterns (Vroomen and de Gelder, 2000) or increased the perceived luminance of light (Stein et al., 1996). While these studies suggest that early sensory integration serves as the basis for the improved visual performance, other studies propose that the observed effects result from biases of the cognitive decision process related to the particular paradigms employed (Doyle and Snowden, 2001, Odgaard et al., 2003).
One could conceive that cross-connections between early sensory areas, as for example demonstrated from the auditory to the visual cortex (Falchier et al., 2002, Rockland and Ojima, 2003), facilitate processing in one sense by input from another. It could also be that the superior colliculus, a subcortical structure containing many neurons that respond to bi- or trimodal stimuli, mediates cross-modal improvements in simple detection tasks (Stein, 1988, Stein and Meredith, 1993). However, many behavioral protocols used previously do not allow a clear dissociation between early sensory integration and cognitive effects related to changes in decision making (Odgaard et al., 2003). For example, subjects could explicitly combine the information they gather from each sense and adjust their behavioral strategy depending on whether or not this seems advantageous on a cognitive level.
To address this controversy, we systematically quantified the effect of a simultaneous sound on a contrast detection task. We compared different paradigms based on the following reasoning: An early and automatic auditory influence on vision should occur regardless of whether the sound provides additional information about the visual stimulus or is redundant with the visual display. In addition, such an influence should not depend on the subjects' knowledge about the informative relation between both stimuli. A cognitive effect, however, should manifest only when the sound provides additional information over the visual stimulus, and even then, only when subjects are aware of the additional information.
To distinguish between these two possibilities, we manipulated the informative content of the sound. In different paradigms the temporal uncertainty of the visual stimulus was reduced either by the sound (“informative sound”), or by a visual cue that appeared simultaneously with the target and made the sound redundant (“redundant sound”). Additionally, we manipulated the subjects' knowledge about the informative content of the sound by randomizing the stimulus onset asynchrony. Our results demonstrate that a behavioral benefit of the sound occurs only in the “informative sound” condition, and only when the sound has a reliable and fixed timing relative to the visual target.
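The logic of this comparison can be summarized as a small decision table. The sketch below is purely illustrative (the function and condition names are ours, not part of the study); it simply encodes the predictions stated above:

```python
# Predicted sound-induced benefit under the two hypotheses contrasted above.
# The condition flags mirror the paradigms described in the text; the table
# is an illustrative summary of the predictions, not data from the study.

def predicted_benefit(sound_informative: bool, relation_known: bool,
                      hypothesis: str) -> bool:
    if hypothesis == "early_integration":
        # An automatic sensory interaction should act whenever sound and
        # target coincide, regardless of informativeness or knowledge.
        return True
    if hypothesis == "cognitive":
        # A decision-level effect requires the sound to carry additional
        # information and the subject to be aware of that relation.
        return sound_informative and relation_known
    raise ValueError(f"unknown hypothesis: {hypothesis}")

for informative in (True, False):
    for known in (True, False):
        print(f"informative={informative!s:5} known={known!s:5} "
              f"early={predicted_benefit(informative, known, 'early_integration')} "
              f"cognitive={predicted_benefit(informative, known, 'cognitive')}")
```

The two hypotheses diverge in every cell except the fully informative, fully known condition, which is why the paradigms vary both factors.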
Redundant sounds do not improve visual detection
We measured contrast detection curves in a paradigm where the timing of the visual target varied randomly from trial to trial. On half the trials, a sound was presented in synchrony with the target, informing the subject about the time point of target presentation. In the first experiment (Fig. 1A), an additional visual cue also indicated the timing of the target and rendered the sound uninformative, i.e. redundant with the visual display. Comparing the subjects' performance on trials with and without the sound revealed no improvement of detection by the redundant sound.
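Contrast detection curves of this kind are typically summarized by fitting a psychometric function to the proportion of detected targets at each contrast level. The sketch below is a minimal illustration under our own assumptions (the snippet above does not specify the fit; the Weibull form, the fixed slope, the two-alternative guess rate, and the simulated observer are all ours):

```python
import math
import random

def weibull(contrast, threshold, slope, guess=0.5, lapse=0.02):
    """Probability of a correct detection at a given contrast
    (standard Weibull psychometric function, 2AFC guess rate assumed)."""
    return guess + (1.0 - guess - lapse) * (
        1.0 - math.exp(-((contrast / threshold) ** slope)))

def fit_threshold(contrasts, n_correct, n_trials, slope=3.0):
    """Grid-search maximum-likelihood estimate of the detection threshold."""
    best, best_ll = None, -math.inf
    # Candidate thresholds from 0.5% to 50% contrast in 0.1% steps.
    for t in (i / 1000 for i in range(5, 500)):
        ll = 0.0
        for c, k, n in zip(contrasts, n_correct, n_trials):
            p = min(max(weibull(c, t, slope), 1e-6), 1 - 1e-6)
            ll += k * math.log(p) + (n - k) * math.log(1 - p)
        if ll > best_ll:
            best, best_ll = t, ll
    return best

# Simulated observer with a true threshold of 10% contrast.
random.seed(0)
contrasts = [0.02, 0.04, 0.08, 0.16, 0.32]
n_trials = [80] * len(contrasts)
n_correct = [sum(random.random() < weibull(c, 0.10, 3.0) for _ in range(n))
             for c, n in zip(contrasts, n_trials)]
print(fit_threshold(contrasts, n_correct, n_trials))
```

Comparing thresholds fitted separately to sound and no-sound trials is one straightforward way to quantify a cross-modal benefit; a leftward shift of the curve (lower threshold) on sound trials would indicate improved detection.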
Discussion
Cross-modal interactions between sensory systems can occur at many stages of sensory processing and by virtue of several mechanisms. Previous studies proposed that early and hard-wired sensory interactions form the basis of improved performance in low-level visual tasks such as the detection of briefly presented targets or judgments of perceived brightness (Frassinetti et al., 2002a, Frassinetti et al., 2002b, Stein et al., 1996, Teder-Sälejärvi et al., 2005, Vroomen and de Gelder, 2000). Our results, in contrast, indicate that the behavioral benefit of a simultaneous sound depends on its informative content and on a reliable temporal relation to the visual target.
Experimental procedures
Volunteer subjects (20–36 years, both sexes) received financial compensation for their participation and gave written informed consent before the experiment. All had normal or corrected-to-normal vision and were naïve about the aim of the study.
Experiments were conducted in a dark and sound-attenuated room. Stimulus presentation and data acquisition were controlled using a real-time operating system (QNX, QNX Software Systems Ltd). Visual stimuli were presented on a CRT monitor (19 in., 85 Hz refresh rate).
Acknowledgment
This study was supported by the Max Planck Society.
References (35)
- Crossmodal attention. Curr. Opin. Neurobiol. (1998)
- Multisensory perception: beyond modularity and convergence. Curr. Biol. (2000)
- Merging the senses into a robust percept. Trends Cogn. Sci. (2004)
- Parchment-skin illusion: sound-biased touch. Curr. Biol. (1998)
- An irrelevant light enhances auditory detection in humans: a psychophysical analysis of multisensory integration in stimulus detection. Brain Res. Cogn. Brain Res. (2003)
- Cross-modal interactions in auditory and visual discrimination. Int. J. Psychophysiol. (2003)
- Attentional modulation strength in cortical area MT depends on stimulus contrast. Neuron (2002)
- Attention increases sensitivity of V4 neurons. Neuron (2000)
- Multisensory convergence in calcarine visual areas in macaque monkey. Int. J. Psychophysiol. (2003)
- Sound facilitates visual learning. Curr. Biol. (2006)
- Superior colliculus-mediated visual behaviors in cat and the concept of two corticotectal systems. Prog. Brain Res.
- Attentional effects on contrast detection in the presence of surround masks. Vis. Res.
- Identification of visual stimuli is improved by accompanying auditory stimuli: the role of eye movements and sound location. Perception
- Anatomical evidence of multimodal integration in primate striate cortex. J. Neurosci.
- Enhancement of visual perception by crossmodal visuo-auditory interaction. Exp. Brain Res.
- Acoustical vision of neglected stimuli: interaction among spatially converging audiovisual inputs in neglect patients. J. Cogn. Neurosci.
- On the nature of intersensory facilitation of reaction time. Percept. Psychophys.