Ocular tracking as a measure of auditory motion perception
Introduction
The goal of perceptual processing is to create an internal (neural) representation of the external environment. A primary dimension of most sensory representations is concerned with the specification of spatial location. This is certainly the case for vision and audition, for it is indeed difficult to imagine a visual or auditory percept that does not include a localization component.
A great deal of neural circuitry is devoted to performing the computations needed to localize visual and auditory stimuli. In fact, the most prominent theory of the neural mechanisms of visual processing hypothesizes that an extensive network of cortical areas extending throughout the occipital and parietal lobes is concerned primarily with coding visuo-spatial information [67]. One reason why the spatial dimension of sensory processing is so important is that spatial representations are needed to guide movements. Thus, it is not surprising that much of the visuo-spatial processing that occurs in the 'dorsal processing stream' is concerned with the transformation of visuo-spatial information into pre-motor commands relating to reaching and the control of eye movements (e.g., [19], [20], [37], [50]). Auditory spatial information is also distributed to portions of this dorsal processing stream, where a similar transformation into pre-motor signals appears to occur (e.g., [13], [15], [39], [55], [69], [73]).
In humans, psychophysical evidence suggests that the neural representation of visual space is more elaborate and precise than the representation of acoustic space. Consider, for example, vernier acuity, the finest measure of visual localization (e.g., [70], [71]). In a typical vernier acuity task, an observer is required to detect a spatial misalignment between two visual contours, as illustrated in Fig. 1. The threshold for such judgments corresponds to retinal offsets substantially smaller than the diameter of individual foveal cones: under optimal conditions, vernier acuity thresholds are approximately 5 arcseconds of visual angle (0.0014°). By this standard, auditory localization is fairly coarse. The minimum audible angle (MAA), the finest measure of auditory spatial offset, is slightly greater than 1° of azimuth [49]. Thus, the threshold for spatial offsets in the frontal plane is roughly 700 times greater for auditory stimuli than for visual stimuli. Consistent with these psychophysical data are observations that saccadic eye movements to auditory stimuli are less accurate than visually guided saccades (e.g., [22], [32], [35]).
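The ~700-fold figure follows directly from the two thresholds quoted above; a quick arithmetic check (values taken from the text):

```python
# Back-of-the-envelope check of the visual/auditory acuity ratio.
# Vernier acuity threshold: ~5 arcseconds = 5/3600 degrees.
vernier_deg = 5 / 3600   # optimal vernier acuity, in degrees (~0.0014 deg)
maa_deg = 1.0            # minimum audible angle, ~1 deg of azimuth

ratio = maa_deg / vernier_deg
print(f"vernier threshold ≈ {vernier_deg:.4f} deg")   # 0.0014 deg
print(f"auditory/visual threshold ratio ≈ {ratio:.0f}x")  # ≈ 720x
```

The exact ratio (720) depends on taking the vernier threshold at exactly 5 arcseconds; "about 700 times" is the appropriately rounded claim.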
The manner in which auditory position information is distributed to the oculomotor system appears to be designed specifically to permit poly-modal access to the saccadic control system [35]. Consistent with this idea is the observation that saccades to bimodal (auditory + visual) targets are faster than saccades to either auditory or visual stimuli alone. The magnitude of this facilitation cannot be accounted for by probability summation, i.e., the gain produced by two independent detection processes, one for vision and one for audition [32], [33]. As a result, inter-sensory facilitation of saccades has been interpreted as a psychophysical correlate of the convergence of auditory and visual inputs onto single neurons that is known to occur within the superior colliculus (e.g., [18], [36], [47]).
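Probability summation predicts some speedup on purely statistical grounds: if two independent detectors race, the winner's latency distribution is faster than either detector's alone. A toy simulation (hypothetical latency parameters, not the study's data) illustrates this baseline effect; bimodal facilitation exceeding it is what implicates genuine neural convergence:

```python
import random

random.seed(1)

def latency(mu, sigma):
    # Toy Gaussian latency with an 80 ms floor (hypothetical values)
    return max(80.0, random.gauss(mu, sigma))

n = 20000
vis = [latency(180, 30) for _ in range(n)]   # hypothetical visual saccade latencies (ms)
aud = [latency(200, 40) for _ in range(n)]   # hypothetical auditory saccade latencies (ms)

# Probability summation (race model): the bimodal response is triggered by
# whichever independent detector finishes first.
race = [min(v, a) for v, a in zip(vis, aud)]

print(f"mean visual   : {sum(vis) / n:6.1f} ms")
print(f"mean auditory : {sum(aud) / n:6.1f} ms")
print(f"mean race min : {sum(race) / n:6.1f} ms")  # faster than either alone
```

If observed bimodal latencies are faster even than this race-model prediction, statistical facilitation cannot be the whole story, which is the logic behind the neural-convergence interpretation above.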
While the saccade data indicate that neural representations of static locations of auditory and visual stimuli are distributed to neural centers that generate saccadic eye movements (or reaches), it is unclear whether analogous circuits exist for auditory and visual motion. Moreover, if such circuits do exist, is there poly-modal facilitation comparable to that seen in saccadic eye movements?
To address these questions, we compared smooth pursuit eye movements (SPEMs) for moving auditory and visual stimuli. SPEMs are classically associated with tracking of a moving visual stimulus and typically require a neural signal specifying target velocity (e.g., [10], [61]). We reasoned that, because SPEMs rely on such velocity signals, any evidence of SPEMs driven by auditory motion would imply that the human auditory system extracts information specifying stimulus velocity and distributes that information to the smooth pursuit control system.
If, however, ocular tracking of auditory motion is not possible, we are left with two possible explanations. It may be that the auditory system generates the relevant motion signals, but those signals do not gain access to the pursuit control system. Alternatively, a failure to observe SPEMs elicited by auditory motion might be attributed to an absence of motion detectors in the human auditory system. It is this latter possibility that is the primary focus of this article.
We evaluated the quality of ocular tracking in response to two types of motion: motion in the frontal plane and motion in depth. Motion in the frontal plane produces conjunctive SPEMs, whereas motion in depth produces disjunctive SPEMs, or vergence eye movements. In the case of visually-guided pursuit, it is thought that different mechanisms drive conjunctive and disjunctive SPEMs––stereoscopic cues are considered essential for disjunctive SPEMs, and motion cues are considered essential for conjunctive SPEMs. Based on the evidence that each type of movement relies on different visual cues and different motor systems (cf., [10]), we decided to study possible auditory control over each type of movement.
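The conjunctive/disjunctive distinction corresponds to the standard decomposition of binocular eye position into version and vergence components (version = mean of the two eyes, vergence = their difference). A minimal sketch with hypothetical horizontal eye-position samples:

```python
# Decomposing binocular eye position into conjunctive (version) and
# disjunctive (vergence) components, using the standard convention:
#   version  = (left + right) / 2   (conjugate gaze direction)
#   vergence = left - right         (convergence angle)
# Positions are hypothetical, in degrees (positive = rightward).

def version_vergence(left_eye, right_eye):
    version = [(l + r) / 2 for l, r in zip(left_eye, right_eye)]
    vergence = [l - r for l, r in zip(left_eye, right_eye)]
    return version, vergence

# A target moving in the frontal plane drives both eyes together:
ver, vg = version_vergence([0.0, 1.0, 2.0, 3.0], [0.0, 1.0, 2.0, 3.0])
print(ver, vg)   # version changes; vergence stays at 0

# A target moving in depth drives the eyes in opposite directions:
ver, vg = version_vergence([1.0, 1.5, 2.0], [-1.0, -1.5, -2.0])
print(ver, vg)   # version stays at 0; vergence changes
```

Frontal-plane motion shows up almost entirely in the version trace, motion in depth almost entirely in the vergence trace, which is why the two tracking behaviors can be studied separately.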
Ocular pursuit of auditory motion in the frontal plane
We begin with an examination of ocular tracking of auditory motion in the frontal plane for several reasons. First, the processes that support auditory localization in the frontal plane are reasonably well understood (cf., [6], [14], [21], [28], [48], [56], [72]). The eccentricity of a sound source is computed through comparisons of the auditory signals arriving at each ear. Since the classic analysis by Lord Rayleigh [56], it has been recognized that two distinct binaural cues support sound localization.
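The two binaural cues of Rayleigh's duplex theory are interaural time differences (ITDs) and interaural level differences (ILDs). The ITD available at a given azimuth is often approximated with Woodworth's spherical-head formula, ITD = (r/c)(θ + sin θ); the formula is a standard textbook approximation, not taken from this article, and the head-radius value below is an assumed typical figure:

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Woodworth's spherical-head approximation of the interaural
    time difference (ITD), in seconds, for a distant source:
        ITD = (r / c) * (theta + sin(theta))
    where r is the head radius and c the speed of sound."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + math.sin(theta))

for az in (1, 10, 45, 90):
    print(f"{az:3d} deg -> ITD ≈ {woodworth_itd(az) * 1e6:6.1f} µs")
```

At the ~1° minimum audible angle, the corresponding ITD is only on the order of 10 µs, which conveys the temporal precision the binaural system achieves.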
Methods
The results from Experiment 1 indicate that ocular tracking of auditory motion is accomplished almost exclusively by generating a succession of saccades. These observations confirm and extend earlier evidence that auditory motion is incapable of supporting smooth pursuit eye movements. However, it is more than a matter of poor smooth pursuit of auditory motion: the quality of ocular tracking of auditory motion was statistically indistinguishable from attempts to track imagined motion.
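Tracking quality in studies of this kind is typically quantified as velocity gain: smooth (desaccaded) eye velocity divided by target velocity, with gain near 1 indicating good pursuit and near 0 a purely saccadic "staircase" like the one described above. A minimal sketch of such an analysis, with a hypothetical sampled eye-position trace and an assumed velocity threshold for saccade removal (the article's actual analysis pipeline is not specified here):

```python
# Sketch of pursuit-gain analysis: differentiate eye position, excise
# saccade samples with a velocity threshold, and compute
# gain = mean smooth eye velocity / target velocity.
# All numbers are illustrative; real analyses filter and desaccade
# far more carefully.

def pursuit_gain(eye_pos_deg, dt_s, target_vel_dps, saccade_thresh_dps=50.0):
    vel = [(b - a) / dt_s for a, b in zip(eye_pos_deg, eye_pos_deg[1:])]
    smooth = [v for v in vel if abs(v) < saccade_thresh_dps]
    if not smooth:
        return 0.0
    return (sum(smooth) / len(smooth)) / target_vel_dps

# Staircase-like trace: slow drift punctuated by catch-up saccades
dt = 0.002                     # 500 Hz sampling
trace, pos = [], 0.0
for i in range(500):
    pos += 2.0 * dt            # smooth drift at only 2 deg/s
    if i % 100 == 99:
        pos += 1.8             # catch-up saccade
    trace.append(pos)

print(f"gain ≈ {pursuit_gain(trace, dt, target_vel_dps=10.0):.2f}")  # ≈ 0.20
```

A gain this low (0.2 for a 10 deg/s target) is the quantitative signature of saccadic rather than smooth tracking.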
General discussion
The subjective impression that we experience auditory motion in much the same manner as we experience visual motion can be quite compelling. Yet objective data lending support to this phenomenology have not been forthcoming. To our knowledge, neurons that are specifically responsive to moving sound sources have not been reported, although not because of a lack of effort. A variety of subtle effects have been described that could be a component operation in the computation of auditory motion [1],
Acknowledgements
This work was supported in part by NIH NS17778. The authors thank the Fondation des Treilles for their generous support of the conference which formed the basis for this issue of the Journal of Physiology, Paris.
References (72)
A short-range process in apparent motion. Vision Res (1974).
Maps versus clusters: different representations of auditory space in the midbrain and forebrain. Trends Neurosci (1999).
Who goes there? Neuron (1999).
Auditory-evoked saccades in two dimensions: dynamical characteristics, influence of eye position, and sound source spectrum.
Spatial characteristics of visual-auditory summation in human saccades. Vision Res (1998).
Slow eye movements. Prog. Neurobiol. (1997).
Grasping objects: the cortical mechanisms of visuomotor transformation. Trends Neurosci (1995).
Auditory cortical neurons in the cat sensitive to the direction of sound source movement. Brain Res (1974).
On the mechanism that encodes the movement of contrast variations: velocity discrimination. Vision Res (1989).
Perception of sound-source motion by the human brain. Neuron (2002).
Spatial configurations for visual hyperacuity. Vision Res.
Encoding of sound-source location and movement: activity of single neurons and interactions between adjacent neurons in the monkey auditory cortex. J. Neurophysiol.
Infant responses to impending collision: optical and real. Science.
Summation and inhibition in the frog's retina. J. Physiol. (Lond.).
Selective sensitivity to direction of movement in ganglion cells of the rabbit retina. Science.
A movement-sensitive area in auditory cortex. Nature.
An introduction to binaural technology.
Infant responses to approaching objects: an indicator of response to distal variables. Percept. Psychophys.
Landing reaction of Musca domestica induced by visual stimuli. Naturwissenschaften.
Movements of the Eyes.
Motion: the long and short of it. Spatial Vision.
Drift-balanced random stimuli: a general basis for studying non-Fourier motion perception. JOSA.
A common reference frame for movement plans in the posterior parietal cortex. Nat. Rev. Neurosci.
A contingent aftereffect in the auditory system. Nat. Neurosci.
The auditory motion aftereffect: its tuning and specificity in the spatial and frequency domains. Percept. Psychophys.
Physiology of visual cells in mouse superior colliculus and correlation with somatosensory and auditory input. Nature.
Responses of monkey MST neurons to optic flow stimuli with shifted centers of motion. J. Neurosci.
The updating of the representation of visual space in parietal cortex by intended eye movements. Science.
Localization of high-frequency tones. J. Acoust. Soc. Am.
Eye tracking of self-moved targets in the absence of vision. Exp. Brain Res.
Eye movements in response to real and apparent motions of acoustic targets. Percept. Motor Skills.
Hans Gertz revisited: the different effects of invisibility and darkness on pursuit eye movements. Perception.
Motion aftereffects with horizontally moving sound sources in the free field. Percept. Psychophys.
Auditory motion aftereffects. Percept. Psychophys.
An Introduction to Hearing.
Human brain areas involved in the analysis of auditory movement. Hum. Brain Map.