Abstract
Natural movements, such as catching a ball or capturing prey, typically involve multiple senses. Yet, laboratory studies on human movements commonly focus solely on vision and ignore sound. Here we ask how visual and auditory signals are integrated to guide interceptive movements. Human observers tracked the brief launch of a simulated baseball, randomly paired with batting sounds of varying intensities, and made a quick pointing movement at the ball. Movement endpoints revealed systematic overestimation of target speed when the ball's launch was paired with a loud versus a quiet sound, even though sound was never informative. This effect was modulated by the availability of visual information: sounds biased interception when the visual presentation duration of the ball was short. The amplitude of the first catch-up saccade, occurring ∼125 ms after target launch, revealed early integration of audio-visual information for trajectory estimation. This sound-induced bias was reversed during later predictive saccades, when more visual information was available. Our findings suggest that auditory and visual signals are integrated to guide interception and that this integration process must occur early, at a neural site that receives auditory and visual signals within an ultrashort time span.
Significance Statement
Almost all everyday actions, from catching a ball to driving a car, rely heavily on vision. Even though moving objects in our natural visual environment also make sounds, the influence of auditory signals on motor control is commonly ignored. This study investigates the effect of sound on vision-guided interception. We show that sound systematically biases interception movements, indicating that observers associate louder sounds with faster target speeds. Measuring eye movements during interception revealed that vision and sound are integrated rapidly and early in the sensory processing hierarchy. Training and rehabilitation approaches in sports and medicine could harness the finding that interceptive hand movements are driven by multisensory signals, not vision alone.
Footnotes
The authors declare no conflicts of interest.
The authors thank Anna Montagnini and members of the Spering lab for helpful comments.
P.K. and A.S. share first authorship.
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license, which permits unrestricted use, distribution and reproduction in any medium provided that the original work is properly attributed.