Abstract
Many studies have demonstrated higher accuracy in perception and action when more than one sense is used. The maximum-likelihood estimation (MLE) model offers a recent account of how perceptual information is integrated across sensory modalities, suggesting statistically optimal integration. The purpose of the present study was to investigate how visual and proprioceptive movement information is integrated for the perception of trajectory geometry. To test this, participants sat in front of an apparatus that moved a handle along a horizontal plane. Participants had to decide whether two consecutive trajectories formed an acute or an obtuse movement path. Judgments had to be based either on information from a single modality alone, i.e., vision or proprioception, or on the combined information of both modalities. We estimated the bias and variance for each single-modality condition and used the MLE model to predict these parameters for the bimodal condition. Consistent with previous findings, variability in perceptual judgments about trajectory geometry decreased when combined visual-proprioceptive information was available. Furthermore, the observed bimodal data corresponded well to the predicted parameters. Our results suggest that visual and proprioceptive movement information is integrated in a statistically optimal manner for the perception of trajectory geometry.
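The MLE prediction described in the abstract can be sketched numerically. Under the standard MLE cue-combination model, each cue is weighted by its relative reliability (inverse variance), and the predicted bimodal variance is lower than either unimodal variance. The function name and example values below are illustrative, not taken from the study:

```python
def mle_bimodal_prediction(bias_v, var_v, bias_p, var_p):
    """Predict bimodal bias and variance from unimodal (visual,
    proprioceptive) estimates under the MLE cue-combination model."""
    # Each cue is weighted by its relative reliability (inverse variance).
    w_v = (1 / var_v) / (1 / var_v + 1 / var_p)
    w_p = 1 - w_v
    bias_vp = w_v * bias_v + w_p * bias_p
    # The combined variance is never larger than the smaller unimodal one.
    var_vp = (var_v * var_p) / (var_v + var_p)
    return bias_vp, var_vp

# With equal unimodal variances, the biases are averaged and the
# predicted bimodal variance is halved:
print(mle_bimodal_prediction(0.0, 4.0, 2.0, 4.0))  # → (1.0, 2.0)
```

Testing optimal integration then amounts to comparing the empirically measured bimodal bias and variance against these predicted values.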
Acknowledgments
This research was supported by grant Fi 1567 from the German Research Foundation (DFG) awarded to Katja Fiehler and Frank Rösler, by the DFG research unit FOR 560 'Perception and Action', and by the TransCoop Program of the Alexander von Humboldt Foundation awarded to Katja Fiehler and Denise Y.P. Henriques. We thank Stefan Westermann and Oguz Balandi for programming the experiment and Iseult Beets for helpful comments on the manuscript.
Reuschel, J., Drewing, K., Henriques, D.Y.P. et al. Optimal integration of visual and proprioceptive movement information for the perception of trajectory geometry. Exp Brain Res 201, 853–862 (2010). https://doi.org/10.1007/s00221-009-2099-4