Research Article: New Research, Sensory and Motor Systems

A Somatosensory Computation That Unifies Limbs and Tools

Luke E. Miller, Cécile Fabio, Frédérique de Vignemont, Alice Roy, W. Pieter Medendorp and Alessandro Farnè
eNeuro 17 October 2023, 10 (11) ENEURO.0095-23.2023; https://doi.org/10.1523/ENEURO.0095-23.2023
Luke E. Miller
1Integrative Multisensory Perception Action and Cognition Team-ImpAct, Lyon Neuroscience Research Center, Institut National de la Santé et de la Recherche Médicale Unité 1028, Centre National de la Recherche Scientifique Unité 5292, 69500 Bron, France
2UCBL, University of Lyon 1, 69100 Villeurbanne, France
3Neuro-immersion, Hospices Civils de Lyon, 69500 Bron, France
4Donders Institute for Brain, Cognition and Behaviour, Radboud University, 6525 GD, Nijmegen, The Netherlands
Cécile Fabio
1Integrative Multisensory Perception Action and Cognition Team-ImpAct, Lyon Neuroscience Research Center, Institut National de la Santé et de la Recherche Médicale Unité 1028, Centre National de la Recherche Scientifique Unité 5292, 69500 Bron, France
2UCBL, University of Lyon 1, 69100 Villeurbanne, France
3Neuro-immersion, Hospices Civils de Lyon, 69500 Bron, France
Frédérique de Vignemont
7Institut Jean Nicod, Department of Cognitive Studies, Ecole Normale Superieure, Paris Sciences et Lettres University, 75005 Paris, France
Alice Roy
6Laboratoire Dynamique du Langage, Centre National de la Recherche Scientifique, Unité Mixte de Recherche 5596, 69007 Lyon, France
W. Pieter Medendorp
4Donders Institute for Brain, Cognition and Behaviour, Radboud University, 6525 GD, Nijmegen, The Netherlands
Alessandro Farnè
1Integrative Multisensory Perception Action and Cognition Team-ImpAct, Lyon Neuroscience Research Center, Institut National de la Santé et de la Recherche Médicale Unité 1028, Centre National de la Recherche Scientifique Unité 5292, 69500 Bron, France
2UCBL, University of Lyon 1, 69100 Villeurbanne, France
3Neuro-immersion, Hospices Civils de Lyon, 69500 Bron, France
5Center for Mind/Brain Sciences, University of Trento, 38068 Rovereto, Italy

Figures & Data

Figures
Figure 1.

Model of trilateration and tool-sensing paradigm. A, The trilateral computation applied to the space of the arm (bottom) and a hand-held rod (top). Distance estimates between the sensory input (black) and each boundary (d1 and d2) are integrated (purple) to form a location estimate. B, In our model, the noise in each distance estimate (d1, d2) increases linearly with distance. The integrated estimate forms an inverted U-shaped pattern. C, Two tool-sensing tasks used to characterize tactile localization on a hand-held rod. The purple arrow corresponds to the location of touch in tool-centered space. The red square corresponds to the judgment of location within the computer screen.
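The variance profile sketched in panel B follows from two properties stated in the caption: each distance estimate's noise grows linearly with distance from its boundary, and the two estimates are integrated into a single location estimate. A minimal numerical sketch (with illustrative noise parameters, not the paper's fitted values) reproduces the inverted-U pattern under optimal inverse-variance fusion:

```python
import numpy as np

def trilaterate_variance(x, sigma0=0.05, slope=0.2):
    """Variance of the integrated location estimate at position x in [0, 1].

    Each boundary (handle at 0, tip at 1) supplies a distance estimate
    whose standard deviation grows linearly with distance; the two
    estimates are fused by inverse-variance (maximum-likelihood) weighting.
    Illustrative parameters, not the paper's fitted values.
    """
    var1 = (sigma0 + slope * x) ** 2          # noise grows from the handle
    var2 = (sigma0 + slope * (1 - x)) ** 2    # noise grows from the tip
    return var1 * var2 / (var1 + var2)        # optimal fusion of d1 and d2

xs = np.linspace(0, 1, 11)
fused = trilaterate_variance(xs)  # largest mid-rod, smallest at the ends
```

The inverted U arises because at least one boundary is always nearby at the ends, whereas both distance estimates are noisy in the middle of the rod.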

Figure 2.

    Vibration modes and feature space. A, The shape of the first five modes ω for contact on a cantilever rod. The weight of each mode varies as a function of hit location. Each hit location is characterized by a unique combination of mode weights. B, The vibration-location feature space (purple) from handle (X1) to tip (X2). This feature space is isomorphic with the actual physical space of the rod. ω corresponds to a resonant frequency, the black dot corresponds to the hit location (as in Fig. 1A) within the feature space, and the arrows are the gradients of distance estimation during trilateration.
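The caption's claim that each hit location is characterized by a unique combination of mode weights can be illustrated with the standard Euler–Bernoulli mode shapes of a clamped–free (cantilever) beam: for an impulsive contact, the weight of mode n is proportional to the mode shape evaluated at the contact point. This is a textbook idealization of a uniform beam, not the specific rod used in the experiments:

```python
import numpy as np

# Clamped-free (cantilever) beam: the first roots of
# cos(bL)*cosh(bL) = -1 set the resonant modes.
BETA_L = np.array([1.8751, 4.6941, 7.8548, 10.9955, 14.1372])

def mode_shapes(x):
    """Mode shapes phi_n(x) for x in [0, 1] (handle = 0, tip = 1).

    An impulse at location x excites mode n with a weight proportional
    to phi_n(x), so each contact point maps to a unique weight vector.
    """
    phis = []
    for bl in BETA_L:
        sigma = (np.sinh(bl) - np.sin(bl)) / (np.cosh(bl) + np.cos(bl))
        phi = (np.cosh(bl * x) - np.cos(bl * x)
               - sigma * (np.sinh(bl * x) - np.sin(bl * x)))
        phis.append(phi)
    return np.array(phis)

# Two different hit locations produce two distinct weight combinations:
w_mid, w_tip = mode_shapes(0.5), mode_shapes(0.9)
```

Evaluating the five shapes at a contact point yields the five-dimensional weight vector that the feature space of panel B is built from.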

Figure 3.

Neural network implementation of trilateration. A, Lower panel: the Mode layer is composed of subpopulations (two shown here) sensitive to the weight of individual modes (Fig. 2A), which are location dependent. Middle panel: the Feature layer takes input from the Mode layer and encodes the feature space (Fig. 2B), which forms the isomorphism with the physical space of the tool. Upper panel: the Distance layer is composed of two subpopulations of neurons with distance-dependent gradients in tuning properties (shown: firing rate and tuning width). The distance of a tuning curve from its “anchor” is coded by luminance, with darker colors corresponding to neurons closer to the spatial boundary. B, Activations for each layer of the network, averaged over 5000 simulations, when touch was at 0.75 (space between 0 and 1). Each dot corresponds to a unit of the neural network. Lower panel: Mode layer, with three of five subpopulations shown; middle panel: Feature layer; upper panel: Distance layer, with the localization estimates of each decoding subpopulation.
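The Distance layer's gradients can be sketched as a population of tuned units whose tuning width broadens and whose peak rate drops with preferred distance from the anchor, so that decoding precision degrades with distance. The gradient values below are illustrative assumptions, not the fitted parameters in Table 1 (whose κ entries suggest von Mises tuning; a Gaussian bump is substituted here for simplicity):

```python
import numpy as np

def distance_layer(touch, n_units=50, rng=None):
    """Sketch of one Distance-layer subpopulation anchored at 0.

    Units prefer locations along the rod; tuning width broadens and
    peak firing drops with distance from the anchor, so decoding noise
    grows with distance. Gradients are illustrative, not fitted values.
    """
    rng = rng or np.random.default_rng(0)
    prefs = np.linspace(0, 1, n_units)
    widths = 0.05 + 0.15 * prefs           # broader tuning far from anchor
    gains = 30.0 * (1.0 - 0.5 * prefs)     # weaker response far from anchor
    rates = gains * np.exp(-0.5 * ((touch - prefs) / widths) ** 2)
    return rng.poisson(rates)              # Poisson spiking

spikes = distance_layer(0.75)  # one simulated population response
```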

Figure 4.

    Localization and variable error for both tasks. A, Regressions fit to the localization judgments for both the image-based (blue) and space-based (orange) tasks. Error bars correspond to the group-level 95% confidence interval. B, Group-level variable errors for both tasks. Error bars correspond to the group-level 95% confidence interval.

Figure 5.

    Trilateration model provides a good fit to localization behavior. A, Fit of the trilateration model to the group-level variable error (black dots). The purple line corresponds to the model fit. The light gray line and squares correspond to variable errors for localization on the arm observed in Miller et al. (2022); note that these data are size adjusted to account for differences in arm and rod size. B, Fit of the trilateration model to the variable errors of six randomly chosen participants. The fit of the trilateration model for each participant’s behavior can be seen in Extended Data Figures 5-1 and 5-2.

Figure 6.

Trilateration provides a better fit to the data than boundary truncation. A, Participant-level goodness of fit (R2) for the trilateration model (left, purple) and the boundary truncation model (right, green). For every participant, trilateration was the better fit. B, Histogram of the ΔBIC values used to adjudicate between the two models, color-coded by the strength of the evidence in favor of trilateration: purple, substantial evidence; pink, moderate evidence; gray, weak/equivocal evidence. Note that in no case did the boundary truncation model provide a better fit to the localization data (i.e., ΔBIC < 0).
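The ΔBIC comparison in panel B can be reproduced for least-squares fits under Gaussian errors, where BIC = n·ln(RSS/n) + k·ln(n). The sample size, parameter counts, and residuals below are illustrative stand-ins, not the study's actual values; by common rules of thumb, ΔBIC above roughly 2 counts as positive evidence for the favored model, with larger values counting as stronger evidence:

```python
import numpy as np

def bic(n, k, rss):
    """BIC for a least-squares fit with Gaussian errors:
    BIC = n*ln(RSS/n) + k*ln(n)."""
    return n * np.log(rss / n) + k * np.log(n)

def delta_bic(n, k_tri, rss_tri, k_trunc, rss_trunc):
    """dBIC = BIC(truncation) - BIC(trilateration); positive favors trilateration."""
    return bic(n, k_trunc, rss_trunc) - bic(n, k_tri, rss_tri)

# Illustrative: equal parameter counts, trilateration halving the residuals
d = delta_bic(n=6, k_tri=2, rss_tri=0.8, k_trunc=2, rss_trunc=1.6)
```

With equal parameter counts, the penalty terms cancel and ΔBIC reduces to n·ln(RSS_trunc/RSS_tri).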

Figure 7.

    Neural network simulations. A, Localization accuracy for the estimates of each decoding subpopulation (upper panel; L1, blue; L2, red) and after integration by the Bayesian decoder (lower panel; LINT, purple). B, Decoding noise for each decoding subpopulation (upper panel) increased as a function of distance from each landmark. Note that distance estimates are made from the 10% and 90% locations for the first (blue) and second (red) decoding subpopulations, respectively. Integration via the Bayesian decoder (lower panel) led to an inverted U-shaped pattern across the surface. Note the differences in the y-axis range for both panels. The results of decoding for the mode and feature space layers of the network can be seen in Extended Data Figure 7-1.

Figure 8.

Simulations of multisegmented rods. We simulated how trilateration operates within rods with different numbers of segments. Here, we show the predicted patterns of variability for (A) a single-segment rod (used in the present study) and (B) two-segment (left) and three-segment (right) rods. The magnitude of variable error is color-coded from red to blue (low to high). The inverted U-shaped pattern of variability was observed in each segment.
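One way to sketch this figure's prediction is to treat each joint of a multisegment rod as an additional spatial boundary and apply the inverted-U variance profile of optimal two-boundary integration within each segment separately. This is an illustrative sketch with assumed noise parameters, not the paper's actual simulation:

```python
import numpy as np

def segment_variance(x, sigma0=0.05, slope=0.2):
    """Inverted-U fused variance within a single unit-length segment:
    two distance estimates with linearly growing noise, optimally fused."""
    v1 = (sigma0 + slope * x) ** 2
    v2 = (sigma0 + slope * (1 - x)) ** 2
    return v1 * v2 / (v1 + v2)

def multisegment_profile(n_segments, points_per_seg=50):
    """Concatenate one inverted-U profile per segment of the rod,
    treating each joint as a boundary shared by adjacent segments."""
    x = np.linspace(0, 1, points_per_seg)
    return np.concatenate([segment_variance(x) for _ in range(n_segments)])

profile = multisegment_profile(3)   # three-segment rod, as in panel B
```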

Tables

Table 1

    Neural network parameter values

    Parameter   fM              fS           fD1        fD2
    μ           −1.5:0.02:1.5   −40:1:140    0:1:140    −40:1:100
    κ or κ0     25              25           25         25
    σ or σ0     0.08            3.40         3.40       3.40
    β           —               —            0.01       0.01
    γ           —               —            0.5        0.5
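The μ row uses MATLAB-style start:step:stop notation for the grids of preferred values. Assuming that reading of the ranges, the equivalent NumPy grids (with the stop value made inclusive, since `np.arange` excludes it) are:

```python
import numpy as np

def grid(start, step, stop):
    """MATLAB-style start:step:stop range, inclusive of stop."""
    return np.arange(start, stop + step / 2, step)

mu_M = grid(-1.5, 0.02, 1.5)   # Mode-layer preferred weights
mu_S = grid(-40, 1, 140)       # Feature-layer preferred locations
mu_D1 = grid(0, 1, 140)        # first Distance subpopulation
mu_D2 = grid(-40, 1, 100)      # second Distance subpopulation
```

The `stop + step / 2` offset sidesteps floating-point endpoint exclusion without ever adding a spurious extra grid point.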

Extended Data

  • Extended Data Figure 5-1

    Trilateration model fits for participants S1–S19

    Fit of the trilateration model to the variable error (black dots) of participants S1–S19 (top-to-bottom; left-to-right). The purple line corresponds to the model fit. The goodness of fit is displayed as the R2. Download Figure 5-1, EPS file.

  • Extended Data Figure 5-2

    Trilateration model fits for participants S20–S38

    Fit of the trilateration model to the variable error (black dots) of participants S20–S38 (top-to-bottom; left-to-right). The purple line corresponds to the model fit. The goodness of fit is displayed as the R2. Download Figure 5-2, EPS file.

  • Extended Data Figure 7-1

    Intermediate output of the Mode and Feature layers

(A) Localization accuracy for the sensory estimates decoded from the Mode (top panel) and Feature (bottom panel) layers. Note that the ‘location’ decoded here is best conceptualized as within the vibratory feature space, as spatial localization is done via trilateration at higher layers of the network. (B) Uncertainty in the sensory estimates decoded from the Mode (top panel) and Feature (bottom panel) layers. Download Figure 7-1, EPS file.



Keywords

  • computation
  • embodiment
  • space
  • tactile localization
  • tool use



Subjects

  • Sensory and Motor Systems
Copyright © 2026 by the Society for Neuroscience.
eNeuro eISSN: 2373-2822
