Research Article: New Research, Cognition and Behavior

Frontal, Parietal, and Temporal Brain Areas Are Differentially Activated When Disambiguating Potential Objects of Joint Attention

P.M. Kraemer, M. Görner, H. Ramezanpour, P.W. Dicke and P. Thier
eNeuro 9 September 2020, 7 (5) ENEURO.0437-19.2020; https://doi.org/10.1523/ENEURO.0437-19.2020
P.M. Kraemer
1Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Tübingen 72076, Germany
5Department of Psychology, University of Basel, Basel 4055, Switzerland
M. Görner
1Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Tübingen 72076, Germany
2Graduate School of Neural and Behavioural Sciences, University of Tübingen, Tübingen 72074, Germany
3International Max Planck Research School for Cognitive and Systems Neuroscience, University of Tübingen, Tübingen 72074, Germany
H. Ramezanpour
1Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Tübingen 72076, Germany
2Graduate School of Neural and Behavioural Sciences, University of Tübingen, Tübingen 72074, Germany
3International Max Planck Research School for Cognitive and Systems Neuroscience, University of Tübingen, Tübingen 72074, Germany
P.W. Dicke
1Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Tübingen 72076, Germany
P. Thier
1Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Tübingen 72076, Germany
4Werner Reichardt Centre for Integrative Neuroscience, University of Tübingen, Tübingen 72076, Germany

Article Figures & Data

Figures

  • Figure 1.

    Contextual gaze following task. An avatar appeared at the center of the screen together with four linearly arranged sets of objects (houses and hands). After a baseline fixation period, the avatar’s gaze shifted toward one specific target object, simultaneously with an auditory contextual instruction that either specified the object class of the target (“hand” or “house”) or remained uninformative (“none”). While maintaining fixation, subjects had to decide on the target and, after a go-signal indicated by the disappearance of the fixation dot, make a saccade to the chosen object.

  • Figure 2.

    Behavioral performance. Left, Boxplots (black and gray) showing the percentage of correct responses in the localizer paradigm (dashed line depicts chance-level performance; see Extended Data Fig. 2-1 for a description of the localizer paradigm). Right, Plots of correct responses in the contextual gaze following paradigm (weighted mean performance and weighted SD; dashed lines depict expected performance; blue: unambiguous, yellow: ambiguous-informative, salmon: ambiguous-uninformative).

  • Figure 3.

    Activation maps emphasizing the ROIs. Left column, Contrast gaze following > color matching (localizer paradigm), used to identify the GFP. Blue dots mark the maximum activation at the group level closest to locations taken from the literature [green dots (Marquardt et al., 2017) and cyan dots (Materna et al., 2008)]; white dots mark the maximum activation at those locations that were identifiable at the individual level. Middle column, Contrast gaze following > baseline fixation (localizer paradigm), used to identify saccade-related activity in the hLIP closest to the location taken from the literature (cyan dot; Sereno et al., 2001). Blue and white dots again mark group-level and individual coordinates. Right column, Contrast ambiguous-uninformative > unambiguous (contextual gaze following paradigm). Blue and white dots mark the group-level and individual locations of the maximum IFJ activity. See Extended Data Figure 3-1 for a tabular listing of all activated regions for each contrast and Extended Data Figures 3-2, 3-3, and 3-4 for the respective contrast maps.

  • Figure 4.

    Time courses of activation in the GFP and the hLIP. Time course of mean percent signal change in the contextual gaze following experiment in areas identified in the localizer experiment (error bars are SEM). Periods in which conditions differed significantly are shaded (permutation test, q < 0.05). Vertical dashed lines mark the within-trial events: cue onset, go-signal, and blank-fixation onset. See Figure 6 for the model-based analysis.

  • Figure 5.

    Time courses of activation in the IFJ. Time course of mean percent signal change in the IFJ during the contextual gaze following experiment (error bars are SEM). Periods in which conditions differed significantly are shaded (permutation test, q < 0.05). Vertical dashed lines mark the within-trial events: cue onset, go-signal, and blank-fixation onset. See Figure 6 for the model-based analysis.

  • Figure 6.

    Model-based analyses of time courses during the contextual gaze following experiment. Lines depict the posterior means of BOLD activity; shaded areas comprise 95% Bayesian credible intervals. Vertical dashed lines mark the within-trial events: cue onset, go-signal, and blank-fixation onset.

  • Figure 7.

    Group-level result of the searchlight decoding analysis (t-map of classification accuracies, p < 0.001). Extended Data Figure 7-1 shows the distributions of individual accuracies for the ROIs used in the time course analysis.

Extended Data

  • Extended Data Figure 2-1

    Description of the localizer paradigm. Download Figure 2-1, DOC file.

  • Extended Data Figure 3-1

    List of activated brain areas. Download Figure 3-1, DOC file.

  • Extended Data Figure 3-2

    Contrast map gaze following > color matching (p < 0.001, cluster size > 6 voxels). Download Figure 3-2, EPS file.

  • Extended Data Figure 3-3

    Contrast map gaze following > baseline (p < 0.001, cluster size > 6 voxels). Download Figure 3-3, EPS file.

  • Extended Data Figure 3-4

    Contrast map ambiguous-uninformative > unambiguous (p < 0.001, cluster size > 6 voxels). Download Figure 3-4, EPS file.

  • Extended Data Figure 7-1

    Classification accuracy distributions across participants for the ROIs used in the time course analysis. Asterisks mark distributions significantly different from a classifier performing at chance level (Wilcoxon signed-rank test, p < 0.001). Download Figure 7-1, EPS file.

Keywords

  • fMRI
  • gaze following
  • human lateral intraparietal area
  • inferior frontal junction
  • joint attention
  • superior temporal sulcus

Copyright © 2026 by the Society for Neuroscience.
eNeuro eISSN: 2373-2822
