Research Article: New Research, Cognition and Behavior

Eye Movements in Silent Visual Speech Track Unheard Acoustic Signals and Relate to Hearing Experience

Kaja Rosa Benz, Anne Hauswald, Nina Suess, Quirin Gehmacher, Gianpaolo Demarchi, Fabian Schmidt, Gudrun Herzog, Sebastian Rösch and Nathan Weisz
eNeuro 14 April 2025, 12 (4) ENEURO.0055-25.2025; https://doi.org/10.1523/ENEURO.0055-25.2025
Author affiliations:
1. Centre for Cognitive Neuroscience, Department of Psychology, Paris Lodron University of Salzburg, Salzburg 5020, Austria (Benz, Hauswald, Suess, Gehmacher, Demarchi, Schmidt, Weisz)
2. Department of Experimental Psychology, University College London, London WC1E 6BT, United Kingdom (Gehmacher)
3. Wellcome Centre for Human Neuroimaging, University College London, London WC1N 3AR, United Kingdom (Gehmacher)
4. Deaf Outpatient Clinic, University Hospital Salzburg (SALK), Salzburg 5020, Austria (Herzog)
5. Clinic and Polyclinic for Otorhinolaryngology, University Hospital Regensburg, Regensburg 93053, Germany (Rösch)
6. Neuroscience Institute, Christian Doppler University Hospital, Paracelsus Medical University Salzburg, Salzburg 5020, Austria (Weisz)

Figures & Data

Figures

Figure 1.
A, Stimulus material: participants watched videos of lip movements played either forward or backward. The lip movements (opening) and the corresponding unheard speech envelope were extracted as continuous signals. B, Analysis of the hearing group: coherence of the speech envelope and of the lip movements with the selected ICA eye component. C, Coherence at the strongest voxel in the primary auditory and visual cortices. Significant frequency clusters are marked in gray (N = 49; p < 0.01). See Extended Data Figures 1-2 and 1-3 for the corresponding results in the deaf groups and Extended Data Figure 1-1 for a sanity check of the source reconstruction. Extended Data Figures 1-4 and 1-5 provide supplemental information about the ICA eye component.
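As a rough illustration of the coherence metric behind these panels, the minimal sketch below computes magnitude-squared coherence between a speech envelope and an eye-movement component and averages it below 1 Hz. This is not the authors' MEG pipeline: the signal names, sampling rate, and window length are assumptions, and the inputs are placeholder noise.

```python
# Minimal sketch, not the authors' pipeline: magnitude-squared coherence
# between an unheard speech envelope and an ICA eye component.
# `speech_env`, `eye_ica`, and `fs` are hypothetical placeholders.
import numpy as np
from scipy.signal import coherence

fs = 100.0                                   # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
n = int(300 * fs)                            # 5 min of placeholder data
speech_env = rng.standard_normal(n)          # stand-in for the envelope
eye_ica = 0.5 * speech_env + rng.standard_normal(n)  # correlated stand-in

# Long windows (10 s) give a frequency resolution of 0.1 Hz, enough to
# resolve the <1 Hz band highlighted in the figure.
freqs, cxy = coherence(speech_env, eye_ica, fs=fs, nperseg=int(10 * fs))
band = freqs < 1.0
print(f"mean coherence below 1 Hz: {cxy[band].mean():.3f}")
```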

Figure 2.
Role of eye movements in the cortical effects in the hearing group. A, Significant clusters of the forward versus backward comparison of the speech–brain coherence. B, The significant clusters of the forward versus backward comparison of the speech–brain coherence remain the same after controlling for eye movements. C, <1 Hz coherence in A1 and V1, shown for both coherence and partial coherence. The cluster permutation threshold was p = 0.01. Error bands reflect the standard error (N = 49). Asterisks indicate levels of significance: **p < 0.01. See Extended Data Figure 2-1 for topographies at the sensor and source levels.
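"Controlling for eye movements" in B and C refers to partial coherence. As a point of reference, one standard definition (a general formula, not necessarily the exact estimator used in the paper) partials the eye signal z out of the spectra of the speech envelope x and the brain signal y:

```latex
% Partialized cross-spectrum: remove the linear contribution of z (eye
% movements) from the cross-spectrum of x (speech envelope) and y (brain).
S_{xy \mid z}(f) = S_{xy}(f) - \frac{S_{xz}(f)\, S_{zy}(f)}{S_{zz}(f)}

% Partial coherence is the ordinary coherence of the partialized spectra.
C_{xy \mid z}(f) = \frac{\lvert S_{xy \mid z}(f) \rvert^{2}}{S_{xx \mid z}(f)\, S_{yy \mid z}(f)}
```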

Figure 3.
A, Ocular tracking: <1 Hz forward–backward tracking of the unheard speech envelope for the three groups. The hearing and the DHH group (with audiovisual listening experience) show the ocular unheard-speech-tracking effect, which is not the case for the congenitally deaf group (asterisks above the jitter plot). Furthermore, the DHH group shows increased tracking in the forward compared with the backward condition in general (black lines) and increased lip tracking in the forward condition (green lines) compared with the other groups. B, i, The main effect across all subjects showed larger forward versus backward differences for speech than for lip tracking. ii, A1: <1 Hz tracking in the auditory cortex; the hearing and the DHH groups show the unheard-speech-tracking effect. iii, V1: <1 Hz tracking in the visual cortex; the congenitally deaf group shows higher forward versus backward tracking than the hearing group. The same voxels were selected as in Figures 1 and 2. For the whole-brain ANOVA, the cluster permutation threshold was p = 0.01. Error bands reflect the standard error. Asterisks indicate levels of significance: *p < 0.05, **p < 0.01.
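The p-values in this figure come from cluster-based permutation statistics. The toy sketch below shows only the core permutation idea, a sign-flip test on paired forward-minus-backward differences at a single voxel; the actual analysis additionally clusters neighboring voxels and tests cluster-level statistics. All names and values here are illustrative.

```python
# Toy sketch of the permutation idea behind the reported p-values:
# a sign-flip test on paired forward-minus-backward differences.
# The real analysis forms spatial clusters; this sketch does not.
import numpy as np

def sign_flip_pvalue(diff, n_perm=10_000, seed=0):
    """Two-sided p-value for mean(diff) != 0 under random sign flips."""
    rng = np.random.default_rng(seed)
    observed = abs(diff.mean())
    flips = rng.choice([-1.0, 1.0], size=(n_perm, diff.size))
    null = np.abs((flips * diff).mean(axis=1))
    return (null >= observed).mean()

# Placeholder differences for N = 49 participants (hearing group size).
diff = np.random.default_rng(1).normal(0.02, 0.05, size=49)
print(f"p = {sign_flip_pvalue(diff):.4f}")
```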

Extended Data

  • Table 1-1

    Group information. Download Table 1-1, DOCX file.

  • Figure 1-1

    Sanity check for the source reconstruction: Voxel with the strongest lip coherence. Download Figure 1-1, TIF file.

  • Figure 1-2

    Effects for the acquired DHH group. A, Coherence of the speech envelope and of the lip movements with the selected ICA eye component. B, Coherence at the strongest voxel in the primary auditory and visual cortex. Significant clusters are marked in gray (N = 19). Download Figure 1-2, TIF file.

  • Figure 1-3

    Effects for the congenitally deaf group. A, Coherence of the speech envelope and of the lip movements with the selected ICA eye component. B, Coherence at the strongest voxel in the primary auditory and visual cortex. Significant clusters are marked in gray (N = 7). Download Figure 1-3, TIF file.

  • Figure 1-4

    Additional information about eye movements between conditions. A, Blink frequency does not differ between the forward and backward conditions (t(144) = -0.23, p = 0.82); it was 0.19 Hz in the forward condition and 0.20 Hz in the backward condition. B, The z-scored envelope value during blinks is lower in the forward than in the backward condition (t(144) = -1.89, p < 0.05), and overall the envelope values during blinks lie below the mean (zero) of the z-scored envelope (t(145) = -3.43, p < 0.001). C, The power of the ICA eye component does not differ significantly between conditions, although power in the backward condition is slightly increased at low frequencies. Download Figure 1-4, TIF file.

  • Figure 1-5

    Exemplary 10 s time course from the participant with the highest ocular tracking. In the forward condition, blinks appear to occur mostly when the envelope is low. This is in line with Extended Data Figure 1-4B, which shows lower speech envelope values during blinks in the forward condition than in the backward condition. This participant had a blink frequency of 0.23 Hz in the forward condition and 0.25 Hz in the backward condition. Download Figure 1-5, TIF file.

  • Figure 2-1

    A, MEG sensor topographies of the sensor–speech and sensor–lip coherence in the forward and backward conditions; the difference is the forward condition minus the backward condition. B, Forward solution of the topographies in A; the 20% of voxels with the highest coherence are shown. Download Figure 2-1, TIF file.


Keywords

  • audiovisual integration
  • eye movements
  • lip-reading
  • (ocular) unheard speech tracking

