RT Journal Article
SR Electronic
T1 Eye Movements in Silent Visual Speech Track Unheard Acoustic Signals and Relate to Hearing Experience
JF eneuro
JO eNeuro
FD Society for Neuroscience
SP ENEURO.0055-25.2025
DO 10.1523/ENEURO.0055-25.2025
VO 12
IS 4
A1 Benz, Kaja Rosa
A1 Hauswald, Anne
A1 Suess, Nina
A1 Gehmacher, Quirin
A1 Demarchi, Gianpaolo
A1 Schmidt, Fabian
A1 Herzog, Gudrun
A1 Rösch, Sebastian
A1 Weisz, Nathan
YR 2025
UL http://www.eneuro.org/content/12/4/ENEURO.0055-25.2025.abstract
AB Behavioral and neuroscientific studies have shown that watching a speaker's lip movements aids speech comprehension. Intriguingly, even when videos of speakers are presented silently, various cortical regions track auditory features, such as the envelope. Recently, we demonstrated that eye movements track low-level acoustic information when attentively listening to speech. In this study, we investigated whether ocular speech tracking occurs during visual speech and how it influences cortical tracking of silent speech. Furthermore, we compared data from hearing individuals, congenitally deaf individuals, and those who became deaf or hard of hearing (DHH) later in life to assess how audiovisual listening experience and the onset of auditory deprivation (early vs late) affect neural and ocular speech tracking during silent lip-reading. Using magnetoencephalography (MEG), we examined ocular and neural speech tracking in 75 participants observing silent videos of a speaker played forward and backward. Our main finding is a clear ocular tracking effect for the unheard speech, dominant at <1 Hz, which was not present for lip movements. Similarly, we observed a neural tracking effect of the unheard speech at ≤1.3 Hz in temporal regions of hearing participants. Importantly, neural tracking was not directly linked to ocular tracking. Strikingly, across listening groups, deaf participants with prior auditory experience showed stronger ocular speech tracking than hearing participants, while no ocular speech tracking effect was found for the very small sample of congenitally deaf participants. This study extends previous work by demonstrating the involvement of eye movements in speech processing, even in the absence of acoustic input.