PT - JOURNAL ARTICLE
AU - Benz, Kaja Rosa
AU - Hauswald, Anne
AU - Suess, Nina
AU - Gehmacher, Quirin
AU - Demarchi, Gianpaolo
AU - Schmidt, Fabian
AU - Herzog, Gudrun
AU - Rösch, Sebastian
AU - Weisz, Nathan
TI - Eye Movements in Silent Visual Speech Track Unheard Acoustic Signals and Relate to Hearing Experience
AID - 10.1523/ENEURO.0055-25.2025
DP - 2025 Apr 01
TA - eneuro
PG - ENEURO.0055-25.2025
VI - 12
IP - 4
4099 - http://www.eneuro.org/content/12/4/ENEURO.0055-25.2025.short
4100 - http://www.eneuro.org/content/12/4/ENEURO.0055-25.2025.full
SO - eNeuro 2025 Apr 01; 12
AB - Behavioral and neuroscientific studies have shown that watching a speaker's lip movements aids speech comprehension. Intriguingly, even when videos of speakers are presented silently, various cortical regions track auditory features, such as the envelope. Recently, we demonstrated that eye movements track low-level acoustic information when attentively listening to speech. In this study, we investigated whether ocular speech tracking occurs during visual speech and how it influences cortical silent speech tracking. Furthermore, we compared data from hearing individuals, congenitally deaf individuals, and those who became deaf or hard of hearing (DHH) later in life to assess how audiovisual listening experience and auditory deprivation (early vs late onset) affect neural and ocular speech tracking during silent lip-reading. Using magnetoencephalography (MEG), we examined ocular and neural speech tracking in 75 participants observing silent videos of a speaker played forward and backward. Our main finding is a clear ocular tracking effect for the unheard speech signal, dominant below 1 Hz, which was not present for lip movements. Similarly, we observed neural tracking of unheard speech at ≤1.3 Hz in temporal regions of hearing participants. Importantly, neural tracking was not directly linked to ocular tracking. Strikingly, across listening groups, deaf participants with prior auditory experience showed higher ocular speech tracking than hearing participants, whereas no ocular speech tracking effect emerged for congenitally deaf participants, albeit in a very small sample. This study extends previous work by demonstrating the involvement of eye movements in speech processing, even in the absence of acoustic input.