Abstract
Numerous theories propose a key role for brain oscillations in visual perception. Most of these theories postulate that sensory information is encoded in specific oscillatory components (e.g., power or phase) of specific frequency bands. These theories are often tested with whole-brain recording methods of low spatial resolution (EEG or MEG) or with depth recordings that provide a local, incomplete view of the brain. Opportunities to bridge the gap between local neural populations and whole-brain signals are rare. Here, using representational similarity analysis (RSA) in human participants, we explore which MEG oscillatory components (power and phase, across various frequency bands) correspond to low- or high-level visual object representations, using brain representations from fMRI or layer-wise representations in seven recent deep neural networks (DNNs) as templates for low- and high-level object representations. The results showed that around stimulus onset and offset, most transient oscillatory signals correlated with low-level brain patterns (V1). During stimulus presentation, sustained β (∼20 Hz) and γ (>60 Hz) power best correlated with V1, while oscillatory phase components correlated with IT representations. Surprisingly, this pattern of results did not always correspond to low- or high-level DNN layer activity. In particular, sustained β-band oscillatory power reflected high-level DNN layers, suggestive of a feedback component. These results begin to bridge the gap between whole-brain oscillatory signals and object representations supported by local neuronal activations.
Significance Statement
Brain oscillations are thought to play a key role in visual perception. We asked how oscillatory signals relate to visual object representations in localized brain regions, and how these representations evolve over time in terms of their complexity. We used representational similarity analysis (RSA) between MEG oscillations (considering both phase and amplitude) and either (1) fMRI signals, to assess local activations along the cortical hierarchy, or (2) feedforward deep neural network (DNN) layers, to probe the complexity of visual representations. Our results reveal a complex picture, with the successive involvement of different oscillatory components (phase, amplitude) in different frequency bands and in different brain regions during visual object recognition.
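For readers unfamiliar with the RSA approach described above, a minimal sketch of the core computation follows. All variable names and matrix sizes here are hypothetical illustrations, not the authors' actual analysis pipeline: a representational dissimilarity matrix (RDM) is built for each modality by correlating condition-wise activity patterns, and the two RDMs' condensed (upper-triangle) entries are then compared with a Spearman correlation.

    # Minimal RSA sketch in Python (hypothetical data; not the authors' pipeline).
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.stats import spearmanr

    def rdm(patterns):
        """Condition-by-condition dissimilarity (1 - Pearson r), condensed form."""
        return pdist(patterns, metric="correlation")

    rng = np.random.default_rng(0)
    meg_patterns = rng.normal(size=(92, 306))   # e.g., 92 stimuli x 306 MEG sensors
    fmri_patterns = rng.normal(size=(92, 500))  # e.g., 92 stimuli x 500 V1 voxels

    # Second-order comparison: Spearman correlation between the two RDMs
    rho, p = spearmanr(rdm(meg_patterns), rdm(fmri_patterns))
    print(f"RSA correlation: rho={rho:.3f}, p={p:.3f}")

In an analysis like the one summarized here, the MEG patterns would be replaced by a specific oscillatory component (e.g., power or phase at one frequency and time point), and the fMRI or DNN-layer RDM would serve as the low- or high-level template.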
Footnotes
The authors have no competing interests.
This work was supported by the European Research Council Consolidator Grant P-CYCLES 614244, the OSCI-DEEP Agence Nationale de la Recherche (ANR) Grant ANR-19-NEUC-0004, and the ANITI Grant ANR-19-PI3A-0004 (to R.V.); an NVIDIA GPU grant, the Vis-Ex Grant ANR-12-JSH2-0004, and the AI-REPS Grant ANR-18-CE37-0007-01 (to L.R.); and the Emmy Noether Grant CI241/1-1 (to R.M.C.).
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license, which permits unrestricted use, distribution and reproduction in any medium provided that the original work is properly attributed.