Single-trial neural dynamics are dominated by richly varied movements

Abstract

When experts are immersed in a task, do their brains prioritize task-related activity? Most efforts to understand neural activity during well-learned tasks focus on cognitive computations and task-related movements. We wondered whether task-performing animals explore a broader movement landscape and how this impacts neural activity. We characterized movements using video and other sensors and measured neural activity using widefield and two-photon imaging. Cortex-wide activity was dominated by movements, especially uninstructed movements not required for the task. Some uninstructed movements were aligned to trial events. Accounting for them revealed that neurons with similar trial-averaged activity often reflected utterly different combinations of cognitive and movement variables. Other movements occurred idiosyncratically, accounting for trial-by-trial fluctuations that are often considered ‘noise’. This held true throughout task-learning and for extracellular Neuropixels recordings that included subcortical areas. Our observations argue that animals execute expert decisions while performing richly varied, uninstructed movements that profoundly shape neural activity.


Fig. 1: Widefield calcium imaging during auditory and visual decision-making.
Fig. 2: A linear model to reveal behavioral correlates of cortical activity.
Fig. 3: Linear model predicts cortical activity with highly specific model variables.
Fig. 4: Uninstructed movements dominate cortical activity.
Fig. 5: Uninstructed movements make considerable task-aligned and task-independent contributions.
Fig. 6: Cognitive and movement responses during learning.
Fig. 7: Movements are important for interpreting single-neuron data.
Fig. 8: Uninstructed movements predict single-neuron activity in cortical and subcortical areas.


Data availability

The data from this study will be stored on a dedicated, backed-up repository maintained by Cold Spring Harbor Laboratory. A link to the repository can be found at http://churchlandlab.labsites.cshl.edu/code/.

Code availability

The MATLAB code used for the data analysis in this study is available online at http://churchlandlab.labsites.cshl.edu/code/.
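
For orientation only, the sketch below illustrates the kind of cross-validated linear (ridge) encoding model summarized as cvR2 in Figs. 2–5: neural activity is regressed on task and movement variables and evaluated on held-out data. It is a minimal MATLAB sketch under assumed inputs (a design matrix X of task and movement regressors, a response vector y, a placeholder ridge penalty lambda); it is not the released analysis code.

    function cvR2 = encodingModelCV(X, y, lambda, nFolds)
    % Cross-validated explained variance (cvR^2) of a ridge-regression encoding model.
    % X: time points x regressors (task + movement variables); y: activity of one pixel
    % or neuron (column vector); lambda: ridge penalty (placeholder value).
    X = zscore(X);                                    % put regressors on a common scale
    y = y - mean(y);                                  % center the response
    cvp = cvpartition(numel(y), 'KFold', nFolds);     % random folds; trial-based folds
    yHat = nan(size(y));                              % would better respect autocorrelation
    for k = 1:nFolds
        tr = training(cvp, k);
        te = test(cvp, k);
        % ridge solution: beta = (X'X + lambda*I) \ (X'y)
        beta = (X(tr, :)' * X(tr, :) + lambda * eye(size(X, 2))) \ (X(tr, :)' * y(tr));
        yHat(te) = X(te, :) * beta;                   % predict held-out time points
    end
    cvR2 = 1 - sum((y - yHat).^2) / sum(y.^2);        % fraction of variance explained
    end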


Acknowledgements

We thank O. Odoemene, S. Pisupati and H. Nguyen for technical assistance and scientific discussions; H. Zeng for providing Ai93 mice; J. Tucciarone and F. Marbach for breeding assistance; A. Mills and P. Shrestha for providing GFP mice; T. Harris, S. Caddick and the Allen Institute for Brain Science for assistance with the Neuropixels probes; and N. Steinmetz, M. Pachitariu and K. Harris for widefield analysis code. Financial support was received from the Swiss National Science Foundation (S.M., grant no. P2ZHP3_161770), the Pew Charitable Trusts (A.K.C.), the Simons Collaboration on the Global Brain (A.K.C., M.T.K.), the NIH (grant no. R01EY022979) and the Army Research Office under contract no. W911NF-16-1-0368, as part of the collaboration between the US DOD, the UK MOD and the UK Engineering and Physical Sciences Research Council under the Multidisciplinary University Research Initiative (A.K.C.).

Author information


Contributions

S.M., M.T.K. and A.K.C. designed the experiments. S.M. and S.G. trained animals and recorded widefield data. S.M. performed surgeries. M.T.K. and S.M. acquired two-photon data, designed the linear model and performed data analysis. A.L.J. recorded and spike-sorted Neuropixels data. A.K.C., M.T.K. and S.M. wrote the paper with assistance from S.G. and A.L.J. S.M. and M.T.K. contributed equally.

Corresponding author

Correspondence to Anne K. Churchland.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information Nature Neuroscience thanks Mackenzie Mathis, Mala Murthy and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Integrated supplementary information

Supplementary Figure 1 Overview of cortical areas.

Shown are cortical areas based on the Allen common coordinate framework v.3. 1: Olfactory bulb (combined); 2: Frontal pole, cerebral cortex; 3: Prelimbic area; 4: Anterior cingulate area, dorsal part; 5: Secondary motor area; 6: Primary motor area; 7: Primary somatosensory area, mouth; 8: Primary somatosensory area, upper limb; 9: Primary somatosensory area, nose; 10: Primary somatosensory area, lower limb; 11: Primary somatosensory area, unassigned; 12: Supplemental somatosensory area; 13: Primary somatosensory area, trunk; 14: Primary somatosensory area, barrel field; 15: Ventral auditory area; 16: Anterior visual area; 17: Retrosplenial area, dorsal part; 18: Anteromedial visual area; 19: Rostrolateral visual area; 20: Dorsal auditory area; 21: Primary auditory area; 22: Retrosplenial area, lateral agranular part; 23: Posteromedial visual area; 24: Primary visual area; 25: Anterolateral visual area; 26: Posterior auditory area; 27: Lateral visual area; 28: Laterointermediate area; 29: Temporal association areas; 30: Postrhinal area; 31: Posterolateral visual area.

Supplementary Figure 2 Visual sign maps for all mice.

Shown are visual field sign maps for all trained animals, aligned to the Allen CCF. Mapped areas largely agreed with the corresponding locations of the visual areas in the CCF.

Supplementary Figure 3 Auto- and cross-correlation and impact of sampling rate.

(a) Autocorrelation of the imaging data at time shifts between −1.5 and 1.5 s, sampled at 30 Hz. Explained variance falls off relatively quickly, with a time constant of 150 ms. The high autocorrelation when shifting by a single frame also indicates that imaging at sampling rates above 30 Hz would not provide much additional information, potentially due to the kinetics of the GCaMP6f indicator. (b) Cross-correlation between the imaging data and the model prediction for time shifts between −1.5 and 1.5 s. Explained variance of the behavioral prediction falls off with a time constant of 245 ms, indicating that the model mostly fits slower fluctuations on the order of 100–200 ms rather than fast frame-to-frame changes. (c) cvR2 for a model consisting only of the previous frame (blue) or the behavioral model (red) at different sampling rates. At 30 Hz, the previous frame predicts a large fraction of the variance, but its predictive power is much lower at lower sampling rates. In contrast, cvR2 for the behavioral model increases at lower sampling rates. This demonstrates that the model does not benefit from autocorrelations in the data but does benefit from averaging out fast fluctuations in the imaging data. The box shows the first and third quartiles, the inner line is the median over 22 recordings. Whiskers represent minimum and maximum values.
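
To illustrate the shift analysis in (a, b), the following sketch computes explained variance as a function of the time shift between measured and predicted activity; y and yHat are assumed column vectors (measured and model-predicted fluorescence of one pixel at 30 Hz) and are placeholder names, not variables from the released code.

    % Explained variance as a function of time shift between data and model prediction.
    % y, yHat: measured and predicted activity (column vectors), sampled at fs = 30 Hz.
    fs = 30;
    shifts = round(-1.5 * fs) : round(1.5 * fs);    % -1.5 s to +1.5 s in frames
    r2 = nan(size(shifts));
    for i = 1:numel(shifts)
        yShift = circshift(yHat, shifts(i));        % shift prediction (wrap-around ignored)
        r2(i) = corr(y, yShift)^2;                  % squared correlation = explained variance
    end
    plot(shifts / fs, r2); xlabel('Time shift (s)'); ylabel('R^2');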

Supplementary Figure 4 Unique explained variance for different expert groups.

Shown are cortical maps of unique model contributions for the right vision, right handle and nose variables. The left column shows the average over all recordings from 11 animals. All maps identified specific cortical areas that sensibly corresponded to their respective model variable. Maps were highly robust when averaging over all visual experts (6 mice, 12 recordings, middle column) or auditory experts (5 mice, 10 recordings, right column), respectively.

Supplementary Figure 5 GFP and GCaMP6s controls.

(a) Remaining variance in fluorescence after subtracting hemodynamic signals. Only ~10% of the variance remained in GFP controls (GFP), versus 38% in Ai93 mice (Ai93) and 89% in GCaMP6s-expressing animals (G6s). This demonstrates that the hemodynamic correction rejects most intrinsic activity while leaving calcium-related signals intact. The box shows the first and third quartiles, the inner line is the median. Box whiskers represent minimum and maximum values. (b) Absolute amount of remaining variance for individual pixels. Remaining variance in GFP controls was much lower than in GCaMP-expressing mice across dorsal cortex. (c) Cross-validated R2 of the full linear model. Conventions as in (a). Explained variance was lowest in GFP controls and highest in GCaMP6s animals, demonstrating that the widespread predictive power of the linear model is not explained by predicting hemodynamic signals. However, the model still accounted for ~21% of the variance in GFP animals, indicating that the hemodynamic correction is imperfect and the remaining fluorescence still contains a small but predictable component. (d) Absolute amount of predicted variance for individual pixels. Comparing absolute explained variance (instead of the percentages in (a) and (c)) shows that, although a smaller percentage of fluorescence in GFP animals could be predicted, the absolute amount of predicted variance is extremely small compared with Ai93 mice. Absolute predicted variance was also much higher in GCaMP6s-expressing animals, further demonstrating that the model's success is due to accurate prediction of neural dynamics rather than intrinsic signals. (e) Unique model contribution maps for the right visual stimulus variable. No specific unique contribution was apparent in GFP mice, but it was clearly visible for Ai93 and GCaMP6s mice and well localized to visual areas. (f) β-weights in V1. Visual responses are strongest in GCaMP6s animals and absent in GFP controls. Shading denotes the SEM over sessions. (g, h) Same as (e, f) for the right handle variable. (a–h) n = 4 GFP recordings, 22 Ai93 recordings and 4 GCaMP6s recordings.
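
Purely to illustrate the 'remaining variance' metric reported in (a, b), the sketch below regresses a reference (hemodynamic) trace out of each pixel and reports the fraction of variance left. The single-regressor correction and the variable names (F, refTrace) are generic stand-ins and not necessarily the correction used in the paper.

    % Fraction of variance remaining after regressing a reference signal out of each pixel.
    % F: time x pixels fluorescence; refTrace: time x 1 reference (e.g., hemodynamic) trace.
    F = F - mean(F, 1);
    refTrace = refTrace - mean(refTrace);
    remVar = nan(1, size(F, 2));
    for p = 1:size(F, 2)
        b = refTrace \ F(:, p);                      % least-squares fit of reference to pixel
        resid = F(:, p) - refTrace * b;              % corrected trace
        remVar(p) = var(resid) / var(F(:, p));       % fraction of variance remaining
    end
    fprintf('Median remaining variance: %.1f%%\n', 100 * median(remVar));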

Supplementary Figure 6 Auditory rate discrimination task.

(a) Schematic of the rate discrimination task. Mice are presented with auditory click sounds on both sides and identify the side that contains more clicks to obtain a water reward. Stimulus sequences were 1 s long and clicks were randomly distributed. (b) Discrimination performance of an example animal. Right-choice probability increases with the number of rightward pulses. Animals performed the task with high accuracy and were 90–95% correct with the easiest stimuli. Shown are means ± 95% confidence intervals. n = 8000 trials (2000 trials per animal). (c) Psychophysical reverse correlation revealed the time points at which stimuli most strongly influenced the animals' decisions. Positive weights predict rightward decisions and negative weights predict leftward decisions. Weights were nonzero for the entire stimulus duration, demonstrating that animals integrated sensory evidence over time to perform the task. Shown are means ± SEM for 4 mice. (d) Maps of unique model contributions for different variable groups, similar to Fig. 4e. As in the main results, unique contributions from uninstructed movements were highest across dorsal cortex. Shown are averages over 40 recordings from 4 mice.
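
Psychophysical reverse correlation (panel c) estimates how strongly the evidence in each time bin predicts the eventual choice. One standard estimator is a logistic regression of choice on binned click evidence; the sketch below shows that approach with placeholder inputs (evidence, choseRight, binSize) and is not necessarily the exact estimator used here.

    % Logistic-regression estimate of psychophysical weights over time.
    % evidence: trials x timeBins (e.g., right minus left clicks per bin);
    % choseRight: trials x 1 vector of choices (1 = right); binSize: bin width in seconds.
    evidence = zscore(evidence);                                % put bins on a common scale
    b = glmfit(evidence, double(choseRight), 'binomial', 'link', 'logit');
    weights = b(2:end);                                         % drop the intercept term
    plot((1:numel(weights)) * binSize, weights);
    xlabel('Time from stimulus onset (s)'); ylabel('Weight on choice');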

Supplementary Figure 7 Model performance without analog predictors.

(a) Cross-validated explained variance for a reduced model without any analog regressors. The model's performance was lower than that of the full model in Fig. 3a, but it still predicted a large amount of variance. Averaged across cortex, the event-kernel-only model predicted 30.8 ± 0.2% (mean ± SEM, n = 22 sessions) of all variance. (b) Unique model contribution map for each variable group. (c) Explained variance for variable groups, averaged across cortical maps. Shown is either cvR2 (light green) or ∆R2 (dark green). The box shows the first and third quartiles, the inner line is the median over 22 sessions. Box whiskers represent minimum and maximum values. Even after removing all analog predictors, uninstructed movements made the highest unique contributions across cortex, demonstrating that their importance for predicting cortical activity is not explained simply by the inclusion of analog predictors such as the video variables.
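
As a reminder of how the unique contribution (∆R2) of a variable group is obtained, the sketch below compares the full model against a model with that group's regressors removed, reusing the hypothetical encodingModelCV helper sketched in the Code availability section; shuffling the group's regressors in time is a common alternative. groupIdx is an assumed logical index over design-matrix columns.

    % Unique contribution (deltaR^2) of one variable group: the drop in cvR^2 when
    % that group's columns are removed from the design matrix X.
    cvR2full    = encodingModelCV(X, y, lambda, 10);
    cvR2reduced = encodingModelCV(X(:, ~groupIdx), y, lambda, 10);
    deltaR2 = cvR2full - cvR2reduced;            % variance only this group can explain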

Supplementary Figure 8 Controlling for interictal events.

(a) Scatter plots show the distribution of peaks in cortical activity, averaged over cortex. Left: example animal raised on a DOX diet. Peaks were of variable length and remained at a prominence below 5%. Right: example animal raised on a standard diet. Clearly visible are peaks of short latency and high prominence (red dots). (b) Interictal event probability for all mice. Circles show individual sessions (two per animal). Four out of five mice that were raised on a standard (non-DOX) diet showed potential interictal activity. (c) Example trace for the removal of interictal activity using autoregressive interpolation. (d, e) Modeling results for all DOX-raised animals, similar to Fig. 4c, d. The box shows the first and third quartiles, the inner line is the median. Box whiskers represent minimum and maximum values. n = 7 mice. (f, g) Modeling results for all non-DOX-raised animals. Modeling results for DOX-raised and non-DOX-raised mice were highly similar, demonstrating that our results are not due to potential interictal activity in some of the mice. n = 4 mice.
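
Panel (a) screens for interictal events as brief, high-prominence peaks in the cortex-averaged trace. A sketch of such a screen using MATLAB's findpeaks is shown below; the 5% prominence criterion follows the legend, while the duration cutoff and variable names (meanTrace, fs) are illustrative assumptions rather than the paper's exact criteria.

    % Flag candidate interictal events: brief, high-prominence peaks in the
    % cortex-averaged fluorescence trace meanTrace (in %), sampled at fs Hz.
    [~, locs, widths] = findpeaks(meanTrace, ...
        'MinPeakProminence', 5, ...              % prominence above 5% (legend criterion)
        'WidthReference', 'halfprom');
    isBrief = widths / fs < 0.5;                 % assumed duration cutoff (illustrative)
    eventFrames = locs(isBrief);                 % frames to interpolate over afterwards
    eventRate = numel(eventFrames) / (numel(meanTrace) / fs);   % events per second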

Supplementary Figure 9 Predicted single-neuron variance for individual model variables.

Shown is either all explained variance (cvR2, light green) or unique model contributions (∆R2, dark green) for individual model variables in each cortical area. The box shows the first and third quartiles, the inner line is the median over 5 animals per area. Box whiskers represent minimum and maximum values. Prev.: previous.

Supplementary Figure 10 Depth comparison and motion control for two-photon data.

(a) Explained variance for groups of model variables at different cortical depths. Shown is either all explained variance (cvR2, light green) or unique model contributions (∆R2, dark green). Superficial recordings were made at 150–350 µm and infragranular recordings at 350–450 µm. All infragranular recordings were performed in areas ALM (2655 neurons) or MM (3907 neurons). The box shows the first and third quartiles, the inner line is the median over 5 animals. Box whiskers represent minimum and maximum values. (b) Explained variance of variable groups for individual neurons, sorted by full-model performance (light gray trace). Conventions as in Fig. 6e, but excluding imaging frames that were translated by more than 2 pixels in either the X or Y direction relative to a reference image. As in Fig. 6e, uninstructed movements were most important for predicting single-cell variance, demonstrating that this result is not due to motion of the imaging plane during animal movements. (c) Explained variance for individual model variables, averaged over all neurons after excluding translated imaging frames as described above. Shown is either all explained variance (light green) or unique model contributions (dark green) for 10 animals. Conventions as in (a).
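
The motion control in (b, c) discards frames whose registration offsets exceed 2 pixels before refitting the model. Assuming per-frame X and Y offsets from motion correction are available (xShift, yShift, hypothetical names), the exclusion reduces to a logical mask, shown here reusing the encodingModelCV sketch from above.

    % Exclude imaging frames translated by more than 2 pixels in X or Y relative to
    % the reference image, then refit the model on the remaining frames.
    keepFrames = abs(xShift) <= 2 & abs(yShift) <= 2;       % per-frame logical mask
    cvR2noMotion = encodingModelCV(X(keepFrames, :), y(keepFrames), lambda, 10);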

Supplementary Figure 11 Revealing hidden task dynamics with PETH partitioning.

(a) Histograms of modulation indices for the task (left), uninstructed-movement (middle) and instructed-movement (right) variable groups. Dashed lines indicate the respective indices for the three example cells in (b). In contrast to Fig. 7h, all cells were best explained by a combination of all three model groups. (b) Three example cells with distinct task-related dynamics (left column, green) that were uncovered by accounting for uninstructed movements (middle column, black) and instructed movements (right column, blue). In all cases, the task-related dynamics were clearly distinct from the uncorrected PETHs (gray traces) and revealed different response features, especially during the stimulus and delay periods. (c) Contributions of individual model variables to the PETH reconstruction. Shown is the absolute PETH modulation for each variable as a percentage of the total modulation by all model variables. Example cells are the same as cells 1–3 in Fig. 7. While the task cell has PETH contributions from many variables, the instructed cell is mostly affected by rightward licks. (d) Number of variables required to reach 25% of the total PETH contributions. Most cells require at least 3 variables to reach the threshold (red bars in (c)). This indicates that the PETH of most cells is modulated by a combination of model variables rather than being dominated by a single variable.
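
PETH partitioning splits a neuron's model reconstruction into the parts contributed by the task, instructed-movement and uninstructed-movement regressors and trial-averages each part. The sketch below illustrates the idea under assumed shapes (framesPerTrial frames per trial, trial-major ordering, logical column indices taskIdx, instrIdx and uninstrIdx, fitted weights beta); it is not the authors' exact implementation, and the range-based modulation index at the end is likewise an illustrative choice.

    % Partition the model reconstruction of one neuron into per-group PETHs.
    % X: (trials*framesPerTrial) x regressors design matrix; beta: fitted weights.
    groups = {taskIdx, instrIdx, uninstrIdx};               % logical column indices per group
    peth = cell(1, numel(groups));
    for g = 1:numel(groups)
        recon = X(:, groups{g}) * beta(groups{g});          % this group's prediction
        recon = reshape(recon, framesPerTrial, []);         % frames x trials (trial-major)
        peth{g} = mean(recon, 2);                           % trial-averaged contribution
    end
    % Simple modulation index per group: PETH range relative to the summed ranges.
    modRange = cellfun(@(p) max(p) - min(p), peth);
    modIndex = modRange / sum(modRange);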

Supplementary information

Supplementary Figs. 1–11 and Supplementary Table 1.

Reporting Summary

Supplementary Video 1

Averaged cortical activity for all visual trials (n = 22 recordings). Colors show the change in neural activity relative to the first second of the baseline period. Lines indicate the locations of cortical areas based on the Allen CCF.

Supplementary Video 2

Cortical maps of unique contribution for different model groups. Maps were generated for all frames at each time point in the trial. Bottom text denotes different episodes of the trial.

Rights and permissions

Reprints and permissions

About this article


Cite this article

Musall, S., Kaufman, M.T., Juavinett, A.L. et al. Single-trial neural dynamics are dominated by richly varied movements. Nat Neurosci 22, 1677–1686 (2019). https://doi.org/10.1038/s41593-019-0502-4

