Abstract
Computer vision approaches have made significant inroads into offline tracking of behavior and estimation of animal pose. In particular, deep-learning approaches have been gaining attention for markerless behavioral tracking because of their versatility. Here we developed a DeepLabCut-based approach for real-time estimation of movement. We trained a deep neural network offline on high-speed video of a mouse whisking, then transferred the trained network to track the same mouse whisking in real time. With this approach, we tracked the tips of three whiskers in an arc and converted their positions into a TTL output within behavioral time scales, i.e., 10.5 milliseconds. The system can trigger output based on the movement of individual whiskers, or on the distance between adjacent whiskers. Flexible closed-loop systems like the one deployed here can complement optogenetic approaches and can be used to directly manipulate the relationship between movement and neural activity.
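The closed-loop pipeline described here (acquire a frame, run inference, apply a threshold, emit a TTL pulse) can be sketched compactly. The block below is a minimal illustration, not the implementation used in this study: it assumes the DeepLabCut-Live Python API (dlclive) for frame-by-frame inference with an exported model, and the camera call grab_frame(), the digital-output call set_ttl(), the whisker labels, and the pixel threshold are all hypothetical placeholders.

```python
# Minimal closed-loop sketch (illustrative only; assumes the DeepLabCut-Live
# API plus hypothetical camera/TTL helpers -- not this paper's implementation).
import numpy as np
from dlclive import DLCLive  # frame-by-frame inference with an exported DLC model

from mycamera import grab_frame  # hypothetical: returns one grayscale frame
from mydaq import set_ttl        # hypothetical: drives a digital output line

WHISKER_TIPS = ["C1_tip", "C2_tip", "C3_tip"]  # three whiskers in one arc
DISTANCE_THRESHOLD = 10.0  # pixels; placeholder value

dlc = DLCLive("/path/to/exported_model")  # network trained offline
dlc.init_inference(grab_frame())          # warm up on the first frame

while True:
    frame = grab_frame()
    pose = dlc.get_pose(frame)  # shape (n_bodyparts, 3): x, y, likelihood

    # Distance between two adjacent whisker tips (rows follow label order).
    c1, c2 = pose[0, :2], pose[1, :2]
    spread = np.linalg.norm(c1 - c2)

    # Raise the TTL line when the adjacent whiskers come close together.
    set_ttl(spread < DISTANCE_THRESHOLD)
```

Triggering on the distance between two tracked tips, rather than on a single tip's position, is one way to realize the "distance between adjacent whiskers" criterion described above; thresholding a single tip's coordinate would realize the per-whisker criterion.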
Significance statement
Here we deploy a fast, deep-neural-network-based feedback method that can reconfigure the feedback given to mice based on the movement of particular whiskers, or on the distance between particular whiskers. Our system generates feedback within 10.5 ms. Methods like the one presented here will steadily become part of the standard toolset for manipulating the interaction between an animal and its environment on behavioral time scales.
Footnotes
The authors report no conflict of interest.
1) European Union's Horizon 2020 research and innovation program and Euratom research and training program 2014-2018 (under grant agreement No. 670118 to MEL).
2) Human Brain Project (EU Grant 720270, HBP SGA1, 'Context-sensitive Multisensory Object Recognition: A Deep Network Model Constrained by Multi-Level, Multi-Species Data' to MEL).
3) Human Brain Project (EU Grant 785907/HBP SGA2, 'Context-sensitive Multisensory Object Recognition: A Deep Network Model Constrained by Multi-Level, Multi-Species Data' to MEL).
4) Human Brain Project (EU Grant 945539/HBP SGA3, 'Context-sensitive Multisensory Object Recognition: A Deep Network Model Constrained by Multi-Level, Multi-Species Data' to MEL).
5) Deutsche Forschungsgemeinschaft Grant No. 327654276 (SFB1315) and Grant Nos. 246731133, 250048060, 267823436, & 387158597 to MEL.
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license, which permits unrestricted use, distribution and reproduction in any medium provided that the original work is properly attributed.