Abstract
One of the principal functions of the brain is to control movement and to rapidly adapt behavior to a changing external environment. Over the last decades, methods for monitoring brain activity and for manipulating it while the animal moves through a controlled environment have grown increasingly sophisticated. Yet our ability to track the movement of the animal in real time has not kept pace. Here we use a Dynamic Vision Sensor (DVS)-based, event-driven neuromorphic camera system to implement real-time, low-latency tracking of a single whisker that mice can move at ∼25 Hz. The customized DVS system described here converts whisker motion into a series of events that can be used to estimate the position of the whisker and to trigger a position-based output interactively within 2 milliseconds. This neuromorphic-chip-based closed-loop system provides feedback rapidly and flexibly. With this system, it becomes possible to use the movement of whiskers, or in principle of any part of the body, to reward or punish in a rapidly reconfigurable way. These methods can be used to manipulate behavior and the neural circuits that help animals adapt to changing values of a sequence of motor actions.
Significance statement
Here we implemented a method for tracking and reacting to movement in real time at low latency, i.e. within 2 milliseconds. We use a neuromorphic camera chip to track the movement of a whisker and generate an output based on whisker position. With training, mice learn to move their whiskers to virtual target locations. Combined with recent sophisticated techniques for monitoring and manipulating brain activity, methods like ours can be used to manipulate behavior or the neural circuits that help animals adapt to changing values of a sequence of motor actions.
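The event-to-position pipeline summarized above can be sketched in code. This is a minimal illustration only, not the authors' implementation: the event format (timestamp, x, y, polarity), the sliding-window position estimate, and the virtual target threshold are all assumptions made for the sketch.

```python
from collections import deque

def track_and_trigger(events, target_x=80.0, window=50):
    """Estimate whisker position as the mean x-coordinate of the most
    recent DVS events and return the timestamps at which the estimate
    first crosses a virtual target position (rising crossings only).

    events: iterable of (timestamp, x, y, polarity) tuples (hypothetical format).
    """
    recent = deque(maxlen=window)  # sliding window of recent event x-coords
    triggers = []
    above = False
    for t, x, y, p in events:
        recent.append(x)
        pos = sum(recent) / len(recent)  # running position estimate
        crossed = pos >= target_x
        if crossed and not above:        # rising crossing -> emit output trigger
            triggers.append(t)
        above = crossed
    return triggers

# Synthetic event stream: the whisker sweeps steadily across the sensor,
# so each event's x-coordinate equals its timestamp.
sweep = [(t, float(t), 0, 1) for t in range(161)]
print(track_and_trigger(sweep))
```

In a real closed-loop system this logic would run per event on the camera's output stream, so the output latency is bounded by the event rate and the cost of one window update rather than by a frame interval.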
Footnotes
The authors report no conflict of interest.
The following funding sources have supported this project: 1) Deutsche Forschungsgemeinschaft (Grant No. 2112280105 to MEL; Grant No. LA 3442/3-1 & Grant No. LA 3442/5-1 to MEL). 2) Funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – Project number 327654276 – SFB 1315. 3) European Union's Horizon 2020 research and innovation program and Euratom research and training program 2014–2018 (under grant agreement No. 670118 to MEL). 4) Human Brain Project (EU Grant 720270, HBP SGA1 & SGA2, ‘Context-sensitive Multisensory Object Recognition: A Deep Network Model Constrained by Multi-Level, Multi-Species Data’ to MEL). 5) Einstein Stiftung.
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license, which permits unrestricted use, distribution and reproduction in any medium provided that the original work is properly attributed.