A novel low-noise movement tracking system with real-time analog output for closed-loop experiments
Introduction
Compared to manual administration of experiments, automated experimental designs have obvious advantages in terms of throughput, accuracy, reproducibility, experimenter effort, and robustness to human error. Furthermore, automated systems enable closed-loop experiments such as manipulating neuronal activity according to animal behavior (Wiener et al., 1989). In behavioral experiments, especially those targeting spatial memory and navigation, the most important extrinsic variables are the head location and orientation of the animal (Grieves et al., 2016; Moser et al., 2015; O’Keefe and Dostrovsky, 1971). Multiple approaches have been used to obtain these movement features, including a grid of motion sensors (Opto-Varimex system; Columbus Instruments, USA) and piezoelectric sensors on the floor (Flores et al., 2007). Yet arguably the cheapest, most robust, and most widely-used method is recording with a single overhead camera. This approach can be used with or without markers attached to the subject (Maghsoudi et al., 2017; Mathis et al., 2018); markers increase target salience at the cost of limiting the number of targets and restricting freedom of movement. The camera-based technique has been used for over thirty years (Skaggs et al., 1998; Wiener et al., 1989), and multiple commercial (e.g. ANY-maze; CinePlex, Plexon; EthoVision, Spink et al., 2001; OptiTrack, NaturalPoint Inc., U.S.; NetCom API, NeuraLynx, U.S.) and open-source (e.g. Bonsai, Buccino et al., 2018; Lopes et al., 2015; DeepLabCut, Mathis et al., 2018; MouseMove, Samson et al., 2015; Pyper) systems are available. These tools are useful for behavioral logging and offline analyses but typically do not support any online functionality. Those that do either enable only limited real-time output (e.g., detecting collision of the subject's position with a given region of interest) or are tailored to a specific platform.
In sum, presently-available tracking tools are suboptimal for low-latency closed-loop experiments that rely on detailed kinematics such as orientation, velocity, or combinations thereof.
In the present work, we developed a marker-based system (Fig. 1A) that provides accurate high-resolution (100 samples/s, 4.5 mm/pixel) real-time (28 ± 3 ms delay) feedback of animal position, orientation, and their temporal derivatives (velocity and angular velocity). These features are conveyed as real-time analog (0–5 V with 12-bit resolution) signals that can be easily integrated with other variables such as neuronal recordings and used for closed-loop manipulations such as electrical or optogenetic stimulation. Furthermore, the system outputs digital signals to indicate whether the animal is within user-defined regions of interest. Implementing the system is simple, involving only the download of open-source software and the assembly of low-cost hardware circuitry.
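To make the analog-output specification concrete, the sketch below shows one way a kinematic value could be quantized to a 12-bit code driving a 0–5 V output. The function name, clamping behavior, and range arguments are hypothetical illustrations, not part of the system's actual firmware.

```python
def to_dac_code(value, lo, hi, bits=12, v_ref=5.0):
    """Map a kinematic value in [lo, hi] to a DAC code and output voltage.

    A minimal sketch: values outside the range are clamped, then scaled
    linearly onto the full 12-bit (0..4095) code space.
    """
    value = min(max(value, lo), hi)                    # clamp to valid range
    full_scale = 2 ** bits - 1
    code = int(round((value - lo) / (hi - lo) * full_scale))
    volts = code / full_scale * v_ref                  # resulting analog level
    return code, volts
```

For example, an x position of 640 pixels on a 0–640 range would map to code 4095 and the full 5 V output.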
The system (Fig. 1B) for tracking a subject (e.g. an animal) consists of three blocks: (1) a camera; (2) a PC that runs the software (“Spotter”) and controls the downstream hardware system; and (3) custom hardware (“Movement Controller”, MC), outputting low-noise digital and analog signals. In the implementation described here, a Digital Signal Processor (DSP; RX8, Tucker-Davis Technologies) and a precision Current Source (CS; Stark et al., 2012) are used for closing the loop by applying intra-cortical illumination, resulting in neuronal activation of the tracked rodent. In this use case, the rodent has an implanted head-stage with two color markers (brightly-painted blobs or LEDs). While the system works well in both light and dark conditions, each marker must have a color that is clearly distinct from the background and from other markers. The software is a Python-based application with a modular structure consisting of two main parts: a command-line application and an optional graphical user interface (GUI).
To provide modularity, three distinct levels of tracking are defined: Markers, Objects, and Regions of Interest (ROIs). Markers are the elementary tracking units, representing a color blob or an empty contour such as an LED. Markers are defined by a set of four parameters (hue, saturation, value, and size) that are used by the detection algorithm (Algorithm 1). An Object is composed of linked Markers and has up to six features that can be routed to analog outputs: (1) x position; (2) y position; (3) orientation θ; (4) speed v; (5) movement direction ϕ; and (6) angular velocity ω. The first three features are first-order in the sense that they can be determined from a single frame, whereas the last three are second-order, based on the temporal derivative of multi-frame data (Fig. 1C). Note that θ and ω are defined only for multi-marker Objects.
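The six Object features above can be derived from two linked marker positions per frame roughly as follows. This is an illustrative sketch, not the actual Spotter API: the function name, argument layout, and the choice of the marker midpoint as the Object position are assumptions.

```python
import math

def object_features(front, rear, prev=None, dt=0.01):
    """Derive Object features from two linked Marker positions (pixels).

    front, rear: (x, y) of the two markers in the current frame.
    prev: (front, rear) pair from the previous frame (None on the first frame).
    dt: frame interval in seconds (100 samples/s -> 0.01 s).
    """
    x = (front[0] + rear[0]) / 2.0                               # (1) x position
    y = (front[1] + rear[1]) / 2.0                               # (2) y position
    theta = math.atan2(front[1] - rear[1], front[0] - rear[0])   # (3) orientation
    if prev is None:
        # second-order features need a previous frame
        return x, y, theta, None, None, None
    px, py, ptheta = object_features(prev[0], prev[1])[:3]
    speed = math.hypot(x - px, y - py) / dt                      # (4) speed
    phi = math.atan2(y - py, x - px)                             # (5) movement direction
    dtheta = (theta - ptheta + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
    omega = dtheta / dt                                          # (6) angular velocity
    return x, y, theta, speed, phi, omega
```

Note how orientation θ and angular velocity ω require both markers, matching the point that these features exist only for multi-marker Objects.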
Although the detection algorithm typically results in a highly accurate single-frame color-blob detection rate, continuity is not imposed, and thus multi-frame tracking is typically erratic. To account for these and other sources of noise, we designed an adaptive denoising and location estimation algorithm. The procedure (Algorithm 2) is based on the Kalman filter (Kalman, 1960), a real-time data fusion procedure that combines noisy measurements with estimates, resulting in smoother and more accurate output.
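A minimal, fixed-covariance sketch of the underlying idea is given below: a constant-velocity Kalman filter for a single coordinate. The paper's Algorithm 2 additionally adapts the noise covariances over time, which is omitted here; the parameter values are illustrative assumptions.

```python
import numpy as np

def kalman_1d(zs, q=1e-3, r=0.25, dt=0.01):
    """Smooth noisy position measurements with a constant-velocity Kalman filter.

    zs: sequence of noisy position measurements (one coordinate).
    q, r: fixed process and measurement noise magnitudes (assumed values).
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition: position, velocity
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = np.array([[zs[0]], [0.0]])          # initial state estimate
    P = np.eye(2)                           # initial state covariance
    out = []
    for z in zs:
        # predict step: propagate state and uncertainty forward one frame
        x = F @ x
        P = F @ P @ F.T + Q
        # update step: fuse the new measurement, weighted by the Kalman gain
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return out
```

Because the gain K balances prediction against measurement, spurious single-frame detections are pulled toward the predicted trajectory rather than passed through directly.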
The last level of modularity is a Region of Interest (ROI), which is a part of the image described by one or more geometric shapes (circles, lines, and/or squares). Linking an ROI to an Object continuously checks for collisions between the Object's location and the ROI. The real-time state of this check can be emitted as a binary (digital) output.
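The collision check amounts to a point-in-shape test over the linked shapes, as in the sketch below. The tuple-based shape encoding is a hypothetical simplification (circles and axis-aligned rectangles only), not Spotter's internal representation.

```python
def roi_collision(obj_xy, shapes):
    """Return True (digital high) if the Object location hits any ROI shape.

    shapes: list of ('circle', (cx, cy), radius) or
            ('rect', (x0, y0), (x1, y1)) tuples -- an assumed encoding.
    """
    x, y = obj_xy
    for shape in shapes:
        if shape[0] == 'circle':
            (cx, cy), r = shape[1], shape[2]
            if (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2:   # inside the circle
                return True
        elif shape[0] == 'rect':
            (x0, y0), (x1, y1) = shape[1], shape[2]
            if x0 <= x <= x1 and y0 <= y <= y1:           # inside the rectangle
                return True
    return False
```

Evaluating this test once per frame yields the binary signal that the hardware can expose as a digital output line.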
Section snippets
Results
The system can track in real-time (28 ± 3 ms delay) the simultaneous movement of up to four Objects (e.g. behaving animals) and maintain up to sixteen ROIs (see Application I below). With a camera, the software can work as a stand-alone logger that may record (and later play back) 2D kinematics. Integration with the MC yields real-time analog and digital outputs available for data acquisition (DAQ) and/or processing on the fly. The generic MC design (Fig. 7) allows scaling the number of tracked
The software: spotter
Spotter is a modular library written in Python 2.7. The software grabs a new frame from the camera utilizing OpenCV 2.4, an open-source machine vision library (Bradski, 2000). The frame then passes through a processing module; results are communicated via a serial interface to an external microcontroller. User input can be added through the command line or through the GUI, built with PyQt4 (Qt 4.8, The Qt Company).
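The grab-process-emit pipeline can be outlined as below. The serial message layout, port name, and function names are illustrative assumptions only; Spotter's actual protocol and module boundaries are defined in its source code.

```python
import math
import struct

def pack_features(x, y, theta, bits=12, width=640, height=480):
    """Encode x, y (pixels) and orientation theta (radians) as 12-bit codes
    for the serial link. The three-word little-endian layout is assumed,
    not Spotter's actual wire format."""
    full = 2 ** bits - 1
    cx = int(round(x / width * full))
    cy = int(round(y / height * full))
    ct = int(round((theta % (2 * math.pi)) / (2 * math.pi) * full))
    return struct.pack('<HHH', cx, cy, ct)

def main():
    # Acquisition loop; requires a camera and a serial port, so it is shown
    # for shape only (OpenCV grabs frames, pySerial talks to the microcontroller).
    import cv2
    import serial
    cap = cv2.VideoCapture(0)
    port = serial.Serial('/dev/ttyUSB0', 115200)   # hypothetical port name
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # ... Marker detection and Object feature extraction would go here ...
        port.write(pack_features(320.0, 240.0, 0.0))
```

The key design point is that the PC does only detection and encoding; conversion to low-noise analog levels happens downstream on the Movement Controller.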
Finding marker positions
Algorithm 1 is based on color segmentation. It identifies user-defined Markers on
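The color-segmentation idea can be sketched in NumPy alone: threshold the HSV image against the Marker's per-channel bounds, reject blobs below the size parameter, and take the blob centroid. Spotter itself works through OpenCV (e.g. `cv2.inRange` and contour extraction); the function name and bounds format here are assumptions.

```python
import numpy as np

def find_marker(hsv, lo, hi, min_size=10):
    """Locate a Marker by HSV color segmentation (a NumPy-only sketch).

    hsv: H x W x 3 array of hue/saturation/value pixels.
    lo, hi: per-channel inclusive bounds -- the Marker's HSV parameters.
    min_size: minimum pixel count -- the Marker's size parameter.
    Returns the blob centroid (x, y), or None if the Marker is absent.
    """
    mask = np.all((hsv >= lo) & (hsv <= hi), axis=-1)   # pixels within bounds
    if mask.sum() < min_size:
        return None                                      # marker absent or occluded
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())            # centroid (x, y)
```

Returning None for undersized blobs is what lets the downstream Kalman stage coast through frames where an LED is briefly obscured.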
Discussion
We described a modular and flexible open-source software and hardware system that enables real-time (28 ± 3 ms delay, 100 fps) low-noise tracking of the 2D kinematics of multiple objects. The system was applied to the task of tracking a freely-moving rodent equipped with on-head LEDs. Using two adaptive algorithms, the system maintained stable object tracking even when one or both LEDs were obscured, and when noise and reflections were present in the field of view. This approach eliminates the
Author contributions
N.G. wrote the software, developed and built the hardware, developed Algorithm II, collected and analyzed data, and wrote the manuscript. R.E. developed Algorithm I, wrote the software, and developed the hardware. E.S. conceived and supervised the project, developed Algorithm II, built optoelectronic devices and implanted animals, analyzed data, and wrote the manuscript.
Competing financial interests
The authors have no competing financial interests to disclose.
Acknowledgements
We thank Lisa Roux and Leore Heim for testing the system and for providing feedback on its limitations. This work was funded by a CRCNS grant (#2015577) from the United States-Israel Binational Science Foundation (BSF), Jerusalem, Israel, and the United States National Science Foundation (NSF); and by the Israel Science Foundation (ISF; grant #638/16).
References (21)
- O’Keefe, J., Dostrovsky, J. The hippocampus as a spatial map. Preliminary evidence from unit activity in the freely-moving rat. Brain Res. (1971)
- et al. 3-D motion capture for long-term tracking of spontaneous locomotor behaviors and circadian sleep/wake rhythms in mouse. J. Neurosci. Methods (2018)
- Spink et al. The EthoVision video tracking system—a tool for behavioral phenotyping of transgenic mice. Physiol. Behav. (2001)
- Stark et al. Pyramidal cell-interneuron interactions underlie hippocampal ripple oscillations. Neuron (2014)
- et al. Adaptive adjustment of noise covariance in Kalman filter for dynamic state estimation (2017)
- Bradski, G. The OpenCV Library. Dr. Dobb’s J. Softw. Tools (2000)
- Buccino et al. Open source modules for tracking animal behavior and closed-loop stimulation based on Open Ephys and Bonsai. J. Neural Eng. (2018)
- Flores et al. Pattern recognition of sleep in rodents using piezoelectric signals generated by gross body movements. IEEE Trans. Biomed. Eng. (2007)
- Grieves et al. Place cells on a maze encode routes rather than destinations. eLife (2016)
- Kalman, R.E. A new approach to linear filtering and prediction problems. J. Basic Eng. (1960)