Abstract
Here, we describe an automated optical method for tracking animal behavior in both head-fixed and freely moving animals, in real time and offline. It takes advantage of an off-the-shelf camera system, the Pixy camera, designed as a fast vision sensor for robotics, which uses a color-based filtering algorithm at 50 Hz to track objects. Using customized software, we demonstrate the versatility of our approach by first tracking the rostro-caudal motion of individual adjacent row (D1, D2) or arc (β, γ) whiskers, or a single whisker and points on the whisker pad, in head-fixed mice performing a tactile task. Next, we acquired high-speed video and Pixy data simultaneously and applied the Pixy-based real-time tracking to the high-speed video data. With this approach, we extend the temporal resolution of the Pixy camera and track motion (post hoc) at the limit of high-speed video frame rates. Finally, we show that this system is flexible: it can track individual whisker or limb position without any sophisticated object-tracking algorithm; it can be used in many lighting conditions, including infrared (IR); and it can track the head rotation and location of multiple animals simultaneously. Our system makes behavioral monitoring possible in virtually any biological setting.
- behavioral analysis
- closed-loop behavior
- head-fixed behavior
- real-time behavior
- sensorimotor integration
- whisker kinematics
Significance Statement
We developed a method for tracking the motion of whiskers, limbs, and whole animals in real time. We show how to use a plug-and-play Pixy camera to monitor the motion of multiple colored objects in real time and post hoc. Our method has major advantages over currently available methods: we can track the motion of multiple adjacent whiskers in real time at 50 Hz and apply the same methods post hoc at high temporal resolution. Our method is flexible; it can track objects with similar shapes, such as two adjacent whiskers, the forepaws, or even two freely moving animals. With this method, it becomes possible to use the phase of movement of particular whiskers or a limb to perform closed-loop experiments.
Introduction
A traditional approach to the study of neural function is to relate the activity of a circuit to a distinct behavior. While methods for measuring and manipulating neural activity have become increasingly sophisticated, the ability to monitor and manipulate behavior in real time has not kept pace. Methods do exist for precisely tracking aspects of animal behavior, such as eye movement or head direction, in real time and in different contexts (Hölscher et al., 2005; Wallace et al., 2013). Yet even in some of the most sophisticated closed-loop behavioral electrophysiology and imaging systems, i.e., visual virtual reality, where the motion of a treadmill or air-ball is used to remap the visual world, there is no direct report of the animal's movement; the motion of the animal is tracked indirectly by monitoring the movement of the treadmill or the air-ball (Legg and Lambert, 1990; Dombeck et al., 2007; Harvey et al., 2009; Cushman et al., 2013).
To overcome these kinds of limitations in behavioral monitoring, we used the whisker system, a model sensorimotor system in which many of the key advances in monitoring neural activity in vivo have been applied, i.e., calcium imaging of neurons and dendrites in vivo, imaging the activity of axons, whole-cell patching in behaving animals, etc. (Svoboda et al., 1997; Svoboda et al., 1999; Lee et al., 2006; Gentet et al., 2010; Petreanu et al., 2012). While the whisker-to-barrel cortex system is a model for investigations of sensorimotor processes, it has one key limitation: whiskers are tiny and can be difficult to track in real time. In the last decade, a variety of approaches have been used for monitoring whisker movement during behavior (Hires et al., 2013; Hentschke et al., 2006; Zuo et al., 2011; Sofroniew and Svoboda, 2015). High-speed videography is one common approach (Carvell and Simons, 1990; Sachdev et al., 2001; Hartmann et al., 2003; Knutsen et al., 2005; Ritt et al., 2008; Voigts et al., 2008; Grant et al., 2009; Clack et al., 2012; Arkley et al., 2014; Voigts et al., 2015). Another approach is to use electromyography (EMG) (Carvell et al., 1991; Fee et al., 1997; Berg and Kleinfeld, 2003; Sachdev et al., 2003; Zagha et al., 2013). Alternatively, single laser/infrared beam-break detectors, or arrays of them, have been used for tracking the position of a whisker or the movement of the animal (Bermejo et al., 1996; O'Connor et al., 2013). Each of these approaches has advantages and disadvantages. EMG provides real-time feedback, but it does not have the spatial resolution for monitoring the motion of any individual whisker (Carvell et al., 1991; Fee et al., 1997; Berg and Kleinfeld, 2003; Sachdev et al., 2003; Zagha et al., 2013). High-speed imaging has unmatched spatiotemporal resolution and can be used for monitoring one or multiple whiskers at a time, but it is typically not used in real time or in feedback mode (Knutsen et al., 2005; O'Connor et al., 2010; Diamond et al., 2008; Voigts et al., 2008; Gyory et al., 2010; Perkon et al., 2011). In addition, automated tracking algorithms used with high-speed videography are often inflexible, as most tracking algorithms are customized to track a distinct object in a very specific setting. Most automated algorithms for tracking objects with high-speed cameras cannot track whiskers or limbs in systems where the floor and the walls around and under the animal move (Nashaat et al., 2016).
In this study, we present a method that turns an off-the-shelf camera (helped along by customized software) into a versatile real-time optical tracking system for monitoring whiskers, limbs, or whole animals. We can quantify the location, trajectory, and speed of almost any part of the body or of the whole animal. The same camera and algorithm can be used for offline tracking of movement, with almost no limit to the temporal resolution. This system makes it possible to analyze large quantities of video data and to generate continuous waveforms of movement.
Materials and Methods
Animals
All animal procedures were performed in accordance with the animal care committee's regulations. Mice were maintained in a reverse day/night cycle environment throughout the course of the experiments. Eight adult female mice were surgically prepared for head restraint by attaching a head-post to the skull under ketamine/xylazine anesthesia (90/10 mg/kg). In the 2 d after surgery, Buprenex analgesia (0.1 mg/kg) was administered and animal health was monitored. RelyX cement (Applicaps, 3M; Andermann et al., 2013) was used to affix the head-post to the skull. In two animals, a lightweight detachable Styrofoam color ID was affixed to the head-post to enable tracking of the freely moving animal.
One to two weeks after surgery, animals were habituated to head fixation on a stationary platform or on a treadmill, or were allowed to explore a clear linear track (42 cm long × 9 cm wide) made of Styrofoam. On subsequent days, animals were head-restrained for short periods of time while individual whiskers were painted by dabbing on UV-sensitive body paint (UV Glow) mixed with super glue. Mice were habituated to the coloring of whiskers and to the placement of a piezo-film sensor at a fixed distance from the whiskers (Bermejo and Zeigler, 2000; Sachdev et al., 2001). Whisker contact with the sensor was rewarded with a drop of sweetened condensed milk. Mice were trained to move their whiskers in response to a sound cue (Fig. 1); whisker contact of sufficient force against the piezo-film sensor elicited a reward (Fig. 1B). In the second task, animals were habituated to head fixation while on a treadmill, and the forepaws were painted with two different UV dyes, one for each paw. For freely moving animals, a piece of multicolored Styrofoam (a different color combination for each animal) was glued to the head-post and used for tracking mice in regular light conditions. In all paradigms, animals were water restricted; weights were monitored daily and maintained at >85% of initial body weight.
Experimental setting
A Pixy camera (Charmed Labs, Carnegie Mellon University) was equipped with a 10-30 mm f1.6 IR lens and connected to the USB port of a computer. Pixy uses an HSV (hue, saturation, and value) color-based filtering algorithm to track colored objects. The open-source camera software, PixyMon, was used to mark the colored whiskers and limbs, defining a distinct signature for each color. Color signatures were tuned to achieve consistent tracking without generating false positives (detecting the wrong objects) or false negatives (detecting the object intermittently or sparsely).
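The principle of HSV color filtering can be illustrated with a short sketch (our own illustration using OpenCV on an ordinary camera frame, not the Pixy firmware itself; the hue/saturation/value bounds are placeholders):

```python
# Minimal illustration of HSV color filtering (not the Pixy firmware).
# Assumes OpenCV and NumPy; the HSV bounds are placeholder values.
import cv2
import numpy as np

def track_colored_object(frame_bgr, lower_hsv=(100, 120, 80), upper_hsv=(130, 255, 255)):
    """Return the (x, y) centroid of pixels inside an HSV range, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:                       # no pixels matched the signature
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)                   # any camera; Pixy does this step on-board
ok, frame = cap.read()
if ok:
    print(track_colored_object(frame))
cap.release()
```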
Tracking software and importing data
PixyMon is the commercial computer software used to communicate with the Pixy camera. It is written with Qt, an event-based, cross-platform C++ framework widely used for GUI applications. PixyMon enables signature tuning, i.e., tuning the tracking of a colored object, via its configuration dialog. The tolerance of each signature can be optimized by adjusting a set of graphical sliders. The camera can learn up to seven distinct colors, from "signature 1" to "signature 7." The user can either assign a signature as a "standard" signature, where objects are detected based on a single color, or assign a "color-code" signature, in which detected objects consist of two or more adjacent colors in a distinct sequence. Color-code signatures reduce false positives, as they limit the possibility that colors are confused with other, similar objects in the camera view. In color-code mode, the PixyMon software reports the angle based on the position and rotation of the two or more adjacent colors. Here, we used the standard mode for tracking whiskers, the whisker pad, and limbs (Figs. 1-5) and the color-code mode for tracking the head rotation and location of the freely moving animal (Fig. 6).
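In color-code mode, the reported angle follows from the relative positions of the adjacent color patches; a minimal sketch of the underlying geometry (our own illustration, computed here from two hypothetical patch centroids; PixyMon performs the equivalent computation internally):

```python
import math

def color_code_angle(centroid_a, centroid_b):
    """Angle (degrees) of the line joining two color-patch centroids,
    e.g., the two halves of a head-mounted color ID."""
    dx = centroid_b[0] - centroid_a[0]
    dy = centroid_b[1] - centroid_a[1]
    return math.degrees(math.atan2(dy, dx))

print(color_code_angle((120, 80), (160, 120)))   # 45.0 degrees
```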
Signature-mapper
We modified PixyMon to send coordinates over the network using the user datagram protocol (UDP) to new software that we developed, called the signature-mapper. This software can receive coordinates from multiple simultaneously running instances of PixyMon. It can also be used to uniformly compress, in time, tracking data obtained from high-speed video that is played back in slow motion after acquisition.
The signature-mapper is linked via a serial port to Spike2 (or it can be linked to MATLAB or another Python application via UDP or the transmission control protocol, TCP), or to a file to be stored on disk. In its current implementation, the signature-mapper allows seven different output channels (C1 to C7). The source code and the binaries for the modified PixyMon and the signature-mapper are available at: http://www.neuro-airtrack.com/pixy_paper/pixy.html, https://github.com/larkum-lab, RRID: SCR_014813.
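A minimal sketch of the receiving side is given below (our own illustration; the port number, the plain-text "signature,x,y,angle" payload, and the channel mapping are assumptions and may differ from the actual signature-mapper protocol):

```python
# Hypothetical UDP receiver in the spirit of the signature-mapper.
# Requires pyserial for the serial link read by Spike2.
import socket
import serial

udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.bind(("0.0.0.0", 5005))                  # one PixyMon instance per port (assumed)
spike2 = serial.Serial("COM3", 115200)       # serial port monitored by Spike2 (assumed)

while True:
    payload, _addr = udp.recvfrom(1024)
    sig, x, y, angle = payload.decode().strip().split(",")
    channel = f"C{int(sig)}"                 # map signatures 1-7 onto channels C1-C7
    spike2.write(f"{channel},{x},{y},{angle}\n".encode())
```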
System validation
The Pixy camera has a 50-Hz temporal resolution in real time. To measure the actual temporal resolution and the delay from the Pixy camera to an Arduino or to a Spike2/CED Power 1401 interface, we triggered a green LED with a TTL pulse and turned it off at the first report of a signal from the camera. We recorded the timestamps of the LED trigger and of the first serial message reporting that the LED had turned on, either directly from the Pixy camera through an Arduino or indirectly through the Pixy USB port connected to PixyMon, which sends the data to Spike2 via the signature-mapper software. We found that the time lag between triggering the LED and the report was ∼30 ms. In another test of the system, we attached a colored object to a rotary motor whose movement frequency could be varied between 5 and 20 Hz. This experiment showed that Pixy can be used to reconstruct a complete waveform of motion at up to ∼8 Hz.
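The lag estimate reduces to a difference of timestamps; a minimal sketch of the computation (illustrative only, with hypothetical timestamp logs):

```python
import numpy as np

# Hypothetical logs (s): times at which the LED TTL was issued and times at
# which the first Pixy report of the LED arrived, one entry per trial.
led_trigger_t = np.array([0.000, 1.000, 2.000, 3.000])
first_report_t = np.array([0.031, 1.029, 2.032, 3.028])

lags_ms = (first_report_t - led_trigger_t) * 1e3
print(f"mean lag: {lags_ms.mean():.1f} ms (n = {lags_ms.size})")   # ~30 ms
```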
During whisker tracking in real time, there was a potential for false positives or false negatives (missed frames). False-positive frames usually arise when a colored object, e.g., a single painted whisker that can be reported as more than one signature (because of the angle or position of the colored whisker relative to the Pixy camera), is seen in two locations in the same frame. We excluded any frame that had more than one value for the same signature. Normally, this error is evident during real-time data collection and can be corrected by changing the lighting or by recoloring the whiskers/limbs or the head of the animal. To correct for missed frames (false negatives), we used offline tracking and data synchronization (Fig. 3; see below).
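A minimal sketch of this exclusion rule (our own illustration; the per-frame record format is an assumption):

```python
from collections import Counter

def drop_ambiguous_frames(frames):
    """Keep only frames in which each signature is reported at most once.
    `frames` is a list of per-frame lists of (signature, x, y) detections."""
    clean = []
    for detections in frames:
        counts = Counter(sig for sig, _x, _y in detections)
        if all(n == 1 for n in counts.values()):
            clean.append(detections)
    return clean

frames = [[(1, 50, 80), (2, 70, 82)],        # valid frame, kept
          [(1, 50, 80), (1, 120, 40)]]       # signature 1 reported twice, dropped
print(len(drop_ambiguous_frames(frames)))    # 1
```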
Resampling high-speed videography and synchronization
Synchronizing the data stream obtained by PixyMon from a high-speed camera played back in slow motion depends on the temporal resolution of the high-speed camera and on the replay speed of the movies. The signature-mapper software uses the recorded and replayed frame rates to process the offline tracking data and to synchronize it with the real-time video. The experimenter inputs the rate by which the recorded video was slowed down, and the software applies a simple rescaling to compress the offline data stream onto the real-time axis of the video.
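The compression itself amounts to a single rescaling of timestamps; a minimal sketch, assuming video recorded at `recorded_fps` is replayed at `playback_fps`:

```python
def compress_offline_time(t_playback_s, recorded_fps, playback_fps):
    """Map a timestamp measured during slow-motion playback back onto the
    real-time axis of the original recording."""
    slowdown = recorded_fps / playback_fps    # e.g., 150 Hz replayed at 6 Hz -> 25x
    return t_playback_s / slowdown

# A whisker event seen 25 s into the slow-motion replay occurred 1 s into
# the original recording.
print(compress_offline_time(25.0, recorded_fps=150, playback_fps=6))   # 1.0
```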
Data acquisition
Painted whiskers, limbs, or the color ID on the animal's head were tracked continuously without saturation or breakdown. Pixy adapts to a variety of light conditions, including dark ultraviolet (UV), infrared, incandescent (reddish hue), or fluorescent (bluish hue) light. The white balance for each lighting condition is automatically adjusted as the Pixy powers on. When light conditions change, the white balance can be reset by unplugging the Pixy camera or by pressing the reset button for 2 s. Under dark (UV) light, we used no more than three colors. In IR light, a whisker was painted with fluorescent dye and tracked using illumination from an infrared light source (Thorlabs). On the treadmill, the same methodology was applied for tracking the forepaws (one color for each paw). For freely moving animals, we tracked the head direction using multicolor signatures, called a color code, with which object position and angle can be tracked automatically. For offline tracking, a Basler high-speed color camera (model acA1920-155) was used to capture images at 155 Hz. The high-speed camera recordings were played back in slow motion on a screen, while the Pixy camera was set up to track the colored objects off the screen. From day to day, the coordinates (units) can vary because of the positioning, zoom, and angle of the camera. In the case of the β and γ whiskers, which are arc whiskers, there is considerable overlap in the position of the whiskers relative to the camera (Fig. 2).
Here, we use Spike2 (Cambridge Electronic Design) for data acquisition. A Spike2 script is used to transform the x, y, and angle text coordinates into waveforms. The Spike2 script is available online at: http://www.neuro-airtrack.com/pixy_paper/pixy.html, https://github.com/larkum-lab, RRID: SCR_014813.
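For users without Spike2, the same transformation can be approximated in Python; a minimal sketch (the "time,x,y,angle" log format is an assumption) that resamples logged coordinates onto a uniform 50-Hz time base:

```python
import numpy as np

def text_to_waveform(log_lines, rate_hz=50.0):
    """Parse "time,x,y,angle" text lines and resample each coordinate onto a
    uniform time base (a stand-in for the Spike2 script, not a copy of it)."""
    data = np.array([[float(v) for v in line.split(",")] for line in log_lines])
    t, coords = data[:, 0], data[:, 1:]
    t_uniform = np.arange(t[0], t[-1], 1.0 / rate_hz)
    resampled = np.column_stack(
        [np.interp(t_uniform, t, coords[:, i]) for i in range(coords.shape[1])])
    return t_uniform, resampled

log = ["0.00,100,40,10", "0.02,104,42,12", "0.06,110,45,15"]
t, wave = text_to_waveform(log)
print(t, wave[:, 0])    # uniform timestamps and the resampled x coordinate
```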
Data analysis
The real-time data from Pixy were mapped to Spike2 channels. When combined with the timing of behavioral events, it is possible to take single-trial (touch-triggered or go-cue-triggered) data for two adjacent whiskers and to make average waveforms of all movement data for each whisker over multiple trials. To show that both the x and y coordinates could be monitored by Pixy, we sampled the x and y coordinates of limb position and mapped them to Spike2 channels. In freely moving animals, the head rotation angle and the x/y coordinates of animal position were acquired into Spike2 channels and converted into a linear trace of the animal's movement or into heat maps of its location. For the heat maps, we constructed a two-dimensional histogram of pixels in each video frame and applied 100 rounds of spatial filtering, in which each pixel's value was recomputed as the mean of the pixel and its adjacent pixels (n = 8). Finally, high-speed video acquired at 150 Hz was played back at 6 Hz, and Pixy was used to capture the movement of whiskers into a Spike2 channel.
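A minimal sketch of the heat-map computation as described (two-dimensional occupancy histogram followed by 100 rounds of 3 × 3 mean filtering), using NumPy and SciPy; the bin counts and the simulated track are illustrative only:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def position_heatmap(x, y, bins=(42, 9), smoothing_rounds=100):
    """2D occupancy histogram of tracked positions, smoothed by repeatedly
    replacing each bin with the mean of itself and its 8 neighbors."""
    heat, _, _ = np.histogram2d(x, y, bins=bins)
    for _ in range(smoothing_rounds):
        heat = uniform_filter(heat, size=3, mode="nearest")  # 3x3 mean filter
    return heat

# Hypothetical 50-Hz track of a mouse exploring the 42 x 9 cm box for 3 min.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(0, 0.5, 9000)) % 42
y = rng.uniform(0, 9, 9000)
print(position_heatmap(x, y).shape)    # (42, 9)
```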
Results
We used the Pixy-based system on head-fixed mice (n = 6). Five mice had their whiskers painted with UV-fluorescent paint and one mouse had both forelimbs painted (see Materials and Methods). A high-speed Basler camera and a Pixy camera were positioned to track two whiskers (Fig. 1A). In this paradigm, mice were conditioned to whisk to contact a piezo-film sensor after a sound go-cue turned on (Fig. 1B). To ensure that the painted whiskers were used in the contact task, the large whiskers rostral to the painted ones were trimmed off. We first determined whether the real-time whisker motion captured in video frames matched the position data recorded in real time (Fig. 1C). Video synchronized to the real-time data provided by Pixy indicated that both the absolute (real) and relative (x and y coordinates in the Pixy frame) whisker positions were tracked accurately (Fig. 1C, middle). In frame 1, the two painted whiskers are close to each other; in frame 2, the tracked whiskers are farther apart. The total movement (in 20 ms) of the two whiskers is reflected in the length of the lines (Fig. 1C, middle), and the location of the red and green traces (lines) reflects the position of the whiskers in the two frames.
Next, we used these methods to track two adjacent whiskers (Fig. 2A, Video 1). The D2 and D1 or the β and γ whiskers were tracked in the course of five cue-triggered contacts. The mouse used the D2 or the β whisker to touch the piezo-film sensor. These five contact trials show that at rest and during contact with the piezo-film sensor, the position of the D2 whisker rarely overlapped (<1 mm) with that of the D1 whisker (at least at the point where the two whiskers were painted). While the positions of the two whiskers were distinct and nonoverlapping, the motion of the whiskers was in phase. In contrast, when the arc whiskers (β and γ) were tracked (Fig. 2A, right), the whiskers showed considerable overlap in rostro-caudal position. These data indicate that the spatial location of the whiskers can be accurately tracked. Next, we generated whisker touch-triggered averages of movement of the two painted whiskers in each animal (Fig. 2B). These experiments show that the whisker that touched the sensor (D2 or β) moved to a greater extent, i.e., on average there is a larger deviation from rest for the whisker used to elicit the touch-triggered reward.
To examine whether we could use these methods to track the motion of a single whisker over days of training, we painted the B2 whisker each day and tracked the performance of a single mouse. On d 1 (Fig. 2C, left), the average sound cue-triggered movement of the B2 whisker was minimal, but by d 9 of training, the B2 whisker moved immediately after the go-cue turned on (Fig. 2C, right). The whisker movement data for these days could also be aligned to the timing of contact; this also shows a change from d 1 to d 9 in the average rate of movement as the B2 whisker makes contact with the piezo-film sensor (Fig. 2D).
The real-time temporal resolution of 50 Hz is borderline for using the Pixy camera to track fast movements of the body, including whisking, which in mice can reach 25 Hz. We therefore developed and validated another, automated offline slow-motion approach using an additional high-speed video camera of the kind often used to faithfully track whisker motion. The recorded high-speed video of behavior was played back on a computer monitor in slow motion, and a Pixy camera was positioned in front of the monitor to track the colored whiskers (Fig. 3A, Video 2). For a fraction of cue-triggered trials, we compared the Pixy-tracked slow-motion data to the simultaneously acquired real-time data (Fig. 3B). The real-time and offline slow-motion waveforms are qualitatively similar: the positions of the two whiskers (top traces from one whisker, bottom traces from the other; Fig. 3B) do not overlap at rest or during contact, and the envelope and duration of movement of the adjacent whiskers look similar in both conditions. In another experiment, we tracked two points on the whisker pad, one just under the D1 whisker and a second under an A-row whisker, and a single whisker, the D1 whisker, in both real time and post hoc at 200 Hz (Fig. 3C). The five real-time and five slow-motion epochs of the same trials shown here have a few elements that should be noted: (1) the protraction to contact begins at a different position on each of the five trials, and this is evident in both the real-time and the post hoc slow-motion analysis; (2) pad motion does not quite capture the difference in set point from trial to trial; (3) whisker motion is evident when the animal is not whisking in both the real-time and slow-motion data (arrowheads point to deflections in the traces) but is clearer in the slow-motion data (Fig. 3C, right); and (4) the slow-motion data contain more high-frequency components, but the envelope of motion is captured in both the real-time and the slow-motion data (Fig. 3B,C, bottom). Taken together, this implies that for some purposes the Pixy camera approach is appropriate, but the higher temporal resolution of the offline tracking shows that the high-frequency components of movement are not captured in real time by the Pixy camera.
To examine whether this method can be extended to infrared light conditions (invisible to rodents), we painted a whisker with the same UV body paint, but instead of using UV dark light or regular illumination, we illuminated the whisker with infrared light. For proper IR illumination of just the whisker, the angle of the infrared light was key: the IR light was positioned under the Pixy camera and directed at the mouse whisker pad from the side. A single, painted whisker was tracked using a Pixy camera (Fig. 4, Video 3). Turning the infrared light off removed all position information from the output: the text marks and the y-position information were no longer generated and were no longer evident as a waveform. When the IR light was turned back on, the real-time whisker motion was reacquired and tracked without any additional adjustment.
To demonstrate the flexibility of the Pixy camera system, we used it to track both forepaws of mice on a treadmill. The paws were painted with different colors, and the Pixy camera was positioned at the height of the forepaws (Fig. 5, Video 4). In this configuration, we tracked the position of the treadmill, the velocity of the treadmill, and the up-and-down motion of each forepaw as the animal moved on the treadmill. Here, it is easy to see the alternating up-and-down motion of each limb as the animal moves forward on the treadmill.
Finally, we used Pixy to track the head rotation and x/y position of freely moving animals in a 42 × 9 cm box in real time (Fig. 6A,B, Video 5). The moment-by-moment changes in head angle and animal location (x and y coordinates) can be transformed into waveforms (Fig. 6A), where F1 (related to the vertical position of the animal in frame 1, shown on the right) is at the bottom and has a value close to zero. In frame 1, the animal's head angle is horizontal; in frame 2, the angle rotates by ∼70°; in frames 3 and 4, the angle is rotated by 180° (compared with frame 1; Fig. 6A). The side-to-side position of the animal changes, with the animal sometimes hugging the right side (frames 1, 3), sometimes the left side (frame 2), and sometimes remaining roughly in the middle of the box. The position of the animal can be traced at 50 Hz (Fig. 6B), and a heat map of the animal's location in the box over 3 min of tracking can be constructed. In addition to tracking the location of individual animals, Pixy can be used to track multiple color IDs affixed to animals' heads (Fig. 6C), thus simply and flexibly tracking one or multiple distinct freely moving animals.
Discussion
This study demonstrates the utility of a color tracking camera that can be used for rapid real-time tracking of two adjacent whiskers, limbs, or even multiple animals. The method is flexible; it can work in various lighting conditions, and it can be used for real-time data acquisition and for automated tracking.
While earlier work in the whisker system has successfully used high-speed imaging and EMG to detect motion of the whisker pad or of individual whiskers, these methods have the advantages and limitations mentioned in the Introduction. Aside from being easy to use and inexpensive, the Pixy method has key advantages over other methods (highlighted in Table 1), foremost among them that Pixy is versatile and can be used for tracking almost any colored object, one or multiple distinct whiskers, points on the whisker pad, limbs, or even whole animals, in real time. It is flexible enough to be rapidly reconfigured for monitoring any part of the body, multiple body parts, and even the whole animal. Furthermore, Pixy is an open-source tool in which almost every aspect of the process (the data stream, the PixyMon software, the objectives used, even the lighting and coloring) is accessible and modifiable.
Most other methods are not nearly as flexible: videography is not commonly used in real time; EMG cannot be used for single-whisker tracking; and optoelectronic, IR beam-break methods can be used only in designated locations (Table 1). Most earlier methods are not versatile enough and have not, to date, been used for tracking individual whiskers, or whiskers combined with the whisker pad, in real time. The Pixy approach has many advantages over other methods, but it also has some drawbacks. First, color is necessary and must be visible on the animal. Coloring, i.e., painting, adds some weight to a whisker and requires that the animal be habituated to the repeated application of body paint on its limbs or whiskers. In addition, using a color-filtering algorithm limits the use of the system in infrared light, where Pixy can be used to track only one object. This limitation can be overcome by adding more than one Pixy camera, e.g., one to track each limb or a single whisker on each side of the face. Another limitation of the Pixy system is that it does not automatically provide a frame-by-frame update; rather, it generates a serial time stamp of the tracked object. This limitation can be overcome by using TTL-triggered image-capturing methods. A final limitation is the temporal resolution of 50 Hz; the actual resolution can be lower, depending on the configuration of the acquisition system. This temporal limit can be overcome post hoc. For studies where it is necessary to monitor higher-frequency movement (>∼50 Hz), the Pixy camera can still be used to automatically track motion in slow-motion videos. A major element of this experimental design is that fast movements missed in real time can be recaptured for analysis. Furthermore, key events (e.g., object contacts) can still be tracked online with the Pixy camera during the behavior and can be used offline to quickly direct the researcher to important parts of the high-speed video images.
The advantage of the color-based system over earlier automatic tracking software packages (Knutsen et al., 2005; Diamond et al., 2008; Voigts et al., 2008; Gyory et al., 2010; O'Connor et al., 2010; Perkon et al., 2011; Voigts et al., 2015) is that tracking depends on color: within some limits, changes in lighting, the presence of motion under the whiskers or around the animal, and even changes in focus are less relevant than in most high-speed video imaging experiments. With the Pixy-based method, it becomes possible to noninvasively, flexibly, and inexpensively configure experiments in which the motion or location of one or more whiskers, limbs, or even the movement of the animal is used as feedback to trigger rewards, optogenetic signals, or even to change the real or virtual environment around the animal (Nashaat et al., 2016).
While our methods are by no means the first to use color filtering, the range of tracking presented here, from adjacent whiskers to freely moving animals, with little essential change in the algorithm, is unique and makes our methods almost universally applicable to a variety of settings and species (Bobrov et al., 2014; Cheung et al., 2014; Varga and Ritzmann, 2016).
Acknowledgments
Acknowledgements: We thank the Charité Workshop for technical assistance, especially Alexander Schill and Christian Koenig. We also thank members of the Larkum lab and, in particular, Christina Bocklisch, Guy Doron, Albert Gidon, Naoya Takahashi, and Keisuke Sehara for useful discussions about earlier versions of this manuscript.
Footnotes
The authors declare no competing financial interests.
This project received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (670118). This project also received funding from the Human Brain Project (720270, HBP SGA1). This research also received support from a Marie Curie fellowship, the Einstein Stiftung, the DFG, and the NeuroCure Center for Excellence.
- Received August 17, 2016.
- Revision received December 31, 2016.
- Accepted January 10, 2017.
- Copyright © 2017 Nashaat et al.
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International license, which permits unrestricted use, distribution, and reproduction in any medium provided that the original work is properly attributed.