Research Article | Methods/New Tools, Novel Tools and Methods

Pixying Behavior: A Versatile Real-Time and Post Hoc Automated Optical Tracking Method for Freely Moving and Head Fixed Animals

Mostafa A. Nashaat, Hatem Oraby, Laura Blanco Peña, Sina Dominiak, Matthew E. Larkum and Robert N. S. Sachdev
eNeuro 14 February 2017, 4 (1) ENEURO.0245-16.2017; https://doi.org/10.1523/ENEURO.0245-16.2017
Author affiliations:

1. Neurocure Cluster of Excellence, Humboldt Universität zu Berlin, Germany (M.A.N., H.O., L.B.P., S.D., M.E.L., R.N.S.S.)
2. Berlin School of Mind and Brain, Humboldt Universität zu Berlin 10099, Germany (M.A.N.)
3. Erasmus Program, Facultad de Biología, Universidad de Barcelona, Barcelona 08007, Spain (L.B.P.)
Figures & Data

Figures
Figure 1.

    A, Setup design. Head-fixed mice are acclimatized to whisker painting and trained to use their whiskers to contact a piezo-film touch sensor. A Pixy camera tracks the whiskers in real time (left), while a high-speed color camera simultaneously acquires data. B, Paradigm for the whisker task. A sound cue initiates the trial. The animal whisks one of the two painted whiskers into contact with a piezo-film sensor, and if the contact reaches threshold, the animal obtains a liquid reward. There is a minimum inter-trial interval of 10 s. C, Capturing whisker motion in real time. The movement and location of the D1 (green, S = 1, signature 1) and D2 (red, S = 2, signature 2) whiskers are shown at two time points, frame 1 and frame 2 (below). The waveform of the whisker data reflects the spatial location and the dimensions of the tracked box around the whisker. The waveforms in the middle show the movement of the two whiskers toward and away from each other.
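    The tracking scheme in C, where each painted whisker is detected as a color "signature" with a bounding box in every frame, can be sketched as a simple data-handling routine. This is an illustrative sketch, not the authors' code: the record layout (signature, x, y, width, height) follows the Pixy camera's documented object-block output, and the function name is our own.

    ```python
    # Sort a stream of Pixy-style "block" records into per-whisker traces.
    # Signature 1 = green-painted D1 whisker, signature 2 = red-painted D2.

    def blocks_to_traces(frames):
        """frames: list of per-frame block lists; returns {signature: [x or None]}."""
        traces = {1: [], 2: []}
        for blocks in frames:
            seen = {sig: None for sig in traces}
            for sig, x, y, w, h in blocks:
                if sig in seen and seen[sig] is None:
                    seen[sig] = x              # horizontal position of the tracked box
            for sig in traces:
                traces[sig].append(seen[sig])  # None marks a dropped detection
        return traces

    # Two frames: both whiskers detected, then only D1.
    frames = [
        [(1, 120, 40, 6, 30), (2, 150, 42, 5, 28)],
        [(1, 131, 41, 6, 30)],
    ]
    print(blocks_to_traces(frames))
    # {1: [120, 131], 2: [150, None]}
    ```

    Keeping a None placeholder for dropped detections preserves frame alignment between the two whisker traces.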

Figure 2.

    Real-time automated tracking of multiple whiskers. A, Pixy data from the D1 and D2 whiskers (left, raw and smoothed) or the β and γ whiskers (right, smoothed) as a mouse performs five auditory go-cue-triggered trials. The mouse moves a whisker into contact with a piezo-film sensor (bottom); contact with the sensor triggers a reward. The cue onset and reward trigger times are marked below the whisker movement traces. Note that the spatial locations of the D1 and D2 whiskers are distinct; the positions of the two whiskers rarely overlap. In these trials, the distance between the two whiskers ranged from approximately 2 to 10 mm (distances converted into arbitrary units that denote spatial location). B, Average position during task performance. The D1 and D2 whiskers move differently (left): the average positions of the two whiskers differ at rest (before zero) and at contact (at zero). The D2 whisker, which contacts the piezo-film sensor and is rostral to the D1 whisker, moves more than the D1 whisker. In contrast, the positions of the two arc whiskers overlap at rest and at contact, but even here, the average motion of the whisker used to make contact with the sensor differs from that of the adjacent whisker. C, Tracking task performance via whisker movement over days. The performance of an animal trained in the go-cue task was assessed by monitoring the motion of the B2 whisker over days of training. The go-cue-triggered motion of the B2 whisker is task related by d 9 of training (compared with the imperceptible motion of the same whisker after the cue on d 1). D, The contact-triggered motion is also faster and larger by d 9 compared with d 1 (left).
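    The conversion between Pixy's pixel coordinates and millimeters mentioned in A amounts to a single calibration factor measured at the imaging plane. A minimal sketch, with an assumed (not published) pixels-per-millimeter value:

    ```python
    # Hypothetical calibration: 20 px per mm at the imaging plane.
    # This value is illustrative, not taken from the paper.
    MM_PER_PIXEL = 0.05

    def whisker_distance_mm(x_d1_px, x_d2_px, mm_per_px=MM_PER_PIXEL):
        """Absolute horizontal distance between two tracked whiskers, in mm."""
        return abs(x_d2_px - x_d1_px) * mm_per_px

    print(whisker_distance_mm(120, 160))  # 2.0
    print(whisker_distance_mm(120, 320))  # 10.0
    ```

    With such a calibration the 2-10 mm inter-whisker range reported above maps directly onto pixel separations in the Pixy image.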

Figure 3.

    Post hoc automated tracking. A, Diagram of a Pixy camera capturing whisker motion previously recorded with a high-speed video camera and played back in slow motion on a monitor. B, Comparison of the high-fidelity signatures of the D1 and D2 whiskers (top and bottom), recaptured automatically by the Pixy camera in slow motion (orange), with the data acquired in real time (black). C, The motion of two points on the whisker pad and one whisker is tracked in real time and post hoc in slow motion. The motion of the D1 whisker, the pad point under the D1 whisker, and a second pad point under the A2 whisker could be tracked easily in real time, and the same trials could be examined post hoc by analyzing slow-motion playback of the high-speed video data. The motion of the whisker pad appears to be a filtered version of the whisker motion. The motion of the D1 whisker in both real time (left) and post hoc (right) reveals differences in the set-point of protraction on each of the five trials, but the real-time Pixy data capture the entire envelope of both the whisker and the pad motion (bottom, expanded record of the trial above on the right).

Figure 4.

    Single-whisker tracking in infrared light. Top, Pixy image of a whisker painted with yellow UV-sensitive paint, illuminated with infrared light only and automatically tracked in real time. Bottom, Output from the Pixy camera showing periods with (IR ON) and without (IR OFF) infrared illumination.

Figure 5.

    Pixy tracking of two limbs. A, Schematic of a mouse head-fixed on a treadmill; the two forepaws, one painted green and the other red, are tracked with a Pixy camera. B, The position and velocity of the treadmill and the alternating up-and-down motion of the limbs are tracked in real time.

Figure 6.

    Tracking head rotation and location of freely moving animals. A, The head rotation (top) and the x (middle) and y (bottom) coordinates of the animal's position were tracked simultaneously. Four time points corresponding to the four frames (right) are shown, where the animal's head direction and position in the box change from moment to moment. B, The animal's position over 3 min was tracked and a heat map of the preferred location was created (red, more time; blue, less time). C, The locations of two animals in the same enclosure can be tracked distinctly, including each animal's head rotation and position. Pixy tracking is shown by the boxes around the animals' heads.
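    Head direction from a multicolored head marker, as in A, can be recovered by taking the angle of the vector between two tracked color centers. This is an illustrative sketch, not the authors' code; the marker geometry and function name are our own assumptions (Pixy's two-color "color code" mode can report a comparable angle directly).

    ```python
    import math

    def head_angle_deg(front_xy, back_xy):
        """Angle of the back-to-front marker vector, in degrees; 0 = rightward (+x)."""
        dx = front_xy[0] - back_xy[0]
        dy = front_xy[1] - back_xy[1]
        return math.degrees(math.atan2(dy, dx))

    # Front marker up and to the right of the back marker: 45 degrees.
    print(head_angle_deg((10.0, 10.0), (0.0, 0.0)))  # 45.0
    # Front marker directly above the back marker: 90 degrees.
    print(head_angle_deg((0.0, 5.0), (0.0, 0.0)))    # 90.0
    ```

    Using atan2 (rather than atan of dy/dx) keeps the angle well defined over the full 360° range of head rotations.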

Tables

Table 1.

    Comparison of videography, optoelectronic, and EMG methods to Pixy

    | # | Feature | Bermejo et al. (1996) | Knutsen et al. (2005) | Voigts et al. (2008) | Ritt et al. (2008) | O'Connor et al. (2010) | Gyory et al. (2010) | Perkon et al. (2011) | EMG | Pixy (our method) |
    |---|---|---|---|---|---|---|---|---|---|---|
    | 1 | Tracking principle | Opto | Video | Video | Video | Video | Video | Video | Muscle | Opto/video |
    | 2 | Spatial element | Single point | Multiple points | Multiple points | Multiple points | Multiple points | Multiple points | Multiple points | N/A | Multiple points |
    | 3 | Real time | Yes | No | No | No | No | No | No | Yes | Yes |
    | 4 | Individual whisker | Yes | Yes | Yes | Yes | Yes | No | No | No | Yes |
    | 5 | Number of single identifiable whiskers | One whisker on each side | One whisker on each side | One whisker on each side | Three whiskers | Five whiskers | N/A | N/A | N/A | Two whiskers (up to seven in principle) |
    | 6 | Whisker removal | Yes | Yes | Yes | Yes | Yes | N/A | N/A | No | No |
    | 7 | Limitation | Whisker thickness | Contrast and resolution | Contrast and resolution | Contrast and resolution | Contrast and resolution | Contrast and resolution | Contrast and resolution | N/A | Illumination and color |
    | 8 | Method shows | Single whisker | Single row | C1-4 whiskers | Single whiskers | Multiple whiskers | Two rows | Full whisker pad | Whisker pad | Two whiskers / one whisker and two pad points / two paws |
    | 9 | Head tracking | No | Yes | Yes | Yes | No | Yes | Yes | Yes | Yes |
    | 10 | Head tracking requirement | N/A | Additional light source for the eye | Tip of nose | Contour edge/whisker base | N/A | No requirement | No requirement | Wire in muscle | Marker glued to head |
    | 11 | Compatible in IR | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes (single whisker) |
    | 12 | Algorithm flexibility | Yes (not automatic) | No | No | No | No | No | No | Yes (with wires) | Yes |
    | 13 | Unrestrained animal | No | Yes | Yes | Yes | No | Yes | Yes | Yes | Yes (whole animal) |
    • Here we compare 13 features of 7 earlier tracking methods, including optoelectronic (Opto) methods, electromyography (EMG), and whisker-tracking algorithms combined with high-speed videography, to our Pixy-based method. The features compared are: 1) Tracking principle: tracking based on videography, optoelectronic methods such as beam breaking, EMG, or color. 2) Spatial coordinate system: beam breaking has a distinct (single or multiple) spatial coordinate, whereas videography can track over multiple spatial locations. 3) Real-time operation. 4-6) Number of objects tracked: a single whisker or multiple individual whiskers, with or without plucking or removing whiskers. 7) Limiting element of each method: lighting, contrast, resolution, and whisker length for videography; color and painting for Pixy. 8) Output: single whisker, multiple whiskers, or whisker and whisker pad. 9, 10) Head-direction tracking: whether it is possible, and whether the eye, the tip of the nose, or a color marker needs to be tracked. 11) Ability to track in infrared light: all the high-speed cameras can work with infrared light, as can EMG and optoelectronic methods; the Pixy camera is limited in this context because in infrared it can track only a single spatially distinct point. 12) Flexibility in tracking multiple body parts: cameras can be used to track any object, but optoelectronic methods, EMG, and even automated video tracking systems have to be optimized or positioned for the object of interest. 13) Ability to use the system in unrestrained animals.

Movies

  • Video 1.

    Real-time tracking of the D1 and D2 whiskers. The left panel shows the real-time data transmitted from Pixy to data files. The top right panel shows the simultaneously acquired high-speed video of the two whiskers, and the bottom right shows the Pixy view. The D2 whisker is painted red and appears as the red waveform on the top left; the D1 whisker is painted green and appears as the green waveform on the left. The yellow/black boxes are text-mark indicators showing that Pixy is transmitting data in real time via the USB interface. The positions of the two whiskers do not overlap: they are not at the same point in space at the same time, in the videos or in the waveforms. The set point of both whiskers changes from moment to moment (from 5 s to 8 s in the video). The actual distance moved in millimeters can be seen in both the high-speed and the Pixy video.

  • Video 2.

    Pixy analysis of slow-motion video data. The color high-speed video can be played back in slow motion (left panel). The Pixy camera and PixyMon (middle panel) can be used to track the positions of the two whiskers, and the data can be extracted into a data file (right panel).

  • Video 3.

    Pixy under infrared illumination. A single painted whisker, shown in the video on the right, is tracked in real time (left panel) under infrared illumination. At 3 s into the video, the infrared light is turned off and the whisker's color signature disappears as well. When the light is turned on again, the whisker can be tracked again.

  • Video 4.

    Tracking forepaw movements. The painted limbs can be tracked in real time for alternating up-and-down movement. The red trace shows the up/down movement of the right limb; the green trace shows the left limb. The treadmill position and velocity are shown in the traces below.

  • Video 5.

    Tracking a single animal's head rotation/direction and position in real time. The Pixy camera tracks a multicolored piece of Styrofoam fixed to the animal's head-plate under regular lighting conditions. The red trace on the top left shows the head-direction angle, while the blue trace in the middle left and the green trace in the bottom left show the horizontal and vertical movement, respectively.


Keywords

  • Behavioral Analysis
  • closed-loop behavior
  • head-fixed behavior
  • real-time behavior
  • sensorimotor integration
  • whisker kinematics
