DeepLabCut: markerless pose estimation of user-defined body parts with deep learning

Nat Neurosci. 2018 Sep;21(9):1281-1289. doi: 10.1038/s41593-018-0209-y. Epub 2018 Aug 20.

Abstract

Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for the observation and recording of animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist with computer-based tracking, but markers are intrusive, and the number and location of the markers must be determined a priori. Here we present an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in multiple species across a broad collection of behaviors. Remarkably, even when only a small number of frames are labeled (~200), the algorithm achieves excellent tracking performance on test frames that is comparable to human accuracy.
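To make the workflow concrete, below is a minimal sketch of the transfer-learning pipeline the abstract describes, written against the publicly released DeepLabCut Python toolbox that implements this method. The function names follow the toolbox's documented API; the project name, experimenter name, and video paths are hypothetical placeholders, and real projects typically adjust the frame-extraction and training settings.

    import deeplabcut

    # Create a project; names and paths here are illustrative placeholders.
    config_path = deeplabcut.create_new_project(
        "reaching-task", "researcher",
        ["/data/videos/mouse_reach.avi"],
        copy_videos=True,
    )

    # Extract a small set of frames to label; per the paper, on the order
    # of ~200 labeled frames can suffice for good tracking performance.
    deeplabcut.extract_frames(config_path, mode="automatic", algo="kmeans")

    # Manually annotate the user-defined body parts in the labeling GUI.
    deeplabcut.label_frames(config_path)

    # Build the training dataset and fine-tune the pretrained network
    # (transfer learning) on the labeled frames.
    deeplabcut.create_training_dataset(config_path)
    deeplabcut.train_network(config_path)

    # Evaluate on held-out test frames, then run inference on new videos.
    deeplabcut.evaluate_network(config_path)
    deeplabcut.analyze_videos(config_path, ["/data/videos/mouse_reach.avi"])
    deeplabcut.create_labeled_video(config_path, ["/data/videos/mouse_reach.avi"])

This sketch only outlines the standard project-creation, labeling, training, and analysis steps; consult the DeepLabCut documentation for the full set of options.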

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Animals
  • Behavior*
  • Behavior, Animal*
  • Deep Learning*
  • Drosophila melanogaster
  • Humans
  • Male
  • Mice
  • Mice, Inbred C57BL
  • Nerve Net / physiology
  • Neural Networks, Computer
  • Odorants
  • Posture
  • Psychomotor Performance / physiology
  • Transfer, Psychology
  • Video Recording / methods*