Research Article | Open Source Tools and Methods, Novel Tools and Methods

neurotic: Neuroscience Tool for Interactive Characterization

Jeffrey P. Gill, Samuel Garcia, Lena H. Ting, Mengnan Wu and Hillel J. Chiel
eNeuro 24 April 2020, 7 (3) ENEURO.0085-20.2020; DOI: https://doi.org/10.1523/ENEURO.0085-20.2020
Jeffrey P. Gill
1Department of Biology, Case Western Reserve University, Cleveland, OH 44106-7080
Samuel Garcia
2Centre de Recherche en Neuroscience de Lyon, Centre National de la Recherche Scientifique Unité Mixte de Recherche 5292– Institut National de la Santé et de la Recherche Médicale U1028–Université Claude Bernard Lyon 1, Lyon, 69675 Bron CEDEX, France
Lena H. Ting
3The Wallace H. Coulter Department of Biomedical Engineering, Emory University and Georgia Tech, Department of Rehabilitation Medicine, Division of Physical Therapy, Emory University School of Medicine, Atlanta, GA 30322-1119
Mengnan Wu
4The Wallace H. Coulter Department of Biomedical Engineering, Emory University and Georgia Tech, Atlanta, GA 30322-1119
Hillel J. Chiel
5Departments of Biology, Neurosciences, and Biomedical Engineering, Case Western Reserve University, Cleveland, OH 44106-7080
Figures & Data

Figures

Figure 1.

GUI. The GUI of neurotic combines data from multiple sources, such as electrophysiology data files and behavioral video files, into a unified and synchronized view for reviewing and annotating experimental data. This example shows data collected as an animal (Aplysia californica) ate seaweed (see Materials and Methods, Experimental methods). Implanted electrodes captured extracellular signals from a muscle and three nerves associated with feeding. Video of behavior and swallowing force from food attached to a force transducer were also recorded. A, The interface has movable and resizable panels created using the Python package ephyviewer (http://pypi.org/project/ephyviewer). Across the top of the interface, a time navigation bar allows a user to jump through the record or play it at different speeds (keyboard shortcuts provide similar features). The “time width” box controls the displayed duration. Muscle, nerve, and force signals are displayed (top left panel) along with the video (top right panel). A vertical line indicates the time corresponding to the displayed video frame. Panels can be zoomed using the mouse, and time can be expanded or contracted using the mouse or the time width input. User-defined events are listed (bottom right panel), providing bookmarks to different parts of the experiment. A tool called the epoch encoder (bottom left panel) allows users to label time periods (controls are collapsed here; see D for details). Some panels can be double-clicked to open dialogue windows with additional options (e.g., to hide plotted signals). B, Before displaying a user interface like that in A, neurotic provides a window for selecting datasets. Users load a dataset list from a manually created human-readable YAML configuration file that provides paths to data files and other parameters (Fig. 2). If data and video files are available on the internet, neurotic can be configured to locate and download files if they are not already available locally. Icons to the left of each dataset name indicate whether files are already available locally or not. This window provides menus for loading options and color themes. The most important of these is the toggleable fast loading option, which is available for a subset of data file types and provides much faster loading with less memory consumption in exchange for skipping data processing steps like filtering and spike detection. C, Using optional parameters provided in the configuration file (Fig. 2G), peaks can be detected on signals using amplitude-threshold window discriminators (see Fig. 3 for algorithm). Peaks identified by the same discriminator are grouped as a unit and plotted both as points on the signals and as raster plots in a separate panel (hidden in A but shown in this panel). D, The epoch encoder panel displays horizontal bars to represent user-created, labeled time periods. Periods of interest, such as a particular behavior, the phase of a rhythmic behavior, or the activity of an identified neuron may be labeled using the epoch encoder. These data are saved to a plain text spreadsheet format (CSV) that may be used for further analysis. The epoch encoder panel allows precise control of the temporal parameters of epochs, merging of adjacent epochs, and filling gaps between epochs. Epochs can be placed using the range selection feature (magenta rectangle) with the mouse by dragging its edges. Keyboard shortcuts exist for rapid encoding. A table lists the existing epochs and allows them to be modified or removed.
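The synchronized video display in A depends on mapping a time on the signal clock to a video frame. A minimal sketch of that mapping, assuming a constant frame rate; the helper name and signature are hypothetical and not part of neurotic's API:

```python
def frame_for_time(t_signal, video_offset, fps):
    """Video frame index displayed at signal time t_signal (seconds).

    video_offset is the video's start time on the signal clock; a
    negative value means the camera started before data acquisition.
    """
    t_video = t_signal - video_offset  # elapsed time on the video clock
    return max(0, round(t_video * fps))
```

For example, with the offset of −2.51 s described in Figure 2C and a 30 fps camera, signal time 0 maps to frame 75.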

Figure 2.

An example configuration file. To visualize experimental data with neurotic, datasets must be listed and described within human-readable plain text YAML configuration files. Multiple configurations for the same or different datasets may be listed within one file and are separated into blocks using blank lines and indentation. This example shows one configuration that is similar to that used to create Figures 1, 3. Each set of parameters shown is optional. A, Each configuration is given a unique name, e.g., “feeding experiment.” Details associated with the configuration are indented beneath the name. A brief description of the dataset may be provided. The names and descriptions of datasets are displayed when a YAML file is loaded, giving the user the option to select the dataset to view (Fig. 1B). B, Paths to related files found locally on the computer, e.g., signals and video from a single experiment, may be located within a single directory specified by data_dir (such a file tree structure is convenient but not necessary as neurotic can accept relative or absolute paths). A file containing signal data that is readable by the Python package Neo (Garcia et al., 2014) is specified using data_file. Neo supports many file types (see list: https://neo.readthedocs.io/en/latest/io.html#module-neo.io). Signals are read from the data file, processed according to other, optional configuration settings, and plotted. For example, in this figure, sections D, E, G affect plotting. C, A video file may be associated with the signal data. Synchronization parameters can be given for controlling the video and signal data alignment. For example, in this case, video capture began 2.51 s before signal data acquisition, so −2.51 is provided for video_offset to shift the video start time. Parameters for correcting for frame rate inaccuracies or clipping the video are also available. D, All signals or a subset of signals will be plotted according to the parameters given under plots, which control plot range and labeling (Fig. 1A, top left panel). E, Signals may be filtered before plotting. F, An optional epoch encoder GUI panel (Fig. 1D) creates annotations that may be saved to a spreadsheet (CSV) file. The epoch encoder allows user-defined labels to be attached to time periods. G, Peaks may be identified in the signals using amplitude thresholds. Each amplitude discriminator is applied to the specified signal channel and may be constrained to periods marked with a particular epoch label. This creates one spike train for each amplitude discriminator, plotted both as points on the signals and as a raster plot (Fig. 1C). See Figure 3 for algorithm details. H, Bursts of activity in spike trains may be detected using initiation and termination firing frequencies.
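The layout described above might look like the following minimal fragment. Only data_dir, data_file, and video_offset are named in the caption; the remaining keys, file names, and values are illustrative placeholders, not neurotic's documented schema:

```yaml
feeding experiment:                  # A: unique configuration name
    description: Aplysia feeding on seaweed attached to a force transducer

    data_dir: path/to/experiment     # B: directory holding the related files
    data_file: signals.data          # B: a Neo-readable signal file
    video_file: feeding.mp4          # C: illustrative key for the video
    video_offset: -2.51              # C: video began 2.51 s before signals
```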

Figure 3.

    Peak detection using a window discriminator to define neuronal bursts. Panels show an example of applying three amplitude discriminators to a single channel to obtain three sets of peaks, classified as action potentials originating from identified motor neurons B38 (McManus et al., 2014), B6/B9, and B3 (Lu et al., 2015). The configuration settings are shown in Figure 2G. A, The raw signal obtained from an extracellular whole nerve recording of BN2 is plotted and contains activity from motor and sensory neurons. Identified neurons are distinguishable by their relative amplitude and timing within the feeding motor pattern (Lu et al., 2013). B, Epochs labeled “B38 activity” and “B3/B6/B9 activity” were placed manually using the epoch encoder (Fig. 1D) based on prior work. Amplitude windows for spike classes were estimated manually and specified in the configuration file. C, At load time, using algorithms from the Python package elephant (http://pypi.org/project/elephant), local maxima or minima were located within the given amplitude range and contained within a period with an appropriate epoch label. These points are displayed on the signals and as raster plots (Fig. 1C). D, Subsets of each spike train were grouped into bursts defined by firing frequency thresholds (boxes). Notice that the initial B38 spike and the final B3 spike were excluded from the bursts because they did not meet the firing frequency conditions. Burst times may then be used for further analysis.
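The window-discriminator logic of B, C can be sketched in plain Python (neurotic itself delegates peak finding to elephant, as the caption notes); discriminate_peaks and its arguments are hypothetical names chosen for illustration:

```python
def discriminate_peaks(signal, times, amp_min, amp_max, epochs=None):
    """Indices of local maxima whose amplitude lies in [amp_min, amp_max].

    epochs: optional list of (start, stop) time windows; peaks falling
    outside every window are discarded, mirroring a discriminator
    constrained to one epoch label.
    """
    peaks = []
    for i in range(1, len(signal) - 1):
        if not (signal[i - 1] < signal[i] > signal[i + 1]):
            continue  # not a local maximum
        if not (amp_min <= signal[i] <= amp_max):
            continue  # outside the amplitude window
        if epochs is not None and not any(
            start <= times[i] <= stop for start, stop in epochs
        ):
            continue  # not inside any labeled epoch
        peaks.append(i)
    return peaks
```

Detecting minima (for downward-going spikes) would follow the same pattern with the comparisons inverted.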

Figure 4.

    Using neurotic to explore an adaptive response to load during feeding. A, As an animal swallowed a strip of unloaded (unanchored) seaweed, activity of identified feeding motor neurons B6/B9 and B3 was identified from the extracellular nerve recording of BN2. Using the synchronized video and epoch encoder, the timing of inward movement of the strip and the approximate period during which the identified neurons fired were labeled (gray and blue bars, respectively). Using the peak detection feature, medium-sized spikes were classified as B6/B9, and large spikes were classified as B3, which are major motor neurons known to contribute substantially to swallowing force (Lu et al., 2015). Spikes are plotted as points and in raster plots. B, In a separate trial, the same animal fed on a strip of unbreakable seaweed anchored to a force transducer. The force generated by the animal is plotted with the extracellular nerve signals. In response to the increased load, the animal activated the identified motor neurons at high frequency for longer, leading to greater force. C, Using firing frequency criteria determined by prior work (Lu et al., 2015), all spikes generated when feeding on an unloaded seaweed strip associated with the B6/B9 motor neurons (light blue) were grouped into a burst. The two spikes generated by B3 (dark blue) did not meet the criterion. D, Using the same criteria when the animal fed on a loaded strip, all B6/B9 spikes and all but one B3 spike were grouped into bursts. The data suggest that the B3 neuron is more likely to activate at high frequency when the animal encounters load, and the B6/B9 neurons are active for a longer duration (Gill and Chiel, 2020).
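The firing-frequency grouping applied in C, D (and in Figure 3D) can be sketched as follows; find_bursts is a hypothetical helper, not neurotic's implementation:

```python
def find_bursts(spike_times, f_init, f_term):
    """Group sorted spike times (seconds) into bursts.

    A burst starts when the instantaneous firing frequency (1/ISI)
    reaches f_init (Hz) and ends once it falls below f_term (Hz).
    Returns a list of (first_spike_index, last_spike_index) pairs.
    """
    bursts = []
    i, n = 0, len(spike_times)
    while i < n - 1:
        if 1.0 / (spike_times[i + 1] - spike_times[i]) >= f_init:
            # burst initiated by this spike pair; extend while fast enough
            j = i + 1
            while j < n - 1 and 1.0 / (spike_times[j + 1] - spike_times[j]) >= f_term:
                j += 1
            bursts.append((i, j))
            i = j + 1
        else:
            i += 1
    return bursts
```

Under these rules a leading spike whose following interval is too long never initiates a burst, and a trailing slow spike is dropped, matching the exclusions described in Figure 3D.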

Figure 5.

    Using neurotic to explore balance beam walking EMG and kinematics. The vertical line corresponds to the time of the video frame shown where the participant lifts her left arm and extends it at the elbow as she regains balance while walking along a narrow beam (1.8 cm wide, 3.66 m long). Time-synched EMG signals show that the left triceps is activated before this event (second trace), consistent with extending the elbow, and then is followed by activation of the biceps (first trace), consistent with braking elbow extension. The z-coordinates (elevation) of the markers at the center of the clavicles and on each heel are also plotted. At the time of the illustrated video frame, the dip in clavicle elevation (fifth trace) is associated with the lowering of the participant’s right side of the torso as part of the balancing response. The left heel marker (sixth trace) shows that the left leg was nearing the ground just before being placed on the beam. The right heel marker (seventh trace) shows a slow rise as the participant began to roll forward on the ball of her right foot.

Tables

Table 1

Important links

Resource | URL
Source code | https://github.com/jpgill86/neurotic
Documentation | https://neurotic.readthedocs.io
PyPI distribution | https://pypi.org/project/neurotic
Anaconda distribution | https://anaconda.org/conda-forge/neurotic
Standalone installers | https://github.com/jpgill86/neurotic/releases
Issue tracker | https://github.com/jpgill86/neurotic/issues

Keywords

  • behavioral neuroscience
  • data sharing
  • open source
  • Python
  • video synchronization
  • visualization


Copyright © 2023 by the Society for Neuroscience.
eNeuro eISSN: 2373-2822
