Research Article | Methods/New Tools, Novel Tools and Methods

Fast, Flexible Closed-Loop Feedback: Tracking Movement in “Real-Millisecond-Time”

Keisuke Sehara, Viktor Bahr, Ben Mitchinson, Martin J. Pearson, Matthew E. Larkum and Robert N. S. Sachdev
eNeuro 14 October 2019, 6 (6) ENEURO.0147-19.2019; DOI: https://doi.org/10.1523/ENEURO.0147-19.2019
Keisuke Sehara: Institute of Biology, Humboldt University of Berlin, D-10117 Berlin, Germany
Viktor Bahr: Eridian Systems, D-10179 Berlin, Germany
Ben Mitchinson: Department of Computer Science, University of Sheffield, Sheffield, S10 2TP United Kingdom
Martin J. Pearson: Bristol Robotics Laboratory, University of Bristol and University of the West of England, Bristol, BS16 1QY United Kingdom
Matthew E. Larkum: Institute of Biology, Humboldt University of Berlin, D-10117 Berlin, Germany
Robert N. S. Sachdev: Institute of Biology, Humboldt University of Berlin, D-10117 Berlin, Germany


Figures

    Figure 1.

Real-time whisker tracking and triggering with a neuromorphic camera. A, Overview of the FastEvent system. A whisker of an awake head-fixed animal was tracked using a DVS camera. To enhance contrast for post hoc and real-time tracking, the tracked whisker was painted with UV paint (magenta) and illuminated with a UV light. The behavior of the animal was recorded with the DVS camera, and a second color camera recording at 200–300 FPS was used as a reference. The data from the neuromorphic camera were sent to the PC in the form of a series of “events” and were then preconditioned. In the preconditioning step, the ROI was clipped, hot pixels were removed, and noise was reduced with the aid of a graphical user interface, jAER (dotted-line box), that is supplied with the camera. jAER was customized to estimate object position and generate triggers based on the location of an object (green boxes). Briefly, object positions were estimated by computing weighted averages of the events inside the ROI. The distance between the estimated whisker position and the “target region” (not shown in this figure) was monitored and transmitted to a custom, independent service program, the “FastEventServer,” which drove TTL output from a designated board. B, The neuromorphic camera detects motion of the vibrissae. Single representative frames from different time points of behavior are shown from a high-speed camera (top panels). For images from the DVS camera (bottom panels), events that occurred within a ∼20-ms period were integrated and displayed. Scale bars = 5 mm.
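The weighted-average position estimate described in the caption can be sketched as below. This is an illustrative reconstruction, not the customized jAER code used in the paper: the class name, the exponential event weighting, and the ROI layout are assumptions chosen to match the caption's description of an integration time constant.

```python
import math

class EventTracker:
    """Estimate object position from DVS events by an exponentially
    weighted average of event coordinates inside a region of interest.

    `tau_us` plays the role of the integration time constant: older
    events contribute less to the estimate. NOTE: an illustrative
    sketch, not the jAER implementation used in the paper.
    """

    def __init__(self, roi, tau_us=300.0):
        self.roi = roi          # (x_min, y_min, x_max, y_max)
        self.tau_us = tau_us    # integration time constant, microseconds
        self.wx = self.wy = self.w = 0.0
        self.last_ts = None

    def update(self, x, y, ts_us):
        """Feed one event; return the current (x, y) position estimate."""
        x_min, y_min, x_max, y_max = self.roi
        if not (x_min <= x <= x_max and y_min <= y <= y_max):
            return self.position()  # ignore events outside the ROI
        if self.last_ts is not None:
            # decay the accumulated weights by the elapsed time
            decay = math.exp(-(ts_us - self.last_ts) / self.tau_us)
            self.wx *= decay
            self.wy *= decay
            self.w *= decay
        self.last_ts = ts_us
        self.wx += x
        self.wy += y
        self.w += 1.0
        return self.position()

    def position(self):
        if self.w == 0.0:
            return None
        return (self.wx / self.w, self.wy / self.w)
```

With a short time constant, the estimate follows recent events closely; a longer constant averages over more events, trading responsiveness for noise robustness, which is the trade-off explored in Figure 2C,D.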

    Figure 2.

Real-time whisker tracking in artificial settings and in awake mice. The motion of a plucked whisker, attached to the arm of a servo motor sweeping at ∼0.1 to ∼12 Hz, was tracked post hoc using a high-speed camera (blue) and simultaneously with the FastEvent real-time tracking system (orange) using different integration time constants. A, Representative angular positions at different sweep frequencies. Data are shown for 300-μs and 1-ms integration time constants. B, The radii of motion detected by both tracking methods. Error bars for the FastEvent system correspond to the 2.5th and 97.5th percentiles of the data. C, Effects of changing the integration time constant on the gain of the angular motion detected with the real-time system. The regression curves for 100-, 300-, and 1000-μs integration time constants were generated by fitting Butterworth filter characteristics to the observed values. D, Effects of integration time on RMS error. The moment-to-moment positional RMS error was averaged and plotted for each sweep frequency and each integration time constant. Once the real-time FastEvent system settles on the correct frequency of motion, it can track whisker position with ∼0.3-mm accuracy. E–H, Tracking of whisker motion in awake animals. E, Whisker-position traces derived from a high-speed camera (blue) and the DVS neuromorphic camera (orange) in awake head-fixed mice. Whisker positions were recorded simultaneously with both cameras for 1 min and aligned post hoc. F, Wavelet (i.e., time-varying) power spectra. The offline high-speed camera and the neuromorphic camera tracked the whisker similarly, suggesting that the DVS camera tracked whisker dynamics precisely in real time. The power spectra were generated from the traces shown in E. G, Plot of the RMS error for the offline and real-time data. The medians and the 2.5th and 97.5th percentiles across four sessions are shown. The black point above the plot represents the grand average (mean), and the error bars are the SDs, 0.674 ± 1.235 mm. H, Plot of the difference in phase between data analyzed offline and data acquired in real time. The plot represents the mean ± SD of phase differences across the 1- to 30-Hz frequency range. Histograms from four sessions (two sessions each for two animals) were averaged. The black point and the gray error bars indicate the grand average and SD, 20.0 ± 27.3°.
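The moment-to-moment positional RMS error compared in panels D and G can be computed as below. This is a generic sketch of the metric, not the authors' analysis code; the function name and the assumption of equal-length, pre-aligned traces are illustrative.

```python
import numpy as np

def rms_error(offline, realtime):
    """Moment-to-moment RMS error between two aligned position traces.

    `offline` and `realtime` are equal-length 1-D sequences of whisker
    positions (e.g., in mm) sampled on a common time base, as after the
    post hoc alignment described in the caption. Illustrative sketch,
    not the authors' exact code.
    """
    offline = np.asarray(offline, dtype=float)
    realtime = np.asarray(realtime, dtype=float)
    if offline.shape != realtime.shape:
        raise ValueError("traces must be aligned to the same length")
    return float(np.sqrt(np.mean((offline - realtime) ** 2)))
```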

    Figure 3.

Real-time, low-latency triggers generated by whisker position. A, Schematic of the behavioral experiment. Whisking behavior of awake, freely whisking head-fixed animals was recorded using the DVS camera. After processing on the host computer, TTL pulses were generated based on whisker position. These pulses were fed back into the auxiliary TTL port of the DVS camera (magenta), which in turn reported the pulses on the same time base as the motion-related events. B, Comparison of internal trigger (blue) and actual trigger generation (magenta). Whisker positions were estimated in real time and are shown at the bottom (black, solid line). An arbitrary position threshold (blue, dotted line) was set for the session, and whenever the whisker was above this position a TTL-high level was generated (region shaded blue). To estimate the round-trip latency, the timing of the output detected by the program (position evaluation, blue rectangles) and the timing of the TTL pulses detected at the DVS camera (triggers, orange rectangles) were compared post hoc. C, Timestamp-based estimation of round-trip latency, plotted as histograms. Latency distributions for ON events (the whisker entering the target, TTL low to high; top panel) and OFF events (leaving the target, TTL high to low; bottom panel) were examined separately. Most of the triggers were generated within 2.5 ms (mode of histograms, ∼1.6 ms). D, External recording-based estimation of trigger latency. To generate exactly timed motion, two LEDs flashed in alternation at 4 Hz (left, bottom). This flickering was captured by the DVS (left, top). Command signals to the LEDs (right, gray and black traces) and the triggers generated from the DVS (right, blue traces) were recorded. E, Comparison of timestamp-based and external recording-based estimates of trigger latency. Because of the latency of luminance-event generation, timestamp-based estimation resulted in slightly shorter latencies (N = 502 ON events; timestamp-based, 1.19 ± 0.23 ms; external recording-based, 1.29 ± 0.17 ms).
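The threshold logic behind the ON/OFF events in panels B and C can be sketched as below: the TTL level is high whenever the estimated position exceeds the session threshold, and transitions mark the ON (low-to-high) and OFF (high-to-low) events. Function and variable names are illustrative assumptions, not the FastEvent implementation.

```python
def ttl_from_positions(positions, threshold):
    """Convert a position trace into a TTL-like boolean trace.

    Returns the TTL trace plus the sample indices of ON (low-to-high)
    and OFF (high-to-low) transitions, the two event classes whose
    latencies are histogrammed separately in Figure 3C.
    Illustrative sketch; names are assumptions.
    """
    ttl, on_idx, off_idx = [], [], []
    prev = False
    for i, p in enumerate(positions):
        high = p > threshold
        if high and not prev:
            on_idx.append(i)      # whisker entering the target
        elif prev and not high:
            off_idx.append(i)     # whisker leaving the target
        ttl.append(high)
        prev = high
    return ttl, on_idx, off_idx
```

Round-trip latency is then the difference between the timestamp at which the program evaluates such a transition and the timestamp at which the corresponding TTL edge is recorded by the camera.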

    Figure 4.

Mice associate sensory feedback with whisking. A, Schematic of the setup. The mouse was head-fixed, with one of its whiskers dabbed with UV paint. A buzzer (brown) was placed near the setup, and a 568-nm LED (green) was set in the animal’s visual field. The water spout (gray) was positioned to deliver water reward and to monitor the animal’s lick behavior. A virtual target region (magenta) was placed around the sweeping region of the painted whisker. The position of the virtual target was adjusted during individual sessions using the FastEvent system. B, Behavioral task. The task is a multisensory association task in which the animal waits until it receives both auditory (brown) and visual (green) cues at the same time. The animal reports its sensation with a lick on the spout, which triggers reward delivery (2- to 5-μl water). C, Passive paradigm. In the passive paradigm, both auditory and visual cues were controlled by the task controller. Whisk events (trigger, magenta), generated when whisking behavior reached the target position, were monitored and recorded but were not reported back to the animal. Sensory cues, trigger events, and licks are shown in the raster display above the trace of whisker position. D, Active paradigm. In the active paradigm, the visual cue is generated by feedback of whisk events (trigger) when the painted whisker enters the virtual target region. The auditory cue is controlled by the task controller. Whisker positions in C, D across sessions and trials were aligned by their base whisker position (the median position of the whisker during the initial 30 s of each session, without engagement in any behavioral task). E, Profiles of threshold-crossing (whisk) events based on whisker position. Events that triggered an output during successful trials were selected, and trial-wise distributions of the frequency of whisk events during the auditory-cued period (left) and of the latency from the onset of the auditory cue to the initial whisk event (right) were plotted as cumulative histograms. For the calculation of frequency, we first performed a post hoc debouncing procedure so that whisk events had an interval of at least 20 ms. Only trials with more than two whisk events were used for this analysis. The average interevent intervals were inverted to compute the event frequency. Compared with those in the passive paradigm (black), trials in the active paradigm (blue) had significantly more frequent threshold-crossing events (p = 1.32 × 10⁻²⁶, KS test; N = 236 passive vs 646 active trials from six to seven sessions each for two animals) and a lower latency from the auditory cue onset (p = 2.81 × 10⁻⁷⁴, KS test; N = 337 passive vs 1479 active trials from six to seven sessions each for two animals). F, Latency of the lick response from the onset of the visual cue during the auditory-cued period of successful trials. Cumulative histograms were generated for the active (blue) and passive (black) paradigms. The latency was significantly smaller for trials in the active paradigm than for those in the passive paradigm (p = 8.84 × 10⁻²³, KS test; N = 337 passive vs 1479 active trials from six to seven sessions each for two animals).
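The debouncing and frequency computation described for panel E can be sketched as below: keep only events at least 20 ms apart, require more than two surviving events, and invert the mean interevent interval. A generic reconstruction of the stated procedure, not the authors' analysis code.

```python
def debounce(ts_ms, min_interval_ms=20.0):
    """Post hoc debouncing: keep only events separated from the last
    kept event by at least `min_interval_ms` (20 ms in Figure 4E).
    `ts_ms` is a sorted list of event timestamps in milliseconds."""
    kept = []
    for t in ts_ms:
        if not kept or t - kept[-1] >= min_interval_ms:
            kept.append(t)
    return kept

def event_frequency_hz(ts_ms):
    """Mean whisk-event frequency: the average interevent interval,
    inverted. Returns None unless there are more than two events,
    matching the trial-inclusion criterion in the caption."""
    if len(ts_ms) <= 2:
        return None
    intervals = [b - a for a, b in zip(ts_ms, ts_ms[1:])]
    mean_interval_ms = sum(intervals) / len(intervals)
    return 1000.0 / mean_interval_ms
```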

    Figure 5.

Behavioral responses to changing target positions. A, Psychometric responses to changing target positions. The distribution of whisker positions in the initial 30-s period of the session, a spontaneous whisking epoch with no task, is plotted as a histogram (gray). The success rates in the active task, which required mice to position their whiskers at target locations, are plotted as lines (black, foreground). Note that the virtual target regions were changed randomly after every 6–12 trials; the same target region could appear multiple times intermittently. For the black lines, the plot of success rates, the x-axis corresponds to the target position. The dotted lines represent the target positions used in B–D, with the number on each line indicating the corresponding target position. B, Whisking behavior in the third, fourth, and sixth successful trials of a single session in which the target position was varied. C, Distribution of whisker positions during different behavioral phases of each trial. A histogram was generated for each representative trial shown in B. The three behavioral phases comprise the periods before the animal whisks to the target (Wait), while approaching the target (Hit), and after obtaining reward (Lick). D, Histograms of whisker-position distributions averaged across trials throughout the single session. The mode of the distribution of whisker positions as the animal approached the target (Hit) correlated with the virtual target position.
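The per-target success rates plotted in panel A can be tallied as below, with targets recurring intermittently across the session. The function name and the (target, outcome) data layout are assumptions for illustration.

```python
def success_rate_by_target(trials):
    """Success rate for each virtual target position, as in Figure 5A.

    `trials` is a sequence of (target_position, succeeded) pairs in
    session order; the same target may recur intermittently.
    Illustrative sketch with an assumed data layout.
    """
    counts = {}  # target -> (n_trials, n_successes)
    for target, success in trials:
        n, k = counts.get(target, (0, 0))
        counts[target] = (n + 1, k + (1 if success else 0))
    return {t: k / n for t, (n, k) in counts.items()}
```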

    Figure 6.

Strategy of whisker positioning with respect to changing target positions. A, Computation of upper and lower bounds of whisker positions during each successful trial. Lower and upper bounds of high-speed-tracked whisker positions were computed using a sliding window with a 100-ms radius. B, Linear regression of the upper and lower bounds against changes in target position. Each dot corresponds to a target position and is the average value of the lower or upper bound during each behavioral phase. The envelope of whisker positions shifted as the target position moved farther away. C, Linear regression of whisking amplitude with respect to changing target positions. Data are from the same session as in B and are plotted similarly. Whisking amplitude was computed as the difference between the lower and upper bounds. D, Slopes of the linear regressions of set points and whisking amplitudes during different behavioral epochs and phases. Data are from 11 behavioral sessions in two animals. A Mann–Whitney U test was performed to examine whether the set of slopes for each trial phase was significantly larger than zero (symbols at the bottom; ***p < 0.001, **p < 0.01, *p < 0.05). A Wilcoxon signed-rank test with Holm–Bonferroni correction was used to compare values between pairs of different trial phases (symbols at the top; *p < 0.05; NS, p > 0.05). E, Multivariate linear regression analysis. The variance of the actual whisker positions (black) explained the variability of target positions more effectively than the variance of the shuffled target positions (gray; symbols at the bottom; **p < 0.01, Wilcoxon signed-rank test). The variance in the Hit phase explained a significantly larger portion of the variability in target positions than that in the Wait phase (symbols at the top; *p < 0.05; NS, p > 0.05, Wilcoxon signed-rank test with Holm–Bonferroni correction). F, Comparison of regression coefficients for set points (black) and amplitudes (gray). Variability of set points explained target variability more than variability of amplitudes did during the Wait and Hit phases (symbols at the bottom; *p < 0.05; NS, p > 0.05, Wilcoxon signed-rank test). Variability of set points reflected target variability more during the Hit phase than during the Wait or Lick phases (black symbols at the top; *p < 0.05, Wilcoxon signed-rank test with Holm–Bonferroni correction). There was a tendency for amplitude values to increase as the trial proceeded from the Wait through the Hit to the Lick phase, but the differences were not significant (gray symbols at the top; NS, p > 0.05, Wilcoxon signed-rank test with Holm–Bonferroni correction). Refer to the main text for the exact R² and p values.
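The sliding-window envelope of panel A, and the amplitude derived from it in panel C, can be sketched as below. The caption defines amplitude as upper minus lower bound; taking the set point as the envelope midpoint is an assumption here, as the caption does not define it, and the function name and parameters are likewise illustrative.

```python
import numpy as np

def envelope(trace, fs_hz, radius_ms=100.0):
    """Lower and upper bounds of a whisker-position trace within a
    sliding window of the given radius (100 ms in Figure 6A), plus the
    derived amplitude and an assumed set point (envelope midpoint).

    `trace` is a 1-D position trace sampled at `fs_hz`.
    Illustrative sketch, not the authors' analysis code.
    """
    r = int(round(radius_ms * fs_hz / 1000.0))  # radius in samples
    trace = np.asarray(trace, dtype=float)
    lower = np.empty_like(trace)
    upper = np.empty_like(trace)
    for i in range(trace.size):
        window = trace[max(0, i - r): i + r + 1]
        lower[i] = window.min()
        upper[i] = window.max()
    amplitude = upper - lower            # whisking amplitude (Fig. 6C)
    set_point = (upper + lower) / 2.0    # midpoint; assumed definition
    return lower, upper, amplitude, set_point
```

Regressing the per-phase averages of these quantities against target position then yields the slopes compared in panels B–D.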

Movies

Movie 1.

Real-time tracking of whisker positions in a freely whisking mouse. The setup is as described in Figure 2. During the ∼3-min session, the animal was head-fixed and allowed to whisk freely under a high-speed (270 FPS) camera and a neuromorphic (DVS) camera. The crosses indicate the estimated whisker position; the dotted line indicates the virtual target position. A change in the color of these signs from white to magenta shows the TTL signal generated by the FastEvent system as it was fed back to the DVS camera (i.e., the timing of real-time feedback). Information from the DVS camera was aligned and then downsampled to annotate the high-speed video. A 5-s period was used to generate the video.

Keywords

  • feedback
  • kinematics
  • Motor
  • neuro-morphic
  • somatosensory
  • virtual reality
