Research ArticleOpen Source Tools and Methods, Novel Tools and Methods

An Open Source Platform for Presenting Dynamic Visual Stimuli

Kyra Swanson, Samantha R. White, Michael W. Preston, Joshua Wilson, Meagan Mitchell and Mark Laubach
eNeuro 2 April 2021, 8 (3) ENEURO.0563-20.2021; https://doi.org/10.1523/ENEURO.0563-20.2021
All authors: Department of Neuroscience, American University, Washington, DC 20016.

Figures & Data

Figures

Figure 1.

    Hardware and wiring for the LED matrices. The required wiring is shown for connecting the Arduino microcontroller (upper left) to the LED matrices (lower row) and the external control system that selects stimulus patterns based on TTL serial inputs. The version reported here allows for stimulus control by four distinct TTL inputs. Connectivity with the external control system can easily be arranged using a standard breakout board (upper right). Pins on the backpack controller for LED matrices are shown above the LED matrices.

Figure 2.

Visual discrimination behavior using the LED matrices. A, Rats were trained and tested in an operant chamber with the LED matrices mounted above response ports on one side of the chamber and a reward port containing a spout mounted on the opposite side. Each trial was initiated by a nosepoke response in the center port to the "trial start" cue, followed by presentation of either the high-luminance or the low-luminance cue. Both cues were presented on discrimination trials. B, On detection trials, rats made more errors when detecting the low-luminance cue than the high-luminance cue. On discrimination trials, rats chose the high-luminance cue more often than the low-luminance cue. C, Rats responded more slowly on discrimination trials than on detection trials. D, Reduced-luminance testing. The test session started with one filter layer over the LED matrices, and rats had to report the low-luminance cue at the left or right nosepoke port. Approximately every 30 trials, an additional filter layer was added, up to a total of five layers. E, As each layer was added, the overall error percentage increased. F, Response latencies were stable across luminance levels. *p < 0.05, **p < 0.01, ***p < 0.001. Error bars represent 95% confidence intervals.

Tables

Table 1

    Luminance levels for the LED matrices with 1–16 pixels illuminated

Number of LED pixels illuminated | Peak luminance (μW)
1 | 6.2
2 | 9.3
4 | 18.5
8 | 30.2
16 | 58.0
Table 2

    Luminance levels for two illuminated pixels with 0–5 filters placed over the LED matrices

Number of filters over two LED pixels | Peak luminance (μW)
0 | 9.3
1 | 2.5
2 | 0.7
3 | 0.2
4 | 0.1
5 | <0.1
Table 3

    Parts list

Item | Quantity | Source, useful resources
Arduino Uno | 1 | https://store.arduino.cc/usa/arduino-uno-rev3
1.2" 8 × 8 LED matrix with backpack | 3 | https://www.adafruit.com/product/870 ; https://learn.adafruit.com/adafruit-led-backpack/1-2-8x8-matrix
Half breadboard | 1 | https://www.adafruit.com/product/4539
10 kΩ resistor | 12 | https://www.adafruit.com/product/2784
4-strand 22-gauge shielded hookup wire | ~5 feet | https://www.amazon.com/Gauge-Conductor-Stranded-Copper-Security/dp/B01CT8IUIO
22-gauge solid wire [4–5 colors (Rd, Bk, C1, C2, C3)] | ~2 feet per color | https://www.amazon.com/TUOFENG-Wire-Solid-different-colored-spools/dp/B07TX6BX47
5-way wire connectors | 4 | https://www.amazon.com/Wago-221-415-LEVER-NUTS-Conductor-Connectors/dp/B06XH47DC2/
3D-printed holder | 3 | GitHub, Extended Data 1
#4-40 machine screws, 8 mm total length | 6 | https://www.digikey.com/en/products/detail/keystone-electronics/9900/317321
Adafruit_GFX library | | https://learn.adafruit.com/adafruit-gfx-graphics-library
Adafruit_LEDBackpack library | | https://learn.adafruit.com/adafruit-led-backpack/downloads
Stimulus code | | GitHub, Extended Data 1

Optional:
3D-printed nosepoke ports | 3 | GitHub, Extended Data 1
IR beam-break sensor | 3 | https://www.adafruit.com/product/2167
#6-32 machine screws, 11 mm total length | 12 | https://www.digikey.com/en/products/detail/keystone-electronics/9904/317325
DuPont jumper connectors (connect Arduino to LED backpack) | | https://www.amazon.com/HJ-Garden-Connectors-Terminal-1-12Pin/dp/B07BDJ63CP

Extended Data

Extended Data 1

• README.txt, overview of the files in Extended Data 1.

    • StimCodes.png, documentation on producing the visual patterns over three LED matrices as described in this manuscript.

    • LEDMatrix-Addressing.png, documentation on addressing for the LED Backpacks.

• WiringDiagram.png, image showing specific wiring for the LED Backpacks, Arduino, and breadboard.

    • FullSetUp.png, image showing wiring diagram for LED Backpacks, Arduino, breadboard, and relays to integrate with MedPC control system.

    • nosepoke.stl and nosepoke-block.stl, design files for the optional nosepoke ports.

    • LED-Matrix-Holder-Part1.stl and LED-Matrix-Holder-Part2.stl, design files for the 3D-printed holders for the LED matrices.

    • VisualStimuli-Arduino-eNeuro.ino, Arduino code for the device.

The files listed above are included in Extended Data 1, which can be found at https://github.com/LaubachLab/LED-matrices. Download Extended Data 1, ZIP file.

In this issue: eNeuro Vol. 8, Issue 3, May/June 2021.


Keywords

  • Arduino
  • decision making
  • discrimination
  • open source
  • psychophysics
  • visual



Copyright © 2026 by the Society for Neuroscience.
eNeuro eISSN: 2373-2822
