Simbrain 3.0: A flexible, visually-oriented neural network simulator
Introduction
Simbrain 3.0 (http://www.simbrain.net/), a major revision and overhaul of Simbrain 2.0 (Yoshimi, 2008), is an open-source neural network tool which can simulate many different network architectures using a visually rich and intuitive interface. This makes it ideal for education and rapid prototyping. Simbrain 3.0 offers greatly improved performance: it can handle thousands of model neurons and hundreds of thousands of synaptic connections on a standard desktop PC. Simbrain is thus increasingly viable as a research tool, particularly in computational neuroscience.
Simbrain’s flexibility is based on its modular “tool-kit” structure and scripting abilities. In the last few decades, artificial neural networks (ANNs) have become prominent in many fields, including machine learning, neuroscience, and psychology. The thousands of models that have been developed in these fields are largely based on a common core of basic components and functions. Simbrain supports most of these core components as graphical elements and allows them to be combined in arbitrary ways. Thus many existing models can be developed “out of the box” using Simbrain’s graphical user interface (GUI). The intention is to create a sandbox for maximally permissive experimentation. In cases where more custom functions are needed, Simbrain provides a powerful scripting interface. In these ways, just about any modern neural network model at the level of abstraction of point neurons can be implemented in Simbrain.
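To make the shared core concrete, here is a minimal sketch of a connectionist point neuron of the kind these components generalize: a weighted sum of presynaptic inputs passed through an interchangeable activation rule. This is an illustrative sketch in plain Java, not Simbrain's actual API.

```java
// Generic discrete-time point neuron: weighted input from incoming
// synapses, passed through an activation rule. Illustrative only;
// these are not Simbrain's class or method names.
public class PointNeuron {

    /** Weighted sum of presynaptic activations. */
    static double netInput(double[] activations, double[] weights) {
        double sum = 0;
        for (int i = 0; i < activations.length; i++) {
            sum += activations[i] * weights[i];
        }
        return sum;
    }

    /** A sigmoidal activation rule, one of many interchangeable rules. */
    static double sigmoid(double netInput) {
        return 1.0 / (1.0 + Math.exp(-netInput));
    }
}
```

Most connectionist models reduce to variations on these two functions, which is why a small set of graphical building blocks covers so many published architectures.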
Simbrain’s ease of use is based on its graphical interface, which allows individual neurons and synapses, as well as groups of neurons and synapses, to be created, edited, and connected using a familiar point-and-click interface, together with a set of keyboard shortcuts that, in our experience, are quickly mastered. One of the goals of Simbrain is to make neural networks as accessible (and manipulable) as possible, in terms of their behavior, dynamics, analysis, and interactions with an environment. Users can modify parameters of running simulations and observe the effects of these interventions in real time.
These features make it easy to use Simbrain in a broad range of educational settings, with students of different ages and levels of expertise. In fact, an informal design goal of the team is to make it possible for children and young adults to build and study neural networks. Simbrain has been used at the university level at the University of Sydney (Australia), LMU Munich (Germany), Indiana University Bloomington (USA), and the University of California, Merced (USA), among others, and the team has been seeking funding to develop educational modules for use in K-12 settings.
However, Simbrain is not exclusively an educational tool. It is also well-suited to research professionals developing more complex models. It has been used by the second author to study the dynamics of small networks embedded in virtual environments (Hotton and Yoshimi, 2011, Yoshimi, 2014), but with the improvements of 3.0 the possibilities have expanded. In as yet unpublished work, the first author has drawn on Simbrain’s enhanced performance capacities and extensibility to study the mechanisms of plasticity that are needed to account for the behavior and structure of cortical microcircuits observed in vitro and in vivo. In this kind of work, instantaneous visualizations of network data can inspire new hypotheses and lines of research, and make users immediately aware of information about a network which might otherwise go unnoticed, due to the often painstaking process of creating useful data visualizations. In fact, a chance observation of changes in synaptic weight distributions (via Simbrain’s histogram plot) was the impetus for the first author’s line of research.
Thus in a research/professional context this rich user feedback promotes a development cycle involving some or all of the following steps: (1) Come up with an idea for a novel ANN architecture. (2) Implement the idea using the Simbrain GUI (and possibly scripting). (3) Refine rapidly using the GUI’s visual feedback. (4) Fine tune the simulation and potentially develop it without the GUI, using Simbrain as a library.
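Step (4) amounts to driving the model from an ordinary program loop rather than from the desktop: the network is updated synchronously and its state is recorded instead of drawn. The sketch below is generic Java with illustrative names, not Simbrain's actual classes.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of step (4): running a model headlessly, library-style,
// with no GUI. Names are illustrative, not Simbrain's API.
public class HeadlessRun {

    /** Run the network for a number of steps, recording each state. */
    static List<double[]> run(double[] state, double[][] weights, int steps) {
        List<double[]> trace = new ArrayList<>();
        for (int t = 0; t < steps; t++) {
            state = update(state, weights);   // one synchronous network update
            trace.add(state);                 // record instead of drawing
        }
        return trace;
    }

    /** One synchronous update of all neurons. */
    static double[] update(double[] state, double[][] w) {
        double[] next = new double[state.length];
        for (int i = 0; i < next.length; i++) {
            double net = 0;
            for (int j = 0; j < state.length; j++) net += w[i][j] * state[j];
            next[i] = Math.tanh(net);         // a simple activation rule
        }
        return next;
    }
}
```

The same update logic runs whether or not a GUI is attached, which is what makes the GUI-to-library transition in step (4) cheap.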
Simbrain is programmed in Java and thus runs on any machine with a Java Virtual Machine (JRE/JDK 8 or higher required). Neural network graphics are based on the Piccolo library (http://www.cs.umd.edu/hcil/piccolo/), a zoomable scene-graph-based library built on top of the Java2D API. The source code is hosted on GitHub (https://github.com/simbrain/simbrain), and efforts have been made to produce a clean, readable codebase.
Simbrain has been in continuous development since the late 1990s. Its original purpose was to provide a visually oriented framework for studying neural networks (few such programs were available at the time). Since then many more neural network packages have emerged, including several that feature rich visualization capacities (Aisa et al., 2008, Bekolay et al., 2013, Gewaltig and Diesmann, 2007, Goodman and Brette, 2008). Two of the most similar, in terms of providing a point-and-click GUI for creating neural networks, are Emergent and Nengo. However, these packages have been designed around specific computational frameworks (the Leabra framework in the case of Emergent; the Neural Engineering Framework in the case of Nengo). Though both frameworks are flexible (e.g. backpropagation is available in Emergent), each focuses on a specific class of neuro-computational models. A primary design goal of Simbrain, by contrast, has been to provide a domain-general framework for creating and combining arbitrary neuron, synapse, and network models. Programs like Brian and NEST can produce detailed visualizations, but are not fundamentally GUI-based programs. With the emergence of HTML5, beautiful visualizations are more readily accessible directly in the browser, prompting some to create web-based interactive neural networks. However, thus far web-based visualizations have been oriented towards single demonstrations rather than general-purpose simulation frameworks.
Simbrain’s basic graphical user interface is a desktop (see Fig. 1) with components (neural networks, virtual environments, graphs, etc.) that can be coupled to one another. Network panels allow users to build networks and watch the changing dynamics of neurons and (in some cases) synapses in real time, alongside corresponding changes in graphs, tables, and virtual environments. The GUI can be disabled (or individual GUI objects hidden) to increase a simulation’s computational speed and efficiency. Simbrain’s model code is separate from its GUI code, so it can be used as a library independently of the GUI.
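The coupling idea can be sketched in a few lines: a producer attribute (say, a neuron's activation) is wired to a consumer attribute (say, a plot series or a virtual-world effector), and a value flows across the wire on each workspace update. The names below are illustrative assumptions, not Simbrain's coupling API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.DoubleConsumer;
import java.util.function.DoubleSupplier;

// Minimal producer/consumer coupling between workspace components.
// Illustrative sketch; not Simbrain's actual coupling classes.
public class Coupling {
    final DoubleSupplier producer;   // e.g. a neuron's activation
    final DoubleConsumer consumer;   // e.g. a plot series or effector

    Coupling(DoubleSupplier p, DoubleConsumer c) {
        producer = p;
        consumer = c;
    }

    /** Push one value from producer to consumer (one workspace update). */
    void update() {
        consumer.accept(producer.getAsDouble());
    }

    /** Demo: couple a constant producer to a list-backed "time series". */
    static List<Double> demo(double value, int steps) {
        List<Double> series = new ArrayList<>();
        Coupling c = new Coupling(() -> value, series::add);
        for (int i = 0; i < steps; i++) c.update();
        return series;
    }
}
```

Because the two sides only share the supplier/consumer contract, any component that exposes numeric attributes can be wired to any other, which is what lets networks, graphs, and virtual environments update in lockstep.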
The shift from Simbrain 2.0 to Simbrain 3.0 focused on scope and scalability/performance. Improved performance made larger-scale simulations possible, which in turn made it possible to add new network types and training algorithms. 3.0 introduces aggregated groups of neurons and synapses (“neuron groups” and “synapse groups”) which provide a convenient way to manipulate many neurons or synapses at once. Simbrain’s data analysis and visualization software has also been improved. Many of the enhancements have focused on supporting more biologically plausible neural network models, allowing for the simulation of structures like cortical microcircuits in Simbrain.
This article overviews the full scope of Simbrain 3.0’s capabilities and features. We first consider its core elements: neurons, synapses, plasticity rules, synaptic transmission rules, neuron groups, synapse groups, and subnetworks. We then consider its performance features, graphical interface design, worlds, and scripting. Examples of applications are given throughout. We end by discussing future directions for Simbrain.
Neurons and synapses
Simbrain supports a wide variety of model neurons (both connectionist and spiking), along with rules for synaptic plasticity and transmission. The code was designed with flexibility and extensibility in mind. Custom extensions are possible and encouraged. The easiest way to customize the code is via scripts (using BeanShell, a form of interpreted Java), though efforts have also been made to make it easy to modify the code directly via the API.
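The extension pattern described here can be sketched as a pluggable update-rule interface, so that adding a new neuron model amounts to adding one small class. Interface and class names below are illustrative, not Simbrain's actual API.

```java
// Neuron behavior as a pluggable rule: a new model is one new class.
// Illustrative names only; not Simbrain's actual interfaces.
public class Rules {

    /** Contract every neuron model satisfies: map net input to activation. */
    interface NeuronUpdateRule {
        double update(double netInput);
    }

    /** Connectionist-style linear rule, clipped to [-1, 1]. */
    static class LinearRule implements NeuronUpdateRule {
        public double update(double netInput) {
            return Math.max(-1.0, Math.min(1.0, netInput));
        }
    }

    /** Threshold ("binary") rule: fires 1 when input exceeds threshold. */
    static class BinaryRule implements NeuronUpdateRule {
        final double threshold;
        BinaryRule(double threshold) { this.threshold = threshold; }
        public double update(double netInput) {
            return netInput > threshold ? 1.0 : 0.0;
        }
    }
}
```

The same pattern applies to synapse models: a plasticity rule is a small class mapping pre- and post-synaptic state to a weight change, and a BeanShell script can supply such a class without recompiling.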
Future directions
A major goal for Simbrain 4.0 will be the introduction of parallel GPU computing (compare Brette & Goodman, 2012), most likely through JOCL (Java Bindings for OpenCL, an open framework for parallel computing) (Java bindings for opencl, 2015). Simbrain’s current performance benchmarks make it ideal for use by neural modelers studying network-level phenomena; the jump to GPU computing will make still larger-scale networks possible in Simbrain on desktop PCs.
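OpenCL kernels are beyond a short sketch, but the data-parallel structure that makes GPU execution attractive can be illustrated with CPU parallel streams: each neuron's net input is independent of the others, so a synchronous update maps cleanly onto parallel hardware. This is a hypothetical CPU analogue, not JOCL code.

```java
import java.util.stream.IntStream;

// Data-parallel network update: each neuron's new activation is computed
// independently, the same structure a GPU kernel would exploit.
// CPU analogue using parallel streams; not JOCL code.
public class ParallelUpdate {

    static double[] update(double[] state, double[][] w) {
        return IntStream.range(0, state.length).parallel()
            .mapToDouble(i -> {
                // Net input to neuron i: row i of the weight matrix.
                double net = 0;
                for (int j = 0; j < state.length; j++) {
                    net += w[i][j] * state[j];
                }
                return Math.tanh(net);   // a simple activation rule
            })
            .toArray();
    }
}
```

A JOCL port would move the per-neuron lambda into an OpenCL kernel and keep weight matrices resident in GPU memory, avoiding per-step host transfers.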
Other plans include more
Conclusion
Simbrain 3.0 is the most recent 10-year iteration of a development process which began over 15 years ago. Design goals have always focused on an intuitive interface and visualization, and an ability to develop arbitrary neural models. It was thus natural in 3.0 to expand the available network, neuron, and synapse models, and to allow for larger scale simulations. Doing this in a way that maintained Simbrain’s versatility and intuitive interface required a series of complex design choices and
Acknowledgments
This work was supported in part by funding from the National Science Foundation’s Integrative Graduate Education and Research Traineeship (IGERT) program at Indiana University Bloomington, Award Number 0903495.
References (48)
- The emergent neural modeling system. Neural Networks (2008)
- Finding structure in time. Cognitive Science (1990)
- Impulses and physiological states in theoretical models of nerve membrane. Biophysical Journal (1961)
- Special issue on echo state networks and liquid state machines. Neural Networks (2007)
- The self-organizing map. Neurocomputing (1998)
- Voltage oscillations in the barnacle giant muscle fiber. Biophysical Journal (1981)
- Improving liquid state machines through iterative refinement of the reservoir. Neurocomputing (2010)
- Evaluating Java performance for linear algebra numerical computations. Procedia Computer Science (2011)
- Java in the high performance computing arena: research, practice and experience. Science of Computer Programming (2013)
- Fast, effective code generation in a just-in-time Java compiler
- Memory consolidation and the medial temporal lobe: a simple network model. Proceedings of the National Academy of Sciences
- Nengo: a Python tool for building large-scale functional brain models. Frontiers in Neuroinformatics
- Vehicles: Experiments in synthetic psychology
- Adaptive exponential integrate-and-fire model as an effective description of neuronal activity. Journal of Neurophysiology
- Simulating spiking neural networks on GPU. Network: Computation in Neural Systems
- NEST (neural simulation tool). Scholarpedia
- Stability versus neuronal specialization for STDP: long-tail weight distributions solve the dilemma. PLoS One
- NeuroML
- Java concurrency in practice
- The Brian simulator. Frontiers in Neuroscience
- The organization of behavior: A neuropsychological theory
- Currents carried by sodium and potassium ions through the membrane of the giant axon of Loligo. The Journal of Physiology
- Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences