Neurocomputing, Volume 122, 25 December 2013, Pages 324-329

Computational capability of liquid state machines with spike-timing-dependent plasticity

https://doi.org/10.1016/j.neucom.2013.06.019

Abstract

The liquid state machine (LSM) is a recently developed computational model whose reservoir consists of a recurrent spiking neural network (RSNN). This model has been shown to be effective for performing computational tasks. In this paper, we present a novel type of LSM with a self-organized RSNN instead of the traditional RSNN with random structure. Here, spike-timing-dependent plasticity (STDP), which has been broadly observed in neurophysiological experiments, is employed to update the synaptic weights of the RSNN. Our computational results show that this model can carry out a class of biologically relevant real-time computational tasks with high accuracy. By evaluating the average mean squared error (MSE), we find that the LSM with STDP learning achieves better performance than the LSM with a random reservoir, especially in the case of partial synaptic connections.

Introduction

The liquid state machine (LSM) is a recently developed neural network model that has gained increasing attention during the past decade [1], [2], [3]. Proposed by Maass et al. [1], the LSM is a computational model capable of universal real-time computation [2], [3]. The model has three main parts: an input component (IC), a liquid component (LC), and a readout component (RC) [1]. The LC acts as a medium through which the input is expressed in a higher-dimensional form called the liquid state. For different tasks, the liquid states can be transformed into different outputs through the RC [4]. Typically, neural microcircuits are taken as implementations of the LC; they receive one or several input spike trains and project the input into a higher-dimensional space. Unlike most traditional recurrent neural networks (RNNs), in which all synaptic conductances are updated online, the synaptic weights of the recurrent connections in the LC are usually chosen randomly and kept fixed during training; only the weights from neurons in the LC to neurons in the RC are trained by a learning algorithm according to the specific task [1], [3]. This kind of RNN design, which also includes the echo state networks proposed by Herbert Jaeger [5], is often referred to as reservoir computing. Many studies have shown that such models can perform various computational tasks with high accuracy and low computational cost [1], [3], [5].
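
To make the reservoir-computing principle concrete, here is a minimal, self-contained sketch in Python. It is a simplified rate-based (echo-state style) reservoir rather than the spiking CSIM model used in the paper; the reservoir size, spectral radius, regularization, and toy delay task are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir (echo-state style); only the readout is trained.
N, T = 200, 1000                              # reservoir size, time steps
W = rng.normal(0, 1, (N, N)) / np.sqrt(N)     # recurrent weights, kept fixed
W *= 0.9 / max(abs(np.linalg.eigvals(W)))     # scale spectral radius below 1
w_in = rng.normal(0, 1, N)                    # input weights, kept fixed

u = rng.uniform(-1, 1, T)                     # scalar input stream
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])          # reservoir state update
    states[t] = x

y_target = np.roll(u, 3)                      # toy task: 3-step delayed input

# Only the linear readout is trained, here by ridge regression.
lam = 1e-4
w_out = np.linalg.solve(states.T @ states + lam * np.eye(N),
                        states.T @ y_target)
mse = np.mean((states @ w_out - y_target) ** 2)
print(f"readout MSE on the toy task: {mse:.4f}")
```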

Since the neural network is the essential part of the LC implementation, its synaptic connectivity or network structure is closely related to the computational capability of the LSM. In the modeling of neural networks, various topologies have been investigated, such as globally coupled networks [6], small-world networks [7], [8], and scale-free networks [9]. In contrast to these predefined networks, self-organized neural networks [10], [11], [12], [13] are more biologically plausible. Self-organization is usually achieved through spike-timing-dependent plasticity (STDP), a form of long-term synaptic plasticity that has been both experimentally observed [14] and theoretically studied [15], [16]. It has been broadly found in many neocortical layers and brain regions [17], [18], [19]. In [20], bidirectional and unidirectional connections developed from STDP learning were shown to reflect different neural codes. In effect, STDP captures the possible causal relationship between each pair of pre- and postsynaptic neurons [21].
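
As an illustration of the learning rule itself, the following is a minimal pair-based STDP update: a synapse is potentiated when the presynaptic spike precedes the postsynaptic one (a causal pair) and depressed otherwise. The amplitudes and time constants are generic illustrative values, not those used in the paper.

```python
import numpy as np

# Pair-based STDP parameters (illustrative, not from the paper).
A_plus, A_minus = 0.01, 0.012     # potentiation / depression amplitudes
tau_plus = tau_minus = 20.0       # time constants in ms

def stdp_dw(t_pre, t_post):
    """Weight change for one pre/post spike pair at times t_pre, t_post (ms)."""
    dt = t_post - t_pre
    if dt > 0:   # pre fires before post: causal pair, potentiation
        return A_plus * np.exp(-dt / tau_plus)
    else:        # post fires before (or with) pre: depression
        return -A_minus * np.exp(dt / tau_minus)

print(stdp_dw(10.0, 15.0))   # causal pair, positive weight change
print(stdp_dw(15.0, 10.0))   # anti-causal pair, negative weight change
```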

Recently, LSM with STDP learning has been studied. In [22], STDP is used to train readout neurons so that a single neuron can recognize the state of the LC according to the input signal. Computational capability can also be improved by combining STDP and intrinsic plasticity (IP) to reshape the network structure [23]. In addition, Hebbian learning and STDP can be applied in the LC to improve the separation property when the LSM is used to process real-world speech data [24]. However, these studies did not consider the relationship between the intrinsic neural dynamics and the synaptic plasticity of the network. As a result, the network obtained from STDP learning is largely determined by the initial conditions and cannot fully exploit the internal dynamical states of the neuronal population.

In [21], [25], the authors proposed a novel neural network with an active-neuron-dominant structure, which is self-organized via the STDP learning rule. In this model, strong connections are mainly distributed among the outward links of a few highly active neurons. Moreover, a recent experimental study [26] found that a small population of highly active neurons may dominate the firing in neocortical networks, suggesting the existence of active-neuron-dominant connectivity in the neocortex. Such a synapse distribution has been shown to enhance information transmission in neural circuits [21], [25].
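
To illustrate what such a topology looks like, the sketch below builds a weight matrix in which strong weights are concentrated on the outgoing links of the most active neurons. The activity distribution, the number of dominant neurons, and the two weight levels are assumptions for illustration only, not values taken from [21], [25].

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100
activity = rng.gamma(2.0, 1.0, n)        # stand-in for intrinsic excitability
order = np.argsort(activity)[::-1]       # most active neurons first

# Strong weights on the outward links of the k most active neurons
# (k, w_strong, w_weak are illustrative placeholders).
k, w_strong, w_weak = 10, 1.0, 0.05
W = np.full((n, n), w_weak)              # rows = presynaptic (outward links)
W[order[:k], :] = w_strong
np.fill_diagonal(W, 0.0)                 # no self-connections

print(f"mean outgoing weight, top-{k} active neurons: {W[order[:k]].mean():.3f}")
print(f"mean outgoing weight, remaining neurons:      {W[order[k:]].mean():.3f}")
```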

Based on these previous studies, here we apply the active-neuron-dominant structure presented in [21], [25] to the neural network of the LC and investigate its computational capability. In this model, the neurons in the LC are equally divided into four groups. Synaptic connections within each group are active-neuron-dominant, developed by the STDP rule before the readout weights are trained for the computational tasks. The computational capability of the LSM under different LC structures is examined and evaluated by the average MSE. Our results show that the LSM with STDP performs better than the LSM with a random LC on these computational tasks, which indicates the significant influence of STDP learning on the computational capability of the LSM. In our model, the internal dynamics of neurons with different degrees of excitability are extracted into an active-neuron-dominant topology by the STDP process, which promotes synchronous spiking behavior in the LC network; it is this highly synchronous network activity that underlies the improved computational performance of the LSM system.


Network architecture

In our model, the 400 Izhikevich neurons of the LC are divided equally into four independent groups (see Fig. 1). Before the computation of specific tasks, the synapses between neurons in each group are first updated by the STDP learning rule in order to obtain the active-neuron-dominant network structure described below. The IC includes four independent input streams, each consisting of eight spike trains generated by Poisson processes with randomly varying rates r_i(t), i = 1, ..., 4 (more details are given below).
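
As a rough illustration of these two ingredients, the sketch below drives a single Izhikevich neuron with a Poisson spike train whose rate varies randomly over time. The rate range, synaptic strength, and time step are assumptions; the neuron parameters are the standard regular-spiking values of the Izhikevich model, not necessarily those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.5                                   # simulation step in ms
T = 2000                                   # number of steps (1 s total)

# Poisson input with a piecewise-constant, randomly varying rate (assumed).
rate = np.clip(20 + 10 * rng.standard_normal(T // 200).repeat(200), 0, None)
spikes_in = rng.random(T) < rate * dt / 1000.0   # spike per step with p = r*dt

# Izhikevich neuron, standard regular-spiking parameters.
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v, u = c, b * c
w_syn = 8.0                                # input synaptic strength (assumed)
out_spikes = []
for t in range(T):
    I = w_syn if spikes_in[t] else 0.0
    v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                          # spike threshold: reset v and u
        out_spikes.append(t * dt)
        v, u = c, u + d

print(f"input spikes: {spikes_in.sum()}, output spikes: {len(out_spikes)}")
```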

Computational capability of LSM with STDP

To investigate the computational capability of the LSM with the active-neuron-dominant LC structure, we examined a class of biologically relevant real-time computational tasks. These tasks are designed to test the ability of the LSM to approximate a given target signal with high accuracy. All simulations are carried out with the software package CSIM [28].
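
The evaluation metric itself is simple: the MSE between the readout output and the target signal, averaged over the set of tasks. A minimal sketch, with hypothetical toy outputs and targets, might look as follows.

```python
import numpy as np

def average_mse(outputs, targets):
    """Average MSE across tasks; each task contributes one (output, target)
    pair of equal-length 1-D arrays."""
    return np.mean([np.mean((o - t) ** 2) for o, t in zip(outputs, targets)])

# Toy usage with two hypothetical tasks:
t1 = np.sin(np.linspace(0, 6, 100))
t2 = np.cos(np.linspace(0, 6, 100))
o1 = t1 + 0.05 * np.random.default_rng(2).standard_normal(100)
o2 = 0.9 * t2
print(f"average MSE: {average_mse([o1, o2], [t1, t2]):.4f}")
```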

Here, four input streams are generated randomly by Poisson processes, each consisting of eight spike trains.

Effect of connection styles between groups in LC

In the model of this paper, there are four groups in the LC, each of which has 100 neurons and receives one input stream individually. Here, we examined the influence of the connection styles between these four groups on the computational performance. Two versions of the LSM with different connection types from the input circuits to the LC have been investigated. One is the full connection type, in which the spike trains of each input stream are applied to all of the neurons in the corresponding LC group (called STDP-full); the other is a partial connection type, in which each input stream drives only part of the neurons in its group.
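
One simple way to express the two wiring styles is as a Boolean connection mask from the eight spike trains of an input stream to the 100 neurons of its LC group. In this sketch the partial-connection probability p is an assumed placeholder, since the exact fraction is not given in this excerpt.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trains, n_group = 8, 100     # spike trains per input, neurons per LC group

def input_mask(style, p=0.3):
    """mask[i, j] = True if spike train i drives neuron j of the group.

    'full'    -- every train reaches every neuron of its group;
    'partial' -- each train reaches a random fraction p of the neurons
                 (p = 0.3 is an illustrative placeholder).
    """
    if style == "full":
        return np.ones((n_trains, n_group), dtype=bool)
    return rng.random((n_trains, n_group)) < p

full, partial = input_mask("full"), input_mask("partial")
print(f"connection density: full = {full.mean():.2f}, "
      f"partial = {partial.mean():.2f}")
```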

Comparisons of computational capability between STDP and random network

In this section, we compare the computational capability of LSMs whose synaptic weights in the LC are either updated by STDP or generated randomly. We examined different synaptic weights from the IC to the LC and present the results in Fig. 7. The average MSE for the LSM with STDP is much smaller than that of the random one over the whole range of synaptic weights. This demonstrates that the STDP refinement of the synaptic connectivity in the LC is beneficial for improving the computational performance of the LSM.

Conclusions

In this paper, a novel type of LSM is obtained via STDP learning. Owing to the self-organization of the synaptic connections in the LC, this LSM has better computational capability for real-time computations on complex input streams than the traditional LSM with a random LC. The internal dynamics of neurons with different degrees of excitability are extracted into an active-neuron-dominant topology by the STDP process, which contributes to the synchronous spiking behavior of the LC network.

References (28)

  • W. Maass, Liquid computing, in: Computation and Logic in the Real World, Springer-Verlag, Berlin, Heidelberg, 2007.
  • D.H. Zanette et al., Mutual synchronization in ensembles of globally coupled neural networks, Phys. Rev. E (1998).
  • L. Morelli et al., Associative memory on a small-world neural network, Eur. Phys. J. B—Condens. Matter Complex Syst. (2004).
  • D. Stauffer et al., Efficient Hopfield pattern recognition on a scale-free neural network, Eur. Phys. J. B—Condens. Matter Complex Syst. (2003).

Fangzheng Xue received his Bachelor's, Master's, and PhD degrees in 1999, 2003, and 2005, respectively, from Northeastern University, China. Currently, he is an Associate Professor in the College of Automation, Chongqing University, Chongqing, China. His research interests include computational neuroscience, bio-inspired robotics, and intelligent control.

Zhicheng Hou received his Bachelor's degree in 2010 from the College of Automation, Chongqing University, Chongqing, China. Currently, he is a master's student in the College of Automation, Chongqing University, Chongqing, China. His research interests include computational neuroscience and bio-inspired robotics.

Xiumin Li received the Bachelor's degree in Automation from Taiyuan University of Technology, China, in 2005, the Master's degree in Automation from Tianjin University, China, in 2007, and the PhD degree from Hong Kong Polytechnic University, Hong Kong, in 2011. She is currently a full-time Lecturer in the College of Automation, Chongqing University, Chongqing, China. Her research interests include computational neuroscience, neurodynamic analysis, neural network modeling, and their implications for intelligent control.

This work was supported by the Major State Basic Research Development Program 973 (No. 2012CB215202), the National Natural Science Foundation of China (No. 60905053), the Chongqing Natural Science Foundation (No. 2011BB0081), and the Fundamental Research Funds for the Central Universities (No. CDJZR11170006).
