How inhibitory oscillations can train neural networks and punish competitors

Neural Comput. 2006 Jul;18(7):1577-610. doi: 10.1162/neco.2006.18.7.1577.

Abstract

We present a new learning algorithm that leverages oscillations in the strength of neural inhibition to train neural networks. Raising inhibition can be used to identify weak parts of target memories, which are then strengthened. Conversely, lowering inhibition can be used to identify competitors, which are then weakened. To update weights, we apply the Contrastive Hebbian Learning equation to successive time steps of the network. The sign of the weight change equation varies as a function of the phase of the inhibitory oscillation. We show that the learning algorithm can memorize large numbers of correlated input patterns without collapsing and that it generalizes well to test patterns that do not exactly match studied patterns.
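The weight update described in the abstract (Contrastive Hebbian Learning applied to two successive time steps, with a sign that depends on the phase of the inhibitory oscillation) might be sketched roughly as follows. This is a minimal illustration, not the paper's actual parameterization: the function name, the +/-1 sign convention, and the learning rate are illustrative assumptions.

    import numpy as np

    def chl_phase_update(w, acts_prev, acts_curr, phase_sign, lrate=0.01):
        """Contrastive-Hebbian-style update over two successive time steps.

        acts_prev, acts_curr : activation vectors at time t and t+1
        phase_sign : +1 during the phase that strengthens weak target
                     features, -1 during the phase that weakens competitors
                     (this +/-1 convention is an illustrative assumption)
        """
        # Co-activity at t+1 minus co-activity at t; the oscillation phase
        # sets the sign of the resulting weight change.
        delta_w = phase_sign * (np.outer(acts_curr, acts_curr)
                                - np.outer(acts_prev, acts_prev))
        return w + lrate * delta_w

    # Example: two successive activation snapshots from a 5-unit layer
    w = np.zeros((5, 5))
    a_t  = np.array([0.0, 0.8, 0.1, 0.9, 0.0])
    a_t1 = np.array([0.2, 0.9, 0.0, 0.9, 0.1])
    w = chl_phase_update(w, a_t, a_t1, phase_sign=+1)

In this sketch, units whose activity grows between the two time steps (e.g., weak target features recovering as inhibition falls back) have their connections strengthened when phase_sign is +1, while the same comparison with phase_sign set to -1 weakens connections among units that become more active, consistent with punishing competitors.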

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Algorithms
  • Animals
  • Computer Simulation
  • Humans
  • Learning / physiology*
  • Models, Neurological
  • Nerve Net / physiology*
  • Neural Inhibition / physiology*
  • Neural Networks, Computer*
  • Periodicity*
  • Punishment*
  • Time Factors