Information theory and neural coding

Nat Neurosci. 1999 Nov;2(11):947-57. doi: 10.1038/14731.

Abstract

Information theory quantifies how much information a neural response carries about the stimulus. This can be compared to the information transferred in particular models of the stimulus-response function and to the maximum possible information transfer. Such comparisons are crucial because they validate assumptions present in any neurophysiological analysis. Here we review the basics of information theory before demonstrating its use in the analysis of neural coding. We show how to use information theory to validate simple stimulus-response models of neural coding of dynamic stimuli. Because these models require specification of spike-timing precision, they can reveal which time scales contain information in neural coding. This approach shows that dynamic stimuli can be encoded efficiently by single neurons and that each spike contributes to information transmission. We argue, however, that the data obtained so far do not suggest a temporal code, in which the placement of spikes relative to each other yields additional information.
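
The central quantity behind these comparisons is the mutual information I(S;R) = H(R) - H(R|S) between stimulus and response, where the response entropy H(R) bounds the maximum possible information transfer. As a rough illustration only (not the procedure used in this review), the Python sketch below gives a plug-in estimate of the information rate carried by spike trains across repeated presentations of a dynamic stimulus; the function names, the binary-word representation, and the bin width dt (the assumed spike-timing precision) are illustrative choices, and real analyses must also correct for limited-sampling bias.

    import numpy as np

    def word_entropy(words):
        # Shannon entropy (bits) of the empirical distribution of spike "words".
        _, counts = np.unique(words, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def info_rate(trials, dt, word_len):
        # Plug-in estimate of the mutual information rate (bits/s) between a
        # repeated dynamic stimulus and spike trains discretized into 0/1 bins
        # of width dt (the assumed spike-timing precision).
        #   trials   : (n_trials, n_bins) array of binary spike counts
        #   word_len : number of consecutive bins per word
        n_trials, n_bins = trials.shape
        n_words = n_bins - word_len + 1
        # Total entropy: pool words over all trials and all times.
        all_words = np.array([trials[i, t:t + word_len]
                              for i in range(n_trials)
                              for t in range(n_words)])
        h_total = word_entropy(all_words)
        # Noise entropy: variability across trials at each moment of the
        # stimulus, averaged over time.
        h_noise = np.mean([word_entropy(trials[:, t:t + word_len])
                           for t in range(n_words)])
        # Information per word, converted to bits per second.
        return (h_total - h_noise) / (word_len * dt)

    # Example: 20 trials of stimulus-independent spikes should carry ~0 bits/s
    # (apart from an upward sampling bias that real analyses must correct for).
    rng = np.random.default_rng(0)
    trials = (rng.random((20, 1000)) < 0.05).astype(int)
    print(round(info_rate(trials, dt=0.002, word_len=5), 2), "bits/s")

Shrinking dt in such an estimate probes ever finer time scales: if the estimated rate stops growing below some bin width, timing precision finer than that bin carries no additional information.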

Publication types

  • Review

MeSH terms

  • Algorithms
  • Animals
  • Entropy
  • Humans
  • Information Theory*
  • Linear Models
  • Nerve Net / physiology*
  • Neurons / physiology
  • Probability
  • Reproducibility of Results