Entropy and information provide natural measures of correlation among elements in a network. We construct here the information-theoretic analog of connected correlation functions: irreducible N-point correlation is measured by the decrease in entropy of the joint distribution of N variables relative to the maximum entropy allowed by all the observed (N-1)-variable distributions. We calculate these "connected information" terms for several examples and show that the same construction also decomposes the information that a population of elements carries about an outside source.
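As a minimal illustration of the idea (not taken from the paper itself), consider three binary variables related by XOR: every pairwise marginal is uniform, so all of the correlation is irreducibly third order. The sketch below computes the maximum-entropy distribution consistent with the pairwise marginals via iterative proportional fitting (one standard choice; the paper's own numerical method is not assumed) and measures the connected information of order 3 as the resulting entropy drop.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a joint distribution given as an array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def maxent_given_pairwise(p, iters=200):
    """Iterative proportional fitting: the maximum-entropy joint distribution
    over 3 variables whose pairwise marginals match those of p."""
    q = np.full_like(p, 1.0 / p.size)
    for _ in range(iters):
        for keep in [(0, 1), (0, 2), (1, 2)]:
            drop = [a for a in range(3) if a not in keep][0]
            target = p.sum(axis=drop)
            current = q.sum(axis=drop)
            ratio = np.divide(target, current,
                              out=np.zeros_like(target), where=current > 0)
            q = q * np.expand_dims(ratio, axis=drop)
    return q

# XOR triplet: z = x XOR y with x, y fair coins.
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p[x, y, x ^ y] = 0.25

p2 = maxent_given_pairwise(p)   # here: the uniform distribution (3 bits)
i3 = entropy(p2) - entropy(p)   # connected information of order 3
print(i3)                       # 1.0 bit: the XOR correlation is purely three-point
```

Because the pairwise marginals of the XOR triplet impose no constraint beyond uniformity, the fitted distribution has 3 bits of entropy while the true joint has 2, giving exactly 1 bit of irreducible three-point correlation.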