A unifying information-theoretic framework for independent component analysis

https://doi.org/10.1016/S0898-1221(00)00101-2

Abstract

We show that different theories recently proposed for independent component analysis (ICA) lead to the same iterative learning algorithm for blind separation of mixed independent sources. We review these theories and suggest that information theory can be used to unify several lines of research. Pearlmutter and Parra [1] and Cardoso [2] showed that the infomax approach of Bell and Sejnowski [3] and the maximum likelihood estimation approach are equivalent. We show that negentropy maximization is likewise equivalent, so that all three approaches yield the same learning rule for a fixed nonlinearity. Girolami and Fyfe [4] have shown that the nonlinear principal component analysis (PCA) algorithm of Karhunen and Joutsensalo [5] and Oja [6] can also be viewed in information-theoretic terms, since it minimizes the sum of squares of the fourth-order marginal cumulants and therefore approximately minimizes the mutual information [7]. Lambert [8] has proposed different Bussgang cost functions for multichannel blind deconvolution; we show how the Bussgang property relates to the infomax principle. Finally, we discuss convergence and stability, as well as future research issues in blind source separation.
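To make the shared rule concrete: the following is a minimal sketch, assuming the natural-gradient form ΔW = η (I − φ(u) uᵀ) W with u = W x and a fixed tanh score function φ(u) (a standard choice for super-Gaussian sources). The function name infomax_ica, the toy mixing matrix, and all hyperparameters are illustrative, not taken from the paper.

```python
import numpy as np

def infomax_ica(X, lr=0.1, n_iter=500):
    """Sketch of the shared natural-gradient ICA update (hypothetical helper).

    X : (n_sources, n_samples) array of zero-mean mixed observations.
    Returns an unmixing matrix W such that u = W @ X holds approximately
    independent components.
    """
    n, T = X.shape
    W = np.eye(n)                       # start from the identity unmixing matrix
    for _ in range(n_iter):
        u = W @ X                       # current source estimates
        phi = np.tanh(u)                # assumed score function (super-Gaussian sources)
        # Natural-gradient rule common to infomax, ML, and negentropy maximization:
        #   delta W = lr * (I - E[phi(u) u^T]) W
        W += lr * (np.eye(n) - (phi @ u.T) / T) @ W
    return W

# Toy usage: recover two independent Laplacian (super-Gaussian) sources.
rng = np.random.default_rng(0)
S = rng.laplace(size=(2, 5000))         # unknown independent sources
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])              # unknown mixing matrix
X = A @ S
X -= X.mean(axis=1, keepdims=True)      # center the observations
W = infomax_ica(X)
print(np.round(W @ A, 2))               # ~ scaled permutation of the identity
```

At a fixed point the update enforces E[φ(u) uᵀ] = I, a nonlinear decorrelation condition that loosely mirrors the Bussgang property mentioned in the abstract.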

Keywords

Blind source separation
ICA
Entropy
Information maximization
Maximum likelihood estimation

Acknowledgements

Lee was supported by the Office of Naval Research. Girolami was supported by a grant from NCR Financial Systems (Ltd), Knowledge Laboratory, Advanced Technology Development Division, Dundee, Scotland. Bell and Sejnowski were supported by the Howard Hughes Medical Institute and the Office of Naval Research.