Dependence, Correlation and Gaussianity in Independent Component Analysis

Jean-François Cardoso; 4(Dec):1177-1203, 2003.

Abstract

Independent component analysis (ICA) is the decomposition of a random vector into linear components that are "as independent as possible." Here, "independence" should be understood in its strong statistical sense: it goes beyond (second-order) decorrelation and thus involves the non-Gaussianity of the data. The ideal measure of independence is mutual information, which is known to be related to the entropy of the components when the search is restricted to uncorrelated components. This paper explores the connections between mutual information, entropy and non-Gaussianity in a larger framework, without resorting to a somewhat arbitrary decorrelation constraint. A key result is that, under linear transforms, the mutual information decomposes as the sum of two terms: one expressing the decorrelation of the components and one expressing their non-Gaussianity.
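To fix ideas, the decomposition can be sketched in standard notation (ours, not necessarily the paper's): write the mutual information of $Y = (Y_1, \dots, Y_n)$ as $I(Y) = \sum_i H(Y_i) - H(Y)$ and the negentropy as $J(Y) = H(Y_G) - H(Y)$, where $Y_G$ is a Gaussian vector with the same covariance $\Sigma$ as $Y$. Then

\[
I(Y) \;=\; C(Y) \;+\; J(Y) \;-\; \sum_i J(Y_i),
\qquad
C(Y) \;=\; \frac{1}{2}\,\log\frac{\prod_i \Sigma_{ii}}{\det\Sigma}.
\]

Here $C(Y)$ is the mutual information of the Gaussian $Y_G$, measuring correlation alone, and since $J(Y)$ is invariant under invertible linear transforms, minimizing $I(Y)$ over such transforms amounts to trading the correlation term $C(Y)$ against the marginal non-Gaussianities $\sum_i J(Y_i)$.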

Our results extend the previous understanding of these connections and explain them in the light of information geometry. We also describe the "local geometry" of ICA by re-expressing all our results via a Gram-Charlier expansion, through which all quantities of interest are obtained in terms of cumulants.
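As a familiar instance of such a cumulant-based picture (the standard near-Gaussian approximation, not necessarily the exact expansion used in the paper), a Gram-Charlier expansion around the Gaussian gives, for a standardized scalar component $y_i$,

\[
J(y_i) \;\approx\; \frac{1}{12}\,\kappa_3(y_i)^2 \;+\; \frac{1}{48}\,\kappa_4(y_i)^2,
\]

where $\kappa_3$ and $\kappa_4$ are the third and fourth cumulants (skewness and kurtosis), so that near Gaussianity the non-Gaussianity terms above reduce to simple cumulant contrasts.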
