# Estimating Independence

### Kurtosis

Kurtosis is the classical method of measuring
nongaussianity. When the data have been preprocessed to have unit variance,
kurtosis is essentially the fourth moment of the data (the usual definition
subtracts 3 so that a gaussian signal has zero kurtosis). Intuitively,
kurtosis measures the "spikiness" of a distribution, or the size of its tails.
Kurtosis is extremely simple to calculate; however, it is very sensitive to
outliers in the data set. Its value may be determined by only a few
observations in the tails, which means that its statistical significance is
poor. For this reason, kurtosis is not robust enough for ICA.
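
As a concrete illustration (a minimal numpy sketch, not part of the original text), the following computes kurtosis for standardized data and shows how a single outlier can dominate the estimate; the distributions and sample sizes are arbitrary choices:

```python
import numpy as np

def kurtosis(y):
    """Excess kurtosis of a signal: E{y^4} - 3 after standardization."""
    y = (y - y.mean()) / y.std()        # preprocess: zero mean, unit variance
    return np.mean(y ** 4) - 3.0

rng = np.random.default_rng(0)
gauss = rng.standard_normal(10_000)     # gaussian: kurtosis near 0
uniform = rng.uniform(-1, 1, 10_000)    # sub-gaussian: negative kurtosis
laplace = rng.laplace(size=10_000)      # super-gaussian: positive kurtosis

# Sensitivity to outliers: a single corrupted sample swamps the estimate,
# because it enters the average raised to the fourth power.
corrupted = np.append(gauss, 50.0)
```

The corrupted signal, gaussian except for one bad sample, produces a kurtosis value orders of magnitude larger than the clean signal's, which is exactly the fragility described above.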

### Negentropy

The entropy of a discrete signal is the negative of the sum, over all possible
values, of the probability of each value times the log of that probability.
For a continuous signal, an analogous quantity called differential entropy is
obtained by replacing the sum with an integral of the density times the log of
the density. Negentropy is simply the differential entropy of a gaussian
signal with the same covariance as a signal y, minus the differential entropy
of y itself. Because the gaussian has the largest entropy of any signal with a
given covariance, negentropy is always nonnegative, and it is zero only if the
signal is purely gaussian. Negentropy is a statistically well-justified
measure, but it is difficult to calculate directly.
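
To make the discrete definition concrete, here is a small sketch (not from the original text) that computes the entropy of a probability vector, using base-2 logs so the result is in bits:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H = -sum_i p_i * log2(p_i) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # the 0 * log 0 terms are taken as 0
    return -np.sum(p * np.log2(p))

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit, the two-outcome maximum
print(entropy([0.9, 0.1]))   # biased coin: lower entropy, about 0.47 bits
```

The more predictable the signal, the lower its entropy, which is what makes entropy-based measures of gaussianity attractive: among unit-variance signals, the gaussian is the least predictable.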

### Negentropy Approximation

Negentropy can be approximated in a way that is computationally less expensive
than calculating it directly, yet still more robust than kurtosis. The
following equation approximates negentropy:

$$J(y) \approx \sum_{i=1}^{p} k_i \left[ E\{G_i(y)\} - E\{G_i(v)\} \right]^2$$

In this equation, G(x) is some nonquadratic function, v is a gaussian variable with unit
variance and zero mean, and k_{i} is some constant value. If G(x) = x^{4},
this equation reduces to (the square of) kurtosis. There are choices of G
that give a good approximation to negentropy while being less sensitive to
outliers than kurtosis. Two commonly used functions are:

$$G_1(u) = \frac{1}{a_1} \log \cosh(a_1 u), \qquad 1 \le a_1 \le 2$$

$$G_2(u) = -\exp(-u^2 / 2)$$

These are called contrast functions.
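
A minimal numpy sketch of this approximation (my own illustration, not from the original text): it uses a single term, the log-cosh contrast function, and drops the constant k, since only relative values matter when comparing signals. The Monte Carlo estimate of E{G(v)} and all sample sizes are arbitrary assumptions:

```python
import numpy as np

def negentropy_approx(y, a1=1.0):
    """One-term negentropy approximation: [E{G(y)} - E{G(v)}]^2
    with contrast function G(u) = (1/a1) * log(cosh(a1 * u))."""
    y = (y - y.mean()) / y.std()            # standardize: zero mean, unit variance
    G = lambda u: np.log(np.cosh(a1 * u)) / a1
    rng = np.random.default_rng(1)
    v = rng.standard_normal(200_000)        # gaussian reference for E{G(v)}
    return (G(y).mean() - G(v).mean()) ** 2

rng = np.random.default_rng(0)
gauss = rng.standard_normal(100_000)        # gaussian: value near zero
uniform = rng.uniform(-1.0, 1.0, 100_000)   # nongaussian: clearly larger value
```

Because only the difference from the gaussian reference matters, the function returns (up to sampling noise) zero for gaussian input and grows for nongaussian signals, whether sub- or super-gaussian, since the difference is squared.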
