Random variables A and B are independent if the conditional probability of A with respect to B is just the probability of A. In other words, knowing the value of B tells us nothing about A. This can be expressed as the equation:
P(A|B) = P(A)
Since P(A|B) = P(A,B)/P(B), where P(A,B) is the joint density function of A and B, independence is equivalent to
P(A,B) = P(A)*P(B)
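This factorization can be checked empirically. The sketch below (a hypothetical example using two simulated dice; the specific events A=3 and B=5 are arbitrary choices) estimates P(A), P(B), and the joint P(A,B) from samples and compares the joint to the product:

```python
import random

random.seed(0)

# Two independent dice rolls, sampled many times.
n = 200_000
a = [random.randint(1, 6) for _ in range(n)]
b = [random.randint(1, 6) for _ in range(n)]

# Empirical probabilities of two arbitrary events.
p_a = sum(x == 3 for x in a) / n                          # P(A=3)
p_b = sum(y == 5 for y in b) / n                          # P(B=5)
p_ab = sum(x == 3 and y == 5 for x, y in zip(a, b)) / n   # P(A=3, B=5)

# For independent variables the joint probability factors into the product.
print(abs(p_ab - p_a * p_b))  # close to zero
```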
Another important feature of statistical independence is
mean(g1(A)g2(B)) = mean(g1(A)) mean(g2(B))
for any functions g1 and g2, provided A and B are independent.
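The same factorization can be verified numerically. This sketch (my own illustration; the particular choices of g1 and g2 are arbitrary) compares mean(g1(A)g2(B)) against mean(g1(A))*mean(g2(B)) for independent samples:

```python
import random

random.seed(1)

n = 200_000
a = [random.uniform(-1, 1) for _ in range(n)]
b = [random.uniform(-1, 1) for _ in range(n)]

def g1(x):
    return x ** 2        # an arbitrary function of A

def g2(x):
    return x ** 3 + x    # an arbitrary function of B

lhs = sum(g1(x) * g2(y) for x, y in zip(a, b)) / n
rhs = (sum(g1(x) for x in a) / n) * (sum(g2(y) for y in b) / n)

# Because A and B are independent, the two sides agree up to sampling error.
print(abs(lhs - rhs))  # close to zero
```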
We can also look at the covariance between A and B.
cov(A,B) = mean(A*B) - mean(A)*mean(B)
For a random variable A, cov(A,A) is equal to the variance of A. If A and B are independent, then
mean(A*B) = mean(A)*mean(B)
and the covariance will be zero. Thus the covariance of two statistically independent variables is always zero, but the converse does not hold in general: zero covariance does not mean A and B are independent. In the special case of jointly Gaussian variables, however, zero covariance does imply independence. This property of Gaussian variables is exploited when finding the rows of W in WX=S.
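A standard counterexample makes the one-way implication concrete: take B = A^2 for A uniform on (-1, 1). B is completely determined by A, yet their covariance is zero. A minimal sketch (my own illustration):

```python
import random

random.seed(2)

def cov(xs, ys):
    # Sample covariance: mean(X*Y) - mean(X)*mean(Y).
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

n = 200_000
a = [random.uniform(-1, 1) for _ in range(n)]
b = [x ** 2 for x in a]   # B is a deterministic function of A: fully dependent

# Zero covariance despite total dependence, because mean(A^3) = 0 here.
print(cov(a, b))  # close to zero
```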
Previously we stated that each measured signal in X is a linear combination of the independent signals in S. The mixing matrix A is assumed to be invertible, with A^-1 = W. Each of the independent components in S can therefore be expressed as a linear combination of the measured signals in X (S=WX).
The Central Limit Theorem states that the sum of several independent random variables, such as those in S, tends toward a Gaussian distribution. So xi = a1*s1 + a2*s2 is more Gaussian than either s1 or s2. For example, the sum of a pair of dice approximates a Gaussian distribution with a mean of seven.
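The dice example from the text can be simulated directly: the sum of two independent dice has a triangular, bell-like distribution peaking at seven, already closer to Gaussian than a single (uniform) die.

```python
import random
from collections import Counter

random.seed(3)

n = 200_000
# Sum of a pair of dice, repeated many times.
sums = [random.randint(1, 6) + random.randint(1, 6) for _ in range(n)]

mean = sum(sums) / n
counts = Counter(sums)

print(round(mean, 2))         # close to 7
print(counts[7] > counts[2])  # the distribution peaks at 7, not at the extremes
```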
The Central Limit Theorem implies that if we can find a combination of the measured signals in X that is minimally Gaussian, then that combination will be one of the independent signals. Once W is determined, it is a simple matter to invert it to find A.
In order to find this signal, we need some way to measure the nongaussianity of wX, where w is a candidate row of W. There are several ways to do this.
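One classic such measure (the source says only that several exist; kurtosis is an assumption here, chosen because it is simple) is excess kurtosis, which is zero for a Gaussian, negative for flatter (sub-Gaussian) distributions, and positive for peakier ones. A minimal sketch:

```python
import random

random.seed(4)

def excess_kurtosis(xs):
    # Fourth standardized moment minus 3; zero for a Gaussian.
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / var ** 2 - 3.0

n = 200_000
gauss = [random.gauss(0, 1) for _ in range(n)]
unif = [random.uniform(-1, 1) for _ in range(n)]  # sub-Gaussian source

print(round(excess_kurtosis(gauss), 2))  # near 0
print(round(excess_kurtosis(unif), 2))   # near -1.2: far from Gaussian
```

A nongaussianity-seeking algorithm would adjust w to push |excess kurtosis| of wX as far from zero as possible.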