where $s$ is a vector of non-degenerate, statistically independent one-dimensional random variables and $A$ is a matrix of full column rank. The linear structure is said to be essentially unique if all linear decompositions are equivalent, in the sense that if the vector variable $x$ admits two structural representations $x = A_1 s_1$ and $x = A_2 s_2$, then $A_2$ differs from $A_1$ only by a permutation and non-zero rescaling of its columns, with the elements of $s_2$ correspondingly permuted and rescaled.
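This equivalence class can be made concrete numerically. The short sketch below is an illustration of my own (the specific matrices and the permutation/scaling construction are not from the text): permuting and rescaling the columns of the mixing matrix, while applying the inverse operation to the sources, leaves the observed variable unchanged, so the two representations are equivalent.

```python
import numpy as np

rng = np.random.default_rng(0)

# A mixing matrix of full column rank and a vector of independent sources.
A1 = rng.normal(size=(4, 3))
s1 = rng.uniform(-1, 1, size=3)
x = A1 @ s1

# An equivalent representation: permute and rescale the columns of A1 ...
P = np.eye(3)[[2, 0, 1]]          # permutation matrix
D = np.diag([2.0, -0.5, 3.0])     # non-zero rescaling
A2 = A1 @ P @ D
# ... and apply the inverse scaling/permutation to the sources.
s2 = np.linalg.inv(D) @ P.T @ s1

# Both structural representations generate the same observed variable:
# A2 @ s2 = A1 @ P @ D @ inv(D) @ P.T @ s1 = A1 @ s1.
assert np.allclose(x, A2 @ s2)
```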
The main result in [Rao, 1969] is a decomposition theorem stating that if $x$ is a $p$-variate random variable with a linear structure $x = As$ in which all the elements of $s$ are non-Gaussian, then there exists no non-equivalent linear structure involving the same or a smaller number of structural variables than that of $s$.
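The non-Gaussianity condition is essential: for Gaussian sources the linear structure is not unique, since any rotation of an isotropic Gaussian vector yields another vector of independent Gaussian components. A minimal numpy illustration (my own construction, not part of the original argument):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Two independent standard Gaussian sources.
s = rng.normal(size=(2, n))

# Any rotation R gives s' = R @ s whose components are again independent
# Gaussians, so x = A @ s = (A @ R.T) @ (R @ s) is a second structure that
# is NOT related to the first by permutation and scaling alone.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
s_rot = R @ s

# The rotated components remain uncorrelated with unit variance
# (and, being jointly Gaussian, are therefore still independent).
cov = np.cov(s_rot)
assert np.allclose(cov, np.eye(2), atol=0.02)
```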
Furthermore, if $x$ is a $p$-vector random variable with a linear structure, then $x$ can be decomposed as $x = x_1 + x_2$, where $x_1$ has a linear structure involving only non-Gaussian structural variables, $x_2$ is multivariate Gaussian and independent of $x_1$, and this decomposition is unique.
The proofs involve the characteristic functions of the vector random variables $x$ and $s$, and as such these results are applicable only if the number of observations (i.e. voxels) is sufficiently large to accurately reflect the distributions of these quantities.
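This sample-size caveat can be illustrated directly: the empirical characteristic function $\hat{\varphi}(t) = \frac{1}{n}\sum_j e^{\mathrm{i}tx_j}$ only approaches the true characteristic function as the number of observations grows. A rough sketch (the sample sizes, test frequency, and number of repetitions are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)

def ecf(x, t):
    """Empirical characteristic function of the sample x at frequency t."""
    return np.mean(np.exp(1j * t * x))

t = 1.0
true_cf = np.exp(-t**2 / 2)  # characteristic function of N(0, 1) at t

# Mean absolute error of the ECF, averaged over 50 draws per sample size:
# the approximation improves markedly as the number of observations grows.
errs = {n: np.mean([abs(ecf(rng.normal(size=n), t) - true_cf)
                    for _ in range(50)])
        for n in (100, 100_000)}
assert errs[100_000] < errs[100]
```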
The results show, however, that conditioned on knowing the number $q$ of source signals contained in the data, and under the assumption that the data are generated according to equation 2 (i.e. as a linear mixture of independent non-Gaussian source signals confounded by additive Gaussian noise), there exists no non-equivalent decomposition into $q$ independent non-Gaussian random variables and an associated mixing matrix. The decomposition into independent components is therefore unique, provided we do not attempt to extract more than $q$ source signals from the data.