Speaker
Hannu Oja
(University of Turku)
Description
In independent component analysis it is assumed that the observed random variables are linear combinations of latent, mutually independent random variables called the independent components. It is then often thought that only the non-Gaussian independent components are of interest and that the Gaussian components simply represent noise. The aim is then to make inference on the unknown number of non-Gaussian components and to estimate the transformations back to the non-Gaussian components.
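As a concrete illustration of this model, the sketch below simulates observations as linear mixtures of two non-Gaussian sources and one Gaussian "noise" component (the dimensions, source distributions and mixing matrix are invented for the example and are not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Latent independent components: two non-Gaussian signals and one
# Gaussian component playing the role of noise (choices illustrative).
s = np.column_stack([
    rng.uniform(-np.sqrt(3), np.sqrt(3), n),  # sub-Gaussian source
    rng.exponential(1.0, n) - 1.0,            # skewed (non-Gaussian) source
    rng.standard_normal(n),                   # Gaussian "noise" component
])

A = rng.standard_normal((3, 3))  # unknown full-rank mixing matrix
x = s @ A.T                      # observed variables: linear mixtures of s
```

The estimation problem is then to recover, from `x` alone, the transformation back to the non-Gaussian columns of `s`.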
In this talk we show how the classical skewness and kurtosis measures, namely the third and fourth cumulants, can be used in the estimation. First, univariate cumulants are used as projection indices in the search for the independent components (projection pursuit, fastICA). Second, multivariate fourth cumulant matrices are used jointly to solve the problem (FOBI, JADE). The properties of the estimates are studied through the corresponding optimization problems, estimating equations, algorithms and asymptotic statistical properties. The theory is illustrated with several examples.
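Of the fourth-cumulant methods mentioned above, FOBI is the simplest to sketch: whiten the data and then eigendecompose a single fourth-moment matrix. The code below is a minimal illustration on simulated data (the simulation setup and all variable names are invented for the example; FOBI requires the components to have distinct kurtoses, which holds here):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated ICA model: x = A s with sources of distinct kurtoses.
s = np.column_stack([
    rng.uniform(-np.sqrt(3), np.sqrt(3), n),  # sub-Gaussian source
    rng.exponential(1.0, n) - 1.0,            # heavy-tailed, skewed source
    rng.standard_normal(n),                   # Gaussian component
])
A = rng.standard_normal((3, 3))
x = s @ A.T

# Step 1: whiten the observations with a symmetric inverse square root
# of the sample covariance matrix.
xc = x - x.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(xc, rowvar=False))
W_white = evecs @ np.diag(evals ** -0.5) @ evecs.T
z = xc @ W_white                 # whitened data, cov(z) = I

# Step 2: form the fourth-moment matrix B = E[||z||^2 z z^T]
r2 = np.sum(z ** 2, axis=1)
B = (z * r2[:, None]).T @ z / n

# Step 3: the eigenvectors of B give the rotation back to the
# independent components (up to sign, order and scale).
_, U = np.linalg.eigh(B)
s_hat = z @ U                    # estimated independent components
```

Up to the usual sign and permutation indeterminacy, the columns of `s_hat` should be highly correlated with the true sources; JADE refines this idea by jointly diagonalizing a whole family of fourth-cumulant matrices.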