
12

ICA by Nonlinear Decorrelation and Nonlinear PCA

This chapter starts by reviewing some of the early research efforts in independent component analysis (ICA), especially the technique based on nonlinear decorrelation that was successfully used by Jutten, Hérault, and Ans to solve the first ICA problems. Today, this work is mainly of historical interest, because several more efficient algorithms for ICA now exist.

Nonlinear decorrelation can be seen as an extension of second-order methods such as whitening and principal component analysis (PCA). These methods give components that are uncorrelated linear combinations of input variables, as explained in Chapter 6. We will show that independent components can in some cases be found as nonlinearly uncorrelated linear combinations. The nonlinear functions used in this approach introduce higher-order statistics into the solution method, making ICA possible.
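As a reminder of this second-order baseline, the following sketch (a hypothetical two-dimensional example, not from the book) whitens a correlated signal by eigendecomposition of its covariance matrix, yielding components that are uncorrelated with unit variance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: mix two independent zero-mean sources,
# producing correlated observations x.
s = rng.uniform(-1.0, 1.0, size=(2, 10_000))
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
x = A @ s

# Whitening: eigendecomposition of the sample covariance,
# then rescaling along the eigenvector directions.
C = np.cov(x)
d, E = np.linalg.eigh(C)          # C = E diag(d) E^T
V = E @ np.diag(d ** -0.5) @ E.T  # whitening matrix
z = V @ x

print(np.round(np.cov(z), 2))     # close to the identity matrix
```

Note that whitening removes only second-order (linear) correlations: the components of z are uncorrelated, but in general not yet independent.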

We then show how the work on nonlinear decorrelation eventually led to the Cichocki-Unbehauen algorithm, which is essentially the same as the algorithm that we derived in Chapter 9 using the natural gradient. Next, the criterion of nonlinear decorrelation is extended and formalized to the theory of estimating functions, and the closely related EASI algorithm is reviewed.

Another approach to ICA that is related to PCA is the so-called nonlinear PCA. A nonlinear representation is sought for the input data that minimizes a least mean-square error criterion. For the linear case, it was shown in Chapter 6 that principal components are obtained. It turns out that in some cases the nonlinear PCA approach gives independent components instead. We review the nonlinear PCA criterion and show its equivalence to other criteria like maximum likelihood (ML). Then, two typical learning rules introduced by the authors are reviewed, of which the first one is a stochastic gradient algorithm and the other one a recursive least mean-square algorithm.

Independent Component Analysis. Aapo Hyvärinen, Juha Karhunen, Erkki Oja.
Copyright © 2001 John Wiley & Sons, Inc.
ISBNs: 0-471-40540-X (Hardback); 0-471-22131-7 (Electronic)

12.1 NONLINEAR CORRELATIONS AND INDEPENDENCE

The correlation between two random variables y₁ and y₂ was discussed in detail in Chapter 2. Here we consider zero-mean variables only, so correlation and covariance are equal. Correlation is related to independence in such a way that independent variables are always uncorrelated. The opposite is not true, however: the variables can be uncorrelated, yet dependent. An example is a uniform density in a rotated square centered at the origin of the (y₁, y₂) space; see, e.g., Fig. 8.3. Both y₁ and y₂ are zero mean and uncorrelated, no matter what the orientation of the square, but they are independent only if the square is aligned with the coordinate axes. In some cases uncorrelatedness does imply independence, though; the best example is the case when the density of (y₁, y₂) is constrained to be jointly gaussian.
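The rotated-square example is easy to check numerically. In this sketch (a hypothetical construction along the lines of Fig. 8.3), a uniform square density is rotated by 45 degrees; the resulting variables have essentially zero linear correlation, yet the expectation E{y₁²y₂²} does not factorize into E{y₁²}E{y₂²}, revealing the dependence:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

# Uniform density on a square, rotated by 45 degrees.
u1 = rng.uniform(-1.0, 1.0, n)
u2 = rng.uniform(-1.0, 1.0, n)
y1 = (u1 - u2) / np.sqrt(2)
y2 = (u1 + u2) / np.sqrt(2)

# Linear correlation is (essentially) zero ...
print(np.mean(y1 * y2))

# ... but E{y1^2 y2^2} differs from E{y1^2} E{y2^2},
# so y1 and y2 are dependent.
print(np.mean(y1**2 * y2**2), np.mean(y1**2) * np.mean(y2**2))
```

For these uniform sources the two fourth-order quantities approach 2/45 ≈ 0.044 and 1/9 ≈ 0.111, respectively, so the gap is far from sampling noise.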

Extending the concept of correlation, we here define the nonlinear correlation of the random variables y₁ and y₂ as E{f(y₁)g(y₂)}. Here, f(y₁) and g(y₂) are two functions, of which at least one is nonlinear. Typical examples might be polynomials of degree higher than 1, or more complex functions like the hyperbolic tangent. This means that one or both of the random variables are first transformed nonlinearly to new variables f(y₁), g(y₂), and then the usual linear correlation between these new variables is considered.
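To illustrate the definition, the following sketch (hypothetical data, not from the book) estimates the nonlinear correlation E{f(y₁)g(y₂)} with the polynomial choices f(y) = y³ and g(y) = y. For an independent pair it vanishes; for an orthogonal mixture, which is linearly uncorrelated but dependent, it does not:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

# Independent zero-mean sources.
s1 = rng.uniform(-1.0, 1.0, n)
s2 = rng.uniform(-1.0, 1.0, n)

# An orthogonal mixture: x1, x2 are linearly uncorrelated but dependent.
x1 = 0.8 * s1 + 0.6 * s2
x2 = 0.6 * s1 - 0.8 * s2

def nonlinear_corr(a, b):
    """Sample estimate of E{f(a) g(b)} with f(y) = y^3, g(y) = y."""
    return np.mean(a**3 * b)

print(nonlinear_corr(s1, s2))  # ~ 0: independent pair
print(np.mean(x1 * x2))        # ~ 0: linear correlation also vanishes
print(nonlinear_corr(x1, x2))  # clearly nonzero: dependence detected
```

The cubic nonlinearity brings fourth-order statistics of the mixture into play, which is exactly how the linear correlation fails to distinguish the two cases while the nonlinear one succeeds.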

The question now is: Assuming that y₁ and y₂ are nonlinearly decorrelated in the sense

E{f(y₁)g(y₂)} = 0    (12.1)

can we say something about their independence? We would hope that by making this kind of nonlinear correlation zero, independence would be obtained under some additional conditions to be specified.

There is a general theorem (see, e.g., [129]) stating that y₁ and y₂ are independent if and only if

E{f(y₁)g(y₂)} = E{f(y₁)}E{g(y₂)}    (12.2)

for all continuous functions f and g that are zero outside a finite interval. Based on this, it seems very difficult to approach independence rigorously, because the functions f and g are almost arbitrary. Some kind of approximation is needed.
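Condition (12.2) can of course only be spot-checked numerically for particular choices of f and g, which already hints at why approximations are needed. In this sketch (a hypothetical construction), y₂ = y₁² − 1/3 is zero-mean and linearly uncorrelated with y₁, yet a quadratic f exposes the failure of the factorization:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

y1 = rng.uniform(-1.0, 1.0, n)
y2_ind = rng.uniform(-1.0, 1.0, n)  # independent of y1
y2_dep = y1**2 - 1.0 / 3.0          # zero-mean but dependent on y1

def factorization_gap(f, g, a, b):
    """|E{f(a)g(b)} - E{f(a)} E{g(b)}|, estimated from samples."""
    return abs(np.mean(f(a) * g(b)) - np.mean(f(a)) * np.mean(g(b)))

f = lambda y: y**2
g = lambda y: y

print(factorization_gap(f, g, y1, y2_ind))  # ~ 0: (12.2) holds
print(factorization_gap(f, g, y1, y2_dep))  # clearly nonzero: dependent
```

Note that with the linear choice f(y) = y the gap for the dependent pair would also be near zero, since E{y₁(y₁² − 1/3)} = E{y₁³} − E{y₁}/3 = 0; a single pair of test functions is therefore never conclusive, which is why (12.2) demands all continuous f and g.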

This problem was considered by Jutten and Hérault [228]. Let us assume that f(y₁) and g(y₂) are smooth functions that have derivatives of all orders in a neighborhood
