Independent component analysis: algorithms and
applications
A. Hyvärinen and E. Oja. Neural Networks: The Official Journal of the
International Neural Network Society, 13(4-5):
411--430, June 2000. PMID: 10946390.
DOI: 10.1016/S0893-6080(00)00026-5
Abstract
A fundamental problem in neural network research, as
well as in many other disciplines, is finding a
suitable representation of multivariate data, i.e.
random vectors. For reasons of computational and
conceptual simplicity, the representation is often
sought as a linear transformation of the original data.
In other words, each component of the representation is
a linear combination of the original variables.
Well-known linear transformation methods include
principal component analysis, factor analysis, and
projection pursuit. Independent component analysis
(ICA) is a recently developed method in which the
goal is to find a linear representation of
non-Gaussian data so that the components are
statistically independent, or as independent as
possible. Such a representation seems to capture the
essential structure of the data in many applications,
including feature extraction and signal separation. In
this paper, we present the basic theory and
applications of ICA, and our recent work on the
subject.
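The abstract's central claim is that mixtures of non-Gaussian signals can be unmixed into statistically independent components by a linear transformation. A minimal sketch of this blind source separation, using scikit-learn's FastICA (an implementation of the fixed-point algorithm associated with these authors); the source signals and mixing matrix below are illustrative assumptions, not data from the paper:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s1 = np.sign(np.sin(3 * t))              # square wave: sub-Gaussian source
s2 = rng.laplace(size=t.size)            # Laplacian noise: super-Gaussian source
S = np.c_[s1, s2]                        # true sources, shape (n_samples, 2)

A = np.array([[1.0, 0.5],
              [0.4, 1.0]])               # assumed mixing matrix
X = S @ A.T                              # observed linear mixtures

# Each recovered component is a linear combination of the observations,
# chosen to maximize statistical independence (up to sign, order, scale).
ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
S_est = ica.fit_transform(X)
```

Note that, as the paper discusses, Gaussian sources could not be separated this way: any rotation of independent Gaussians is equally independent, so non-Gaussianity is what makes the solution identifiable.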
%0 Journal Article
%1 hyvarinen-independent-component-analysis-2000
%A Hyvärinen, A.
%A Oja, E.
%D 2000
%J Neural Networks: The Official Journal of the International Neural Network Society
%K hebbian ica learning network neural
%N 4-5
%P 411--430
%R 10.1016/S0893-6080(00)00026-5
%T Independent component analysis: algorithms and applications
%U http://www.cs.helsinki.fi/u/ahyvarin/papers/NN00new.pdf
%V 13
%X A fundamental problem in neural network research, as
well as in many other disciplines, is finding a
suitable representation of multivariate data, i.e.
random vectors. For reasons of computational and
conceptual simplicity, the representation is often
sought as a linear transformation of the original data.
In other words, each component of the representation is
a linear combination of the original variables.
Well-known linear transformation methods include
principal component analysis, factor analysis, and
projection pursuit. Independent component analysis
(ICA) is a recently developed method in which the
goal is to find a linear representation of
non-Gaussian data so that the components are
statistically independent, or as independent as
possible. Such a representation seems to capture the
essential structure of the data in many applications,
including feature extraction and signal separation. In
this paper, we present the basic theory and
applications of ICA, and our recent work on the
subject.
@article{hyvarinen-independent-component-analysis-2000,
abstract = {A fundamental problem in neural network research, as
well as in many other disciplines, is finding a
suitable representation of multivariate data, i.e.
random vectors. For reasons of computational and
conceptual simplicity, the representation is often
sought as a linear transformation of the original data.
In other words, each component of the representation is
a linear combination of the original variables.
Well-known linear transformation methods include
principal component analysis, factor analysis, and
projection pursuit. Independent component analysis
{(ICA)} is a recently developed method in which the
goal is to find a linear representation of
{non-Gaussian} data so that the components are
statistically independent, or as independent as
possible. Such a representation seems to capture the
essential structure of the data in many applications,
including feature extraction and signal separation. In
this paper, we present the basic theory and
applications of {ICA}, and our recent work on the
subject.},
author = {Hyvärinen, A. and Oja, E.},
doi = {10.1016/S0893-6080(00)00026-5},
issn = {0893-6080},
journal = {Neural Networks: The Official Journal of the
International Neural Network Society},
keywords = {hebbian ica learning network neural},
month = jun,
note = {PMID: 10946390},
number = {4-5},
pages = {411--430},
title = {Independent component analysis: algorithms and
applications},
url = {http://www.cs.helsinki.fi/u/ahyvarin/papers/NN00new.pdf},
volume = 13,
year = 2000
}