
Nonlinear dimensionality reduction for discriminative analytics of multiple datasets

  • Jia Chen
  • , Gang Wang*
  • , Georgios B. Giannakis
  • *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Principal component analysis (PCA) is widely used for feature extraction and dimensionality reduction, with documented merits in diverse tasks involving high-dimensional data. PCA copes with one dataset at a time, but it is challenged when it comes to analyzing multiple datasets jointly. In certain data science settings, however, one is often interested in extracting the most discriminative information from one dataset of particular interest (a.k.a. target data) relative to the other(s) (a.k.a. background data). To this end, this paper puts forth a novel approach, termed discriminative (d) PCA, for such discriminative analytics of multiple datasets. Under certain conditions, dPCA is proved to be least-squares optimal in recovering the latent subspace vector unique to the target data relative to background data. To account for nonlinear data correlations, (linear) dPCA models for one or multiple background datasets are generalized through kernel-based learning. Interestingly, all dPCA variants admit an analytical solution obtainable with a single (generalized) eigenvalue decomposition. Finally, substantial dimensionality reduction tests using synthetic and real datasets are provided to corroborate the merits of the proposed methods.
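The abstract notes that the dPCA variants admit an analytical solution via a single (generalized) eigenvalue decomposition. The following minimal NumPy sketch illustrates that general idea for the linear, single-background case: find directions maximizing target-data variance relative to background-data variance by solving Cx u = λ Cy u through Cholesky whitening. The function name, regularization term, and covariance estimators are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def dpca_directions(X, Y, k=1, reg=1e-6):
    """Illustrative sketch: top-k directions maximizing target-data
    variance relative to background-data variance, via the generalized
    eigenproblem Cx u = lambda Cy u solved by Cholesky whitening."""
    Xc = X - X.mean(axis=0)               # center target data
    Yc = Y - Y.mean(axis=0)               # center background data
    Cx = Xc.T @ Xc / len(Xc)              # target sample covariance
    Cy = Yc.T @ Yc / len(Yc) + reg * np.eye(X.shape[1])  # regularized background covariance
    L = np.linalg.cholesky(Cy)            # Cy = L L^T
    Li = np.linalg.inv(L)
    M = Li @ Cx @ Li.T                    # whitened (symmetric) target covariance
    w, V = np.linalg.eigh(M)              # eigenvalues in ascending order
    return Li.T @ V[:, ::-1][:, :k]       # back-transform top-k eigenvectors

# Usage: target data has extra variance along the first axis that the
# (isotropic) background lacks; the top dPCA direction should recover it.
rng = np.random.default_rng(0)
B = rng.normal(size=(500, 3))             # background: isotropic noise
X = rng.normal(size=(500, 3))
X[:, 0] *= 5                              # target-specific variance on axis 0
U = dpca_directions(X, B, k=1)
```

A single symmetric eigendecomposition suffices here, which mirrors the analytical-solution property highlighted in the abstract; kernelized variants would instead operate on (generalized) Gram matrices.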

Original language: English
Article number: 8565879
Pages (from-to): 740-752
Number of pages: 13
Journal: IEEE Transactions on Signal Processing
Volume: 67
Issue number: 3
DOI
Publication status: Published - 1 Feb 2019
Externally published: Yes
