TY - GEN
T1 - Canonical Correlation Analysis with Common Graph Priors
AU - Chen, Jia
AU - Wang, Gang
AU - Shen, Yanning
AU - Giannakis, Georgios B.
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/8/29
Y1 - 2018/8/29
AB - Canonical correlation analysis (CCA) is a well-appreciated linear subspace method for leveraging hidden sources common to two or more datasets. CCA's benefits are documented in various applications, such as dimensionality reduction, blind source separation, classification, and data fusion. However, standard CCA does not exploit the geometry of the common sources, which may be deduced from (cross-)correlations or inferred from the data. In this context, the prior information provided by the common sources is encoded through a graph and employed as a CCA regularizer. This leads to what is termed here graph CCA (gCCA), which accounts for the graph-induced knowledge of the common sources while maximizing the linear correlation between the canonical variables. When the dimensionality of the data vectors is high relative to their number, a dual formulation of the novel gCCA is also developed. Tests on two real datasets for facial image classification showcase the merits of the proposed approaches relative to competing alternatives.
KW - Canonical correlations
KW - dimensionality reduction
KW - generalized eigenvalue
KW - signal processing over graphs
UR - http://www.scopus.com/inward/record.url?scp=85051170977&partnerID=8YFLogxK
U2 - 10.1109/SSP.2018.8450749
DO - 10.1109/SSP.2018.8450749
M3 - Conference contribution
AN - SCOPUS:85051170977
SN - 9781538615706
T3 - 2018 IEEE Statistical Signal Processing Workshop, SSP 2018
SP - 463
EP - 467
BT - 2018 IEEE Statistical Signal Processing Workshop, SSP 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 20th IEEE Statistical Signal Processing Workshop, SSP 2018
Y2 - 10 June 2018 through 13 June 2018
ER -
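
The abstract describes gCCA as two-view CCA augmented with a graph prior on the common sources and, per the record's keywords, solved through a generalized eigenvalue/SVD formulation. The sketch below only illustrates that general idea and is not the authors' implementation: it assumes column-sample matrices X and Y, a sample-level graph Laplacian L encoding the common-source prior, and a Laplacian-smoothness penalty folded into each view's covariance; the function name, the exact regularization form, and the weight gamma are assumptions for illustration.

# Illustrative sketch of graph-regularized two-view CCA (not the authors' code).
# Assumptions: X (dx x n) and Y (dy x n) hold n centered samples as columns,
# L (n x n) is a graph Laplacian over the samples encoding the common-source
# prior, and gamma weights the (assumed) Laplacian-smoothness regularizer.
import numpy as np

def graph_regularized_cca(X, Y, L, gamma=0.1, k=2, eps=1e-6):
    n = X.shape[1]
    # Regularized view covariances: the graph term penalizes canonical
    # variables that vary sharply across neighboring samples on the graph.
    Cxx = (X @ X.T + gamma * X @ L @ X.T) / n + eps * np.eye(X.shape[0])
    Cyy = (Y @ Y.T + gamma * Y @ L @ Y.T) / n + eps * np.eye(Y.shape[0])
    Cxy = X @ Y.T / n
    # Whiten each view with a Cholesky factor, then take the SVD of the
    # whitened cross-covariance; this is the usual eigen/SVD route to CCA.
    Rx = np.linalg.cholesky(Cxx)
    Ry = np.linalg.cholesky(Cyy)
    M = np.linalg.solve(Rx, Cxy) @ np.linalg.inv(Ry).T
    Uw, s, Vwt = np.linalg.svd(M)
    U = np.linalg.solve(Rx.T, Uw[:, :k])   # x-view projections (dx x k)
    V = np.linalg.solve(Ry.T, Vwt[:k].T)   # y-view projections (dy x k)
    return U, V, s[:k]                     # leading (regularized) correlations

# Example usage with random data and a chain-graph Laplacian (assumed setup):
rng = np.random.default_rng(0)
n, dx, dy = 200, 30, 40
X = rng.standard_normal((dx, n)); X -= X.mean(axis=1, keepdims=True)
Y = rng.standard_normal((dy, n)); Y -= Y.mean(axis=1, keepdims=True)
A = np.diag(np.ones(n - 1), 1); A = A + A.T   # chain adjacency over samples
L = np.diag(A.sum(axis=1)) - A                # graph Laplacian
U, V, rho = graph_regularized_cca(X, Y, L, gamma=0.5, k=2)

Folding the graph term into each view's covariance is one simple way to realize a graph prior while keeping the problem a standard whitened-SVD (equivalently, generalized eigenvalue) computation; the paper's dual formulation for high-dimensional data is not reproduced here.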