Orthogonal multi-view analysis by successive approximations via eigenvectors

Li Wang*, Lei-Hong Zhang, Chungen Shen, Ren-Cang Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Orthogonality has been demonstrated to admit many desirable properties, such as noise tolerance, suitability for data visualization, and distance preservation. However, it is often incompatible with existing models, and even when compatible, the resulting optimization problem is challenging. To address these issues, we propose a trace ratio formulation for multi-view subspace learning that learns an individual orthogonal projection for each view. The proposed formulation integrates the correlations within multiple views, supervised discriminant capacity, and distance preservation in a concise and compact way. It not only includes several existing models as special cases but also inspires new models. Moreover, an efficient numerical method based on successive approximations via eigenvectors is presented to solve the associated optimization problem. The method is built upon an iterative Krylov subspace method that scales easily to high-dimensional datasets. Extensive experiments are conducted on various real-world datasets for multi-view discriminant analysis and multi-view multi-label classification. The experimental results demonstrate that the proposed models are consistently competitive with, and often better than, the compared methods.
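The abstract describes solving a trace ratio problem over orthogonal projections by successive approximations via eigenvectors. The paper's exact multi-view formulation is not given here, but the classic single-matrix trace-ratio iteration conveys the idea: fix the current ratio value, take the top eigenvectors of a shifted matrix, and update the ratio until it stabilizes. The sketch below is a generic illustration under that assumption, not the authors' algorithm; the function name `trace_ratio` and the dense `eigh` solver (a large-scale variant would use a Krylov method such as `scipy.sparse.linalg.eigsh`) are illustrative choices.

```python
import numpy as np
from scipy.linalg import eigh

def trace_ratio(A, B, k, tol=1e-8, max_iter=100):
    """Maximize tr(V^T A V) / tr(V^T B V) over orthonormal V (n x k).

    A is symmetric, B is symmetric positive definite. Each iteration
    solves the eigenproblem of A - rho*B for its top-k eigenvectors,
    then updates rho; the ratio increases monotonically.
    """
    rho = 0.0
    for _ in range(max_iter):
        _, U = eigh(A - rho * B)   # eigenvalues in ascending order
        V = U[:, -k:]              # eigenvectors of the k largest eigenvalues
        new_rho = np.trace(V.T @ A @ V) / np.trace(V.T @ B @ V)
        if abs(new_rho - rho) < tol:
            return V, new_rho
        rho = new_rho
    return V, rho
```

The returned columns of `V` are orthonormal by construction, which is what gives orthogonal projections their noise tolerance and distance-preservation properties mentioned above.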

Original language: English
Pages (from-to): 100-116
Number of pages: 17
Journal: Neurocomputing
Volume: 512
DOIs
Publication status: Published - 1 Nov 2022
Externally published: Yes

Keywords

  • Krylov subspace method
  • Orthogonal multi-view analysis
  • Successive approximations via eigenvectors
