Abstract
We present an unsupervised multi-view partial least squares (PLS) method that learns a common latent space from given multi-view data. Although PLS is a frequently used technique for analyzing relationships between two datasets, its extension to more than two views in the unsupervised setting has seldom been studied. In this article, we fill this gap with a model that parallels the extension of canonical correlation analysis (CCA) to more than two sets of variables and builds on findings from analyzing PLS, CCA, and their variants. The resulting problem involves a set of orthogonality constraints on the view-specific projection matrices and is numerically challenging for existing methods, which may suffer from numerical instabilities and offer no orthogonality guarantee on those matrices. To solve this problem, we propose a stable deflation algorithm that relies on proven numerical linear algebra techniques, guarantees the orthogonality constraints, and simultaneously maximizes the covariance in the common space. We further adapt the algorithm to efficiently handle large-scale high-dimensional data. Extensive experiments evaluate the algorithm on two learning tasks, cross-modal retrieval and multi-view feature extraction. The results demonstrate that the proposed algorithm outperforms the baselines and scales to large-scale high-dimensional datasets.
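To make the underlying idea concrete, the sketch below is a minimal, generic multi-view PLS-style procedure, not the paper's algorithm: each component maximizes a sum-of-cross-covariances objective via the leading eigenvector of a block cross-covariance matrix, and a deflation step keeps successive view-specific projection directions orthogonal. The function name `multiview_pls_sketch`, the eigenvector relaxation, and the assumption of column-centered views are illustrative choices, not taken from the article.

```python
import numpy as np

def multiview_pls_sketch(views, n_components=2):
    """Generic multi-view PLS-style deflation sketch (NOT the paper's algorithm).

    views: list of (n_samples, d_i) arrays, assumed column-centered.
    Returns a list of (d_i, n_components) projection matrices whose
    columns are orthonormal within each view.
    """
    m = len(views)
    Xs = [X.copy() for X in views]
    Ws = [np.zeros((X.shape[1], n_components)) for X in views]

    for k in range(n_components):
        # Build the symmetric block cross-covariance matrix with zero
        # diagonal blocks; maximizing sum_{i != j} w_i^T X_i^T X_j w_j
        # is relaxed here to finding its leading eigenvector.
        sizes = [X.shape[1] for X in Xs]
        offsets = np.concatenate(([0], np.cumsum(sizes)))
        C = np.zeros((offsets[-1], offsets[-1]))
        for i in range(m):
            for j in range(m):
                if i != j:
                    C[offsets[i]:offsets[i + 1],
                      offsets[j]:offsets[j + 1]] = Xs[i].T @ Xs[j]

        # Eigenvector of the largest eigenvalue (eigh sorts ascending).
        _, eigvecs = np.linalg.eigh(C)
        v = eigvecs[:, -1]

        for i in range(m):
            # Per-view direction, renormalized to unit length.
            w = v[offsets[i]:offsets[i + 1]]
            w = w / (np.linalg.norm(w) + 1e-12)
            Ws[i][:, k] = w
            # Deflate: project view i onto the orthogonal complement of w,
            # so directions found later stay orthogonal to this one.
            Xs[i] = Xs[i] - np.outer(Xs[i] @ w, w)

    return Ws
```

For example, with three column-centered views `X1, X2, X3` of shape `(n, d_i)`, calling `multiview_pls_sketch([X1, X2, X3], n_components=2)` returns one projection matrix per view with orthonormal columns; the right-side deflation `X_i(I - w w^T)` is one standard way to enforce that orthogonality, whereas the paper's stable deflation algorithm is described in the article itself.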
| Original language | English |
| --- | --- |
| Pages (from-to) | 1073-1083 |
| Number of pages | 11 |
| Journal | IEEE Transactions on Big Data |
| Volume | 8 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 1 Aug 2022 |
| Externally published | Yes |
Keywords
- Multi-view learning
- Partial least squares
- Unsupervised subspace learning