TY - GEN
T1 - Unsupervised Active Learning via Subspace Learning
AU - Li, Changsheng
AU - Mao, Kaihang
AU - Liang, Lingyan
AU - Ren, Dongchun
AU - Zhang, Wei
AU - Yuan, Ye
AU - Wang, Guoren
N1 - Publisher Copyright:
Copyright © 2021, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
PY - 2021
Y1 - 2021
N2 - Unsupervised active learning has been an active research topic in the machine learning community, with the purpose of choosing representative samples to be labelled in an unsupervised manner. Previous works usually take the minimization of data reconstruction loss as the criterion for selecting representative samples, so that the original inputs can be better approximated. However, in many scenarios data are drawn from low-dimensional subspaces embedded in an arbitrary high-dimensional space, so attempting to precisely reconstruct all entries of an observation can introduce severe noise and lead to a suboptimal solution. In view of this, this paper proposes a novel unsupervised Active Learning model via Subspace Learning, called ALSL. In contrast to previous approaches, ALSL aims to discover the low-rank structure of the data and then perform sample selection based on the learnt low-rank representations. To this end, we devise two different strategies and propose two corresponding formulations to select samples with and under low-rank sample representations, respectively. Since the proposed formulations involve several non-smooth regularization terms, we develop a simple but effective optimization procedure to solve them. Extensive experiments are performed on five publicly available datasets, and the results demonstrate that the first formulation achieves performance comparable to the state of the art, while the second significantly outperforms it, achieving up to a 13% improvement over the second-best baseline.
AB - Unsupervised active learning has been an active research topic in the machine learning community, with the purpose of choosing representative samples to be labelled in an unsupervised manner. Previous works usually take the minimization of data reconstruction loss as the criterion for selecting representative samples, so that the original inputs can be better approximated. However, in many scenarios data are drawn from low-dimensional subspaces embedded in an arbitrary high-dimensional space, so attempting to precisely reconstruct all entries of an observation can introduce severe noise and lead to a suboptimal solution. In view of this, this paper proposes a novel unsupervised Active Learning model via Subspace Learning, called ALSL. In contrast to previous approaches, ALSL aims to discover the low-rank structure of the data and then perform sample selection based on the learnt low-rank representations. To this end, we devise two different strategies and propose two corresponding formulations to select samples with and under low-rank sample representations, respectively. Since the proposed formulations involve several non-smooth regularization terms, we develop a simple but effective optimization procedure to solve them. Extensive experiments are performed on five publicly available datasets, and the results demonstrate that the first formulation achieves performance comparable to the state of the art, while the second significantly outperforms it, achieving up to a 13% improvement over the second-best baseline.
UR - http://www.scopus.com/inward/record.url?scp=85127795967&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85127795967
T3 - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
SP - 8332
EP - 8339
BT - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
PB - Association for the Advancement of Artificial Intelligence
T2 - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
Y2 - 2 February 2021 through 9 February 2021
ER -