TY - GEN
T1 - Incremental Discriminant Learning for Heterogeneous Domain Adaptation
AU - Han, Peng
AU - Wu, Xinxiao
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2016/1/29
Y1 - 2016/1/29
N2 - This paper proposes a new incremental learning method for heterogeneous domain adaptation, in which the training data from both the source and target domains arrive sequentially and are represented by heterogeneous features. Two different projection matrices are learned to map the data from the two domains into a discriminative common subspace, where intra-class samples lie close to each other, inter-class samples are well separated, and the distribution mismatch between the source and target domains is reduced. Unlike previous work, our method incrementally optimizes the projection matrices as the training data becomes available as a data stream instead of being given completely in advance. As new training data arrives, the projection matrices are updated from the existing ones using an eigenspace merging algorithm, rather than relearning from scratch over the whole training set. Our incremental solution therefore significantly reduces computational complexity and memory usage, making it applicable to a wider range of heterogeneous domain adaptation scenarios with large training datasets. Furthermore, our method requires neither corresponding training instances across the source and target domains nor the same feature type, which meaningfully relaxes the requirements on training data. Comprehensive experiments on three benchmark datasets clearly demonstrate the effectiveness and efficiency of our method.
AB - This paper proposes a new incremental learning method for heterogeneous domain adaptation, in which the training data from both the source and target domains arrive sequentially and are represented by heterogeneous features. Two different projection matrices are learned to map the data from the two domains into a discriminative common subspace, where intra-class samples lie close to each other, inter-class samples are well separated, and the distribution mismatch between the source and target domains is reduced. Unlike previous work, our method incrementally optimizes the projection matrices as the training data becomes available as a data stream instead of being given completely in advance. As new training data arrives, the projection matrices are updated from the existing ones using an eigenspace merging algorithm, rather than relearning from scratch over the whole training set. Our incremental solution therefore significantly reduces computational complexity and memory usage, making it applicable to a wider range of heterogeneous domain adaptation scenarios with large training datasets. Furthermore, our method requires neither corresponding training instances across the source and target domains nor the same feature type, which meaningfully relaxes the requirements on training data. Comprehensive experiments on three benchmark datasets clearly demonstrate the effectiveness and efficiency of our method.
UR - http://www.scopus.com/inward/record.url?scp=84964765668&partnerID=8YFLogxK
U2 - 10.1109/ICDMW.2015.186
DO - 10.1109/ICDMW.2015.186
M3 - Conference contribution
AN - SCOPUS:84964765668
T3 - Proceedings - 15th IEEE International Conference on Data Mining Workshop, ICDMW 2015
SP - 1213
EP - 1220
BT - Proceedings - 15th IEEE International Conference on Data Mining Workshop, ICDMW 2015
A2 - Wu, Xindong
A2 - Tuzhilin, Alexander
A2 - Xiong, Hui
A2 - Dy, Jennifer G.
A2 - Aggarwal, Charu
A2 - Zhou, Zhi-Hua
A2 - Cui, Peng
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 15th IEEE International Conference on Data Mining Workshop, ICDMW 2015
Y2 - 14 November 2015 through 17 November 2015
ER -