TY - GEN
T1 - Class Incremental Learning with Important and Diverse Memory
AU - Li, Mei
AU - Yan, Zeyu
AU - Li, Changsheng
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023.
PY - 2023
Y1 - 2023
N2 - Class incremental learning (CIL) has been attracting increasing attention in the computer vision and machine learning communities, where a well-known issue is catastrophic forgetting. To mitigate this issue, a popular approach is the replay-based strategy, which stores a small portion of past data and replays it when learning new tasks. However, selecting valuable samples from previous classes for replay remains an open problem in class incremental learning. In this paper, we propose a novel sample selection strategy aimed at retaining effective samples from old classes to address the catastrophic forgetting issue. Specifically, we employ the influence function to evaluate the impact of each sample on model performance and then select important samples for replay. However, because samples selected solely for importance can be redundant, we also develop a diversity strategy to select samples from old classes that are not only important but also diverse. We conduct extensive empirical validation on the CIFAR-10 and CIFAR-100 datasets, and the results demonstrate that our proposed method outperforms the baselines, effectively alleviating the catastrophic forgetting issue in class incremental learning.
AB - Class incremental learning (CIL) has been attracting increasing attention in the computer vision and machine learning communities, where a well-known issue is catastrophic forgetting. To mitigate this issue, a popular approach is the replay-based strategy, which stores a small portion of past data and replays it when learning new tasks. However, selecting valuable samples from previous classes for replay remains an open problem in class incremental learning. In this paper, we propose a novel sample selection strategy aimed at retaining effective samples from old classes to address the catastrophic forgetting issue. Specifically, we employ the influence function to evaluate the impact of each sample on model performance and then select important samples for replay. However, because samples selected solely for importance can be redundant, we also develop a diversity strategy to select samples from old classes that are not only important but also diverse. We conduct extensive empirical validation on the CIFAR-10 and CIFAR-100 datasets, and the results demonstrate that our proposed method outperforms the baselines, effectively alleviating the catastrophic forgetting issue in class incremental learning.
KW - Catastrophic forgetting
KW - Class incremental learning
KW - Diversity
KW - Influence function
UR - http://www.scopus.com/inward/record.url?scp=85177453437&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-46314-3_13
DO - 10.1007/978-3-031-46314-3_13
M3 - Conference contribution
AN - SCOPUS:85177453437
SN - 9783031463136
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 164
EP - 175
BT - Image and Graphics - 12th International Conference, ICIG 2023, Proceedings
A2 - Lu, Huchuan
A2 - Liu, Risheng
A2 - Ouyang, Wanli
A2 - Huang, Hui
A2 - Lu, Jiwen
A2 - Dong, Jing
A2 - Xu, Min
PB - Springer Science and Business Media Deutschland GmbH
T2 - 12th International Conference on Image and Graphics, ICIG 2023
Y2 - 22 September 2023 through 24 September 2023
ER -