TY - JOUR
T1 - A Novel Sleep Mechanism Inspired Continual Learning Algorithm
AU - Han, Yuyang
AU - Li, Xiuxing
AU - Jia, Tianyuan
AU - Wang, Qixin
AU - Fan, Chaoqiong
AU - Wu, Xia
N1 - Publisher Copyright:
© 2024 Technical Committee on Guidance, Navigation and Control, CSAA.
PY - 2024/8/31
Y1 - 2024/8/31
N2 - Bayesian methods have emerged as an effective approach to mitigating catastrophic forgetting in continual learning (CL). One prominent example is Variational Continual Learning (VCL), which demonstrates remarkable performance in task-incremental learning (task-IL). However, class-incremental learning (class-IL) remains challenging for VCL, and the reasons behind this limitation are unclear. Owing to its sophisticated neural mechanisms, particularly memory consolidation during sleep, the human brain handles both task-IL and class-IL scenarios well, which provides insight for a brain-inspired VCL. To identify why VCL falls short in class-IL, we first conduct a comprehensive theoretical analysis of VCL. On this basis, we propose a novel Bayesian framework, Learning within Sleeping (LwS), that leverages memory consolidation. By simulating the distribution integration and generalization observed during memory consolidation in sleep, LwS realizes the principle of prior knowledge guiding posterior learning, as in VCL. In addition, by emulating the brain's memory-reactivation process, LwS imposes a feature-invariance constraint to mitigate the forgetting of learned knowledge. Experimental results demonstrate that LwS outperforms both Bayesian and non-Bayesian methods in task-IL and class-IL scenarios, further indicating the effectiveness of incorporating brain mechanisms into the design of novel CL approaches.
AB - Bayesian methods have emerged as an effective approach to mitigating catastrophic forgetting in continual learning (CL). One prominent example is Variational Continual Learning (VCL), which demonstrates remarkable performance in task-incremental learning (task-IL). However, class-incremental learning (class-IL) remains challenging for VCL, and the reasons behind this limitation are unclear. Owing to its sophisticated neural mechanisms, particularly memory consolidation during sleep, the human brain handles both task-IL and class-IL scenarios well, which provides insight for a brain-inspired VCL. To identify why VCL falls short in class-IL, we first conduct a comprehensive theoretical analysis of VCL. On this basis, we propose a novel Bayesian framework, Learning within Sleeping (LwS), that leverages memory consolidation. By simulating the distribution integration and generalization observed during memory consolidation in sleep, LwS realizes the principle of prior knowledge guiding posterior learning, as in VCL. In addition, by emulating the brain's memory-reactivation process, LwS imposes a feature-invariance constraint to mitigate the forgetting of learned knowledge. Experimental results demonstrate that LwS outperforms both Bayesian and non-Bayesian methods in task-IL and class-IL scenarios, further indicating the effectiveness of incorporating brain mechanisms into the design of novel CL approaches.
KW - Bayesian inference
KW - brain-inspired algorithm
KW - continual learning
KW - variational inference
UR - http://www.scopus.com/inward/record.url?scp=85199905548&partnerID=8YFLogxK
U2 - 10.1142/S2737480724410036
DO - 10.1142/S2737480724410036
M3 - Article
AN - SCOPUS:85199905548
SN - 2737-4807
VL - 4
JO - Guidance, Navigation and Control
JF - Guidance, Navigation and Control
IS - 3
M1 - 2441003
ER -