Abstract
Continuous changes in the external environment cause performance regression in neural networks trained with traditional deep learning methods, so the continual learning (CL) area is gradually attracting more researchers. For edge intelligence, a CL model must not only overcome catastrophic forgetting but also face the major challenge of severely limited resources, which manifests mainly as a lack of labeled data and of powerful devices. Existing classic CL methods usually rely on a large number of labeled samples to maintain plasticity and stability, and the lack of labeled resources leads to a significant accuracy drop. Meanwhile, to cope with insufficient annotation resources, semi-supervised learning methods often pay a large computational and memory overhead for higher accuracy. In response to these problems, a low-cost semi-supervised CL method named edge hierarchical memory learner (EdgeHML) is proposed. EdgeHML can effectively utilize a large number of unlabeled samples together with a small number of labeled samples. It is based on a hierarchical memory pool, leveraging a multi-level storage structure to store and replay samples, and it implements the interaction between levels through a combination of online and offline strategies. In addition, to further reduce the computational overhead of unlabeled samples, EdgeHML leverages a progressive learning method that reduces the computation cycles spent on unlabeled samples by controlling the learning process. Experimental results show that on three semi-supervised CL tasks, EdgeHML improves model accuracy by up to 16.35% compared with classic CL methods, and training iteration time is reduced by more than 50% compared with semi-supervised methods. EdgeHML thus achieves a high-performance, low-overhead semi-supervised CL process for edge intelligence.
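To make the hierarchical memory pool idea concrete, the sketch below shows a minimal two-level replay buffer: a small "online" level that receives every incoming sample and a larger "offline" reservoir that absorbs evicted samples, with replay batches drawn from both levels. This is only an illustrative interpretation of the abstract's description; the class name, capacities, FIFO eviction, and reservoir-sampling policy are assumptions, not the paper's actual implementation.

```python
import random


class HierarchicalMemoryPool:
    """Illustrative two-level replay buffer (hypothetical sketch):
    - online level: small FIFO buffer holding the most recent samples
    - offline level: larger long-term reservoir refreshed from evictions
    """

    def __init__(self, online_cap=32, offline_cap=256, seed=0):
        self.online = []              # recent (sample, labeled) pairs
        self.offline = []             # long-term reservoir
        self.online_cap = online_cap
        self.offline_cap = offline_cap
        self.evicted_seen = 0         # eviction count, drives reservoir sampling
        self.rng = random.Random(seed)

    def add(self, sample, labeled):
        """Store an incoming sample in the online level; samples evicted
        from the online level migrate to the offline reservoir."""
        self.online.append((sample, labeled))
        if len(self.online) > self.online_cap:
            evicted = self.online.pop(0)        # FIFO eviction
            self._reservoir_insert(evicted)

    def _reservoir_insert(self, item):
        """Classic reservoir sampling: keep each evicted item with
        probability offline_cap / evicted_seen."""
        self.evicted_seen += 1
        if len(self.offline) < self.offline_cap:
            self.offline.append(item)
        else:
            j = self.rng.randrange(self.evicted_seen)
            if j < self.offline_cap:
                self.offline[j] = item

    def replay_batch(self, k):
        """Draw a mixed replay batch spanning both memory levels."""
        pool = self.online + self.offline
        return self.rng.sample(pool, min(k, len(pool)))
```

In a semi-supervised setting, the `labeled` flag lets the training loop treat replayed labeled samples with a supervised loss and unlabeled ones with a consistency or pseudo-label loss; a progressive-learning schedule could additionally skip unlabeled replay on some iterations to cut computation, as the abstract describes.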
| Translated title of the contribution | Hierarchical Memory Pool Based Edge Semi-supervised Continual Learning Method |
|---|---|
| Original language | Chinese (Traditional) |
| Pages (from-to) | 23-31 |
| Number of pages | 9 |
| Journal | Computer Science |
| Volume | 50 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 15 Feb 2023 |
| Externally published | Yes |