Abstract
While deep learning has achieved remarkable results in text classification, incremental learning for text classification remains a challenge. The main problem is that models suffer from catastrophic forgetting: when labelled data arrives sequentially and the model is trained in sequence, it forgets the knowledge it learned before. In this study, we propose methods for preventing catastrophic forgetting that handle unbalanced incremental data. As an improvement over experience replay, our approaches improve accuracy by about 23.3% using 23% of all training data on Yahoo and by 9.5% using 12% of all training data on DBPedia.
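The abstract builds on experience replay, i.e. retaining a small buffer of examples from earlier classes and mixing them into training on new classes so the model does not forget them. The sketch below is only a rough illustration of that general idea, not the authors' implementation; the names (`ReplayBuffer`, `train_task`, `per_class`, `replay_batch`) and the per-class buffering policy are assumptions for the example.

```python
# Minimal sketch of class-incremental training with an experience-replay
# buffer (hypothetical names; not the paper's code).
import random
import torch
import torch.nn as nn


class ReplayBuffer:
    """Keeps a small sample of past (feature_tensor, label) pairs per class."""

    def __init__(self, per_class=50):
        self.per_class = per_class
        self.store = {}  # label -> list of stored feature tensors

    def add(self, features, labels):
        # Store up to `per_class` examples for each class seen so far.
        for x, y in zip(features, labels.tolist()):
            bucket = self.store.setdefault(y, [])
            if len(bucket) < self.per_class:
                bucket.append(x.detach().cpu())

    def sample(self, batch_size):
        # Draw a mixed batch of old-class examples, if any are stored.
        pool = [(x, y) for y, xs in self.store.items() for x in xs]
        if not pool:
            return None
        batch = random.sample(pool, min(batch_size, len(pool)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.tensor(ys)


def train_task(model, loader, buffer, optimizer, replay_batch=32):
    """Train on the current task while replaying examples of earlier classes."""
    criterion = nn.CrossEntropyLoss()
    model.train()
    for features, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(features), labels)
        replayed = buffer.sample(replay_batch)
        if replayed is not None:
            old_x, old_y = replayed
            # Add a loss term on replayed examples to reduce forgetting.
            loss = loss + criterion(model(old_x), old_y)
        loss.backward()
        optimizer.step()
        buffer.add(features, labels)
```

In this sketch the replayed loss is simply added to the current-task loss each step; how the paper balances old and new classes under unbalanced incremental data is described in the article itself.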
| Original language | English |
| --- | --- |
| Article number | 012001 |
| Journal | Journal of Physics: Conference Series |
| Volume | 2513 |
| Issue | 1 |
| DOI | https://doi.org/10.1088/1742-6596/2513/1/012001 |
| Publication status | Published - 2023 |
| Event | 2023 7th International Conference on Artificial Intelligence, Automation and Control Technologies, AIACT 2023 - Virtual, Online, China. Duration: 24 Feb 2023 → 26 Feb 2023 |
Cite this
Chen, L., Zhang, H., Wushour, S., & Li, Y. (2023). Unbalanced Class-incremental Learning for Text Classification Based on Experience Replay. Journal of Physics: Conference Series, 2513(1), Article 012001. https://doi.org/10.1088/1742-6596/2513/1/012001