Co-LDL: A Co-Training-Based Label Distribution Learning Method for Tackling Label Noise

Zeren Sun, Huafeng Liu, Qiong Wang*, Tianfei Zhou, Qi Wu, Zhenmin Tang

*Corresponding author of this work

Research output: Contribution to journal › Article › peer-review

23 Citations (Scopus)

Abstract

The performance of deep neural networks is prone to degradation under label noise due to their powerful capacity to fit training data. Treating low-loss instances as clean data is one of the most promising strategies for tackling label noise and has been widely adopted by state-of-the-art methods. However, prior works tend to drop high-loss instances outright, neglecting the valuable information they carry. To address this issue, we propose an end-to-end framework named Co-LDL, which incorporates the low-loss sample selection strategy with label distribution learning. Specifically, we simultaneously train two deep neural networks and let them communicate useful knowledge by selecting low-loss and high-loss samples for each other. Low-loss samples are leveraged conventionally to update network parameters. In contrast, high-loss samples are trained in a label distribution learning manner to update network parameters and label distributions concurrently. Moreover, we propose a self-supervised module that further boosts model performance by enhancing the learned representations. Comprehensive experiments on both synthetic and real-world noisy datasets demonstrate the superiority of our Co-LDL method over state-of-the-art approaches to learning with noisy labels. The source code and models have been made available at https://github.com/NUST-Machine-Intelligence-Laboratory/CoLDL.
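The sample-exchange step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `co_select`, the `clean_ratio` parameter, and the use of NumPy are all assumptions introduced here for clarity. Each network ranks the batch by its own per-sample loss, hands its low-loss picks to the peer for conventional training, and routes its high-loss picks to label distribution learning instead of discarding them.

```python
import numpy as np

def co_select(losses_a, losses_b, clean_ratio):
    """Co-training-style sample exchange (illustrative sketch).

    losses_a, losses_b: per-sample losses from the two networks.
    clean_ratio: assumed fraction of the batch treated as clean.
    Returns index sets: network A's low/high-loss picks (used to train B)
    and network B's low/high-loss picks (used to train A).
    """
    k = int(clean_ratio * len(losses_a))
    order_a = np.argsort(losses_a)  # ascending loss under network A
    order_b = np.argsort(losses_b)  # ascending loss under network B
    low_a, high_a = order_a[:k], order_a[k:]
    low_b, high_b = order_b[:k], order_b[k:]
    # In Co-LDL, high-loss samples are NOT dropped: they are trained in a
    # label distribution learning manner, updating network parameters and
    # label distributions concurrently.
    return low_a, high_a, low_b, high_b

losses_a = np.array([0.1, 2.0, 0.3, 5.0])
losses_b = np.array([4.0, 0.2, 0.5, 3.0])
low_a, high_a, low_b, high_b = co_select(losses_a, losses_b, clean_ratio=0.5)
```

With `clean_ratio=0.5`, network A nominates samples 0 and 2 as clean for its peer, while samples 1 and 3 go to the label-distribution branch; network B does the same symmetrically.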

Original language: English
Pages (from-to): 1093-1104
Number of pages: 12
Journal: IEEE Transactions on Multimedia
Volume: 24
DOI
Publication status: Published - 2022
Externally published: Yes
