Dynamic Loss for Robust Learning

Shenwang Jiang, Jianan Li*, Jizhou Zhang, Ying Wang, Tingfa Xu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Label noise and class imbalance are common challenges encountered in real-world datasets. Existing approaches for robust learning often focus on addressing either label noise or class imbalance individually, resulting in suboptimal performance when both biases are present. To bridge this gap, this work introduces a novel meta-learning-based dynamic loss that adapts the objective functions during the training process to effectively learn a classifier from long-tailed noisy data. Specifically, our dynamic loss consists of two components: a label corrector and a margin generator. The label corrector is responsible for correcting noisy labels, while the margin generator generates per-class classification margins by capturing the underlying data distribution and the learning state of the classifier. In addition, we employ a hierarchical sampling strategy that enriches a small amount of unbiased metadata with diverse and challenging samples. This enables the joint optimization of the two components in the dynamic loss through meta-learning, allowing the classifier to effectively adapt to clean and balanced test data. Extensive experiments conducted on multiple real-world and synthetic datasets with various types of data biases, including CIFAR-10/100, Animal-10N, ImageNet-LT, and Webvision, demonstrate that our method achieves state-of-the-art accuracy.
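The two components described above can be illustrated with a minimal NumPy sketch. This is a hypothetical simplification, not the authors' implementation: `alpha` stands in for the label corrector's per-sample confidence in the observed label, and `margins` for the margin generator's per-class output; in the paper both are produced by meta-learned networks rather than passed in directly.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def dynamic_loss(logits, onehot, alpha, margins):
    """Sketch of a dynamic loss with label correction and class margins.

    alpha:   (N,) per-sample confidence in the observed label
             (assumed output of the label corrector)
    margins: (C,) per-class margins subtracted from the logits
             (assumed output of the margin generator)
    """
    pred = softmax(logits)
    # Label corrector: blend the observed one-hot label with the
    # model's own prediction, weighted by alpha.
    target = alpha[:, None] * onehot + (1.0 - alpha[:, None]) * pred
    # Margin generator: shift each class's logit by its margin,
    # enlarging the decision boundary for tail classes.
    adj = softmax(logits - margins[None, :])
    # Cross-entropy between corrected targets and margin-adjusted predictions.
    return -(target * np.log(adj + 1e-12)).sum(axis=-1).mean()
```

With `alpha = 1` and zero margins this reduces to standard cross-entropy; in the paper, both quantities are instead optimized via meta-learning on the hierarchically sampled metadata so that the loss adapts during training.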

Original language: English
Pages (from-to): 14420-14434
Number of pages: 15
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 45
Issue number: 12
DOIs
Publication status: Published - 1 Dec 2023

Keywords

  • Robust learning
  • class imbalance
  • label noise
  • meta learning

