Weighted Contrastive Learning for Complementary Label Learning

  • Jiayi Liu
  • Yangyang Zhou
  • Jiabin Liu*
  • Yan Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Complementary label learning (CLL), in which each sample is annotated only with a category it does not belong to, has garnered increasing attention in the field of machine learning. Existing deep learning-based CLL methods mainly conduct experiments on unstructured data, using text or image datasets to measure final performance. Notably, tabular data, the core data form in key fields such as finance and healthcare, contains both continuous and discrete categorical features, with complex feature distributions and hidden correlations. This characteristic makes it difficult for traditional CLL models to effectively capture the nonlinear relationships between tabular features, resulting in weak feature learning capabilities. Moreover, existing contrastive learning mechanisms adapt poorly to tabular data. In response, this paper introduces a novel CLL model tailored to tabular data with complementary labels. Specifically, we combine CLL and contrastive learning in a multi-task model, where both self-supervised and supervised contrastive learning are employed for auxiliary feature extraction. In addition, the self-supervised and supervised contrastive objectives are constructed from different augmented views, and the probability score of the pseudo-label is used as a weight coefficient to enhance the supervised contrastive loss function. Furthermore, to better fit the tabular setting, we adopt a data augmentation mechanism designed for tabular features. The experimental results confirm that the proposed method performs strongly in the complementary label learning setting: it achieves an average accuracy improvement of 0.73%-2.66% over existing methods, outperforming all comparative baselines on every dataset.
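The weighted supervised contrastive objective described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, argument layout, and temperature value are assumptions, and the pseudo-labels and their probability scores are taken as given inputs (in the paper they would come from the CLL branch of the multi-task model).

```python
import torch
import torch.nn.functional as F

def weighted_supcon_loss(features, pseudo_labels, pseudo_probs, temperature=0.1):
    """Supervised contrastive loss with per-anchor weights (illustrative sketch).

    features:      (N, D) L2-normalized embeddings, e.g. from augmented views.
    pseudo_labels: (N,) pseudo-label assigned to each sample (hypothetical input).
    pseudo_probs:  (N,) probability score of each pseudo-label, used as the
                   weight coefficient on that anchor's contrastive term.
    """
    n = features.size(0)
    sim = features @ features.t() / temperature          # pairwise similarities
    not_self = ~torch.eye(n, dtype=torch.bool, device=features.device)
    # positives share the same pseudo-label (self excluded)
    pos_mask = (pseudo_labels.unsqueeze(0) == pseudo_labels.unsqueeze(1)) & not_self

    # log-softmax over all non-self pairs, kept finite by masking the exp sum
    exp_sim = torch.exp(sim) * not_self
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True))

    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    mean_log_prob_pos = (log_prob * pos_mask.float()).sum(dim=1) / pos_count

    # weight each anchor's loss by its pseudo-label probability score,
    # so low-confidence pseudo-labels contribute less to the gradient
    per_anchor = -pseudo_probs * mean_log_prob_pos
    valid = pos_mask.any(dim=1)                          # anchors with >= 1 positive
    return per_anchor[valid].mean()
```

The loss is linear in the weight coefficients, so doubling every pseudo-label probability doubles the loss; anchors whose pseudo-label confidence is low are correspondingly down-weighted in the supervised contrastive term.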

Original language: English
Pages (from-to): 4050-4071
Number of pages: 22
Journal: KSII Transactions on Internet and Information Systems
Volume: 19
Issue number: 11
DOIs
Publication status: Published - 30 Nov 2025
Externally published: Yes

Keywords

  • Complementary label learning
  • contrastive learning
  • tabular data
