TY - JOUR
T1 - Federated Learning With Only Positive Labels by Exploring Label Correlations
AU - An, Xuming
AU - Wang, Dui
AU - Shen, Li
AU - Luo, Yong
AU - Hu, Han
AU - Du, Bo
AU - Wen, Yonggang
AU - Tao, Dacheng
N1 - Publisher Copyright:
© 2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Federated learning (FL) aims to collaboratively learn a model by using the data from multiple users under privacy constraints. In this article, we study the multilabel classification (MLC) problem under the FL setting, where a trivial solution and extremely poor performance may be obtained, especially when only positive data with respect to a single class label is provided for each client. This issue can be addressed by adding a specially designed regularizer on the server side. Although sometimes effective, this approach simply ignores the label correlations, and thus suboptimal performance may be obtained. Besides, it is expensive and unsafe to frequently exchange users' private embeddings between the server and clients, especially when training the model in a contrastive way. To remedy these drawbacks, we propose a novel and generic method termed federated averaging by exploring label correlations (FedALC). Specifically, FedALC estimates the label correlations of different label pairs in the class embedding learning and utilizes them to improve the model training. To further improve safety and reduce the communication overhead, we propose a variant that learns a fixed class embedding for each client, so that the server and clients only need to exchange class embeddings once. Extensive experiments on multiple popular datasets demonstrate that our FedALC can significantly outperform existing counterparts.
AB - Federated learning (FL) aims to collaboratively learn a model by using the data from multiple users under privacy constraints. In this article, we study the multilabel classification (MLC) problem under the FL setting, where a trivial solution and extremely poor performance may be obtained, especially when only positive data with respect to a single class label is provided for each client. This issue can be addressed by adding a specially designed regularizer on the server side. Although sometimes effective, this approach simply ignores the label correlations, and thus suboptimal performance may be obtained. Besides, it is expensive and unsafe to frequently exchange users' private embeddings between the server and clients, especially when training the model in a contrastive way. To remedy these drawbacks, we propose a novel and generic method termed federated averaging by exploring label correlations (FedALC). Specifically, FedALC estimates the label correlations of different label pairs in the class embedding learning and utilizes them to improve the model training. To further improve safety and reduce the communication overhead, we propose a variant that learns a fixed class embedding for each client, so that the server and clients only need to exchange class embeddings once. Extensive experiments on multiple popular datasets demonstrate that our FedALC can significantly outperform existing counterparts.
KW - Federated learning (FL)
KW - fixed class embedding
KW - label correlation
KW - multilabel
KW - positive label
UR - http://www.scopus.com/inward/record.url?scp=105002287735&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2024.3396875
DO - 10.1109/TNNLS.2024.3396875
M3 - Article
C2 - 38743541
AN - SCOPUS:105002287735
SN - 2162-237X
VL - 36
SP - 7651
EP - 7665
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 4
ER -