TY - JOUR
T1 - A federated learning method based on class prototype guided classifier for long-tailed data
AU - Li, Yang
AU - Liu, Xin
AU - Li, Kan
N1 - Publisher Copyright:
© The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2024.
PY - 2024/12
Y1 - 2024/12
N2 - In federated learning, training on long-tailed data frequently leads to biased classifiers due to the severe imbalance in sample counts between majority and minority classes. Prototype-based methods have proven effective at capturing underlying representations in federated learning, and recent studies have leveraged them to improve performance. However, class prototypes can be influenced by sample size, class size, and the number of iterations. In this work, we propose a prototype-based federated learning method for long-tailed data that retrains the classifier: the global prototypes aggregated from local prototypes are frozen and used as a regularization term to guide local training. Compared with previous prototype-based methods, our approach focuses on how class prototypes are expressed differently under long-tailed data, reduces the classifier’s reliance on majority-class samples, and concentrates more on minority classes. Notably, our method introduces no extra parameters or communication costs. We conduct experiments on image classification tasks under various settings, and our method outperforms all baselines.
AB - In federated learning, training on long-tailed data frequently leads to biased classifiers due to the severe imbalance in sample counts between majority and minority classes. Prototype-based methods have proven effective at capturing underlying representations in federated learning, and recent studies have leveraged them to improve performance. However, class prototypes can be influenced by sample size, class size, and the number of iterations. In this work, we propose a prototype-based federated learning method for long-tailed data that retrains the classifier: the global prototypes aggregated from local prototypes are frozen and used as a regularization term to guide local training. Compared with previous prototype-based methods, our approach focuses on how class prototypes are expressed differently under long-tailed data, reduces the classifier’s reliance on majority-class samples, and concentrates more on minority classes. Notably, our method introduces no extra parameters or communication costs. We conduct experiments on image classification tasks under various settings, and our method outperforms all baselines.
KW - Class prototypes
KW - Deep learning
KW - Federated learning
KW - Long-tailed data
UR - http://www.scopus.com/inward/record.url?scp=85201955518&partnerID=8YFLogxK
U2 - 10.1007/s11760-024-03525-2
DO - 10.1007/s11760-024-03525-2
M3 - Article
AN - SCOPUS:85201955518
SN - 1863-1703
VL - 18
SP - 8999
EP - 9007
JO - Signal, Image and Video Processing
JF - Signal, Image and Video Processing
IS - 12
ER -