TY - JOUR
T1 - Feature Correlation-Guided Knowledge Transfer for Federated Self-Supervised Learning
AU - Liu, Yi
AU - Guo, Song
AU - Zhang, Jie
AU - Zhan, Yufeng
AU - Zhou, Qihua
AU - Wang, Yingchun
N1 - Publisher Copyright:
© 2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Extensive attention has been paid to applying self-supervised learning (SSL) approaches to federated learning (FL) to tackle the label scarcity problem. Previous works on federated SSL (FedSSL) generally fall into two categories: parameter-based model aggregation and data-based feature sharing, both of which achieve knowledge transfer among multiple unlabeled clients. Despite this progress, they inevitably rely on assumptions, such as homogeneous models or the existence of an additional public dataset, which hinder the universality of the training frameworks in more general scenarios (e.g., unlabeled clients with heterogeneous models). Therefore, in this article, we propose a novel and general method named federated self-supervised learning with feature-correlation-based aggregation (FedFoA) to tackle the above limitations. By exchanging feature correlations instead of model parameters or feature mappings, our approach reduces the discrepancies among local representation learning processes, thus promoting collaboration between heterogeneous clients. A factorization-based method is designed to extract the cross-feature relation matrix from local representations, which serves as the knowledge medium in the aggregation phase. We demonstrate that FedFoA is a heterogeneity-supportive and privacy-preserving training framework that is readily compatible with state-of-the-art FedSSL methods. Extensive empirical experiments demonstrate that our proposed approach outperforms the state-of-the-art methods by a significant margin.
AB - Extensive attention has been paid to applying self-supervised learning (SSL) approaches to federated learning (FL) to tackle the label scarcity problem. Previous works on federated SSL (FedSSL) generally fall into two categories: parameter-based model aggregation and data-based feature sharing, both of which achieve knowledge transfer among multiple unlabeled clients. Despite this progress, they inevitably rely on assumptions, such as homogeneous models or the existence of an additional public dataset, which hinder the universality of the training frameworks in more general scenarios (e.g., unlabeled clients with heterogeneous models). Therefore, in this article, we propose a novel and general method named federated self-supervised learning with feature-correlation-based aggregation (FedFoA) to tackle the above limitations. By exchanging feature correlations instead of model parameters or feature mappings, our approach reduces the discrepancies among local representation learning processes, thus promoting collaboration between heterogeneous clients. A factorization-based method is designed to extract the cross-feature relation matrix from local representations, which serves as the knowledge medium in the aggregation phase. We demonstrate that FedFoA is a heterogeneity-supportive and privacy-preserving training framework that is readily compatible with state-of-the-art FedSSL methods. Extensive empirical experiments demonstrate that our proposed approach outperforms the state-of-the-art methods by a significant margin.
KW - Contrastive learning
KW - federated learning (FL)
KW - QR decomposition
KW - self-supervised learning (SSL)
UR - http://www.scopus.com/inward/record.url?scp=86000727274&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2025.3541642
DO - 10.1109/TNNLS.2025.3541642
M3 - Article
AN - SCOPUS:86000727274
SN - 2162-237X
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
ER -