Boosting Semi-Supervised Federated Learning with Model Personalization and Client-Variance-Reduction

Shuai Wang*, Yanqing Xu, Yanli Yuan, Xiuhua Wang, Tony Q.S. Quek

*Corresponding author of this work

Research output: Contribution to journal › Conference article › Peer-reviewed

4 Citations (Scopus)

Abstract

Recently, federated learning (FL) has become increasingly appealing in distributed signal processing and machine learning. Nevertheless, the practical challenges of label deficiency and client heterogeneity form a bottleneck to its wide adoption. Although numerous efforts have been devoted to semi-supervised FL, most of the adopted algorithms follow the same spirit as FedAvg and thus suffer heavily from the adverse effects of client heterogeneity. In this paper, we boost semi-supervised FL by addressing this issue with model personalization and client-variance-reduction. In particular, we propose a novel and unified problem formulation based on pseudo-labeling and model interpolation. We then propose an effective algorithm, named FedCPSL, which judiciously adopts a novel momentum-based client-variance-reduction scheme and normalized averaging. The convergence of FedCPSL is analyzed, showing that it is resilient to client heterogeneity and achieves a sublinear convergence rate. Experimental results on image classification tasks are also presented to demonstrate the efficacy of FedCPSL over benchmark algorithms.
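The ingredients named in the abstract can be illustrated with a toy sketch. The code below is an assumption-laden illustration, not the paper's actual FedCPSL: it uses simple quadratic per-client losses, a STORM-style momentum tracker as a stand-in for the momentum-based client-variance-reduction, convex interpolation of local and global models as the personalization step, and FedNova-style normalized averaging over heterogeneous local-step counts. All function names and hyperparameters are illustrative.

```python
import numpy as np

def client_update(w_global, state, c, steps, lr=0.2, beta=0.8, alpha=0.5):
    """One client's local work; toy loss 0.5*||w - c||^2 with optimum c."""
    grad = lambda w: w - c
    w, w_old, d = w_global.copy(), state["w_old"], state["d"]
    for _ in range(steps):
        # momentum-based variance-reduction tracker (STORM-style stand-in):
        # the correction term (d - grad(w_old)) damps stochastic drift
        d = grad(w) + (1 - beta) * (d - grad(w_old))
        w_old = w.copy()
        w = w - lr * d
    state["w_old"], state["d"] = w_old, d
    personalized = alpha * w + (1 - alpha) * w_global  # model interpolation
    return (w - w_global) / steps, personalized        # per-step (normalized) delta

def server_round(w_global, clients, states, steps_list):
    """Normalized averaging: average per-step deltas, rescale by mean steps,
    so clients running more local steps do not dominate the global update."""
    deltas = [client_update(w_global, s, c, k)[0]
              for c, s, k in zip(clients, states, steps_list)]
    return w_global + np.mean(deltas, axis=0) * np.mean(steps_list)

# usage: three heterogeneous clients with different optima and step budgets
clients = [np.array([0.0, 1.0]), np.array([1.0, 2.0]), np.array([2.0, 3.0])]
states = [{"w_old": np.zeros(2), "d": np.zeros(2)} for _ in clients]
w = np.zeros(2)
for _ in range(60):
    w = server_round(w, clients, states, steps_list=[1, 3, 5])
```

After training, the global model settles near the clients' average optimum while each client keeps an interpolated personalized model biased toward its own data, which is the mechanism by which personalization absorbs client heterogeneity.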
