Toward Secure and Robust Federated Distillation in Distributed Cloud: Challenges and Design Issues

Xiaodong Wang, Zhitao Guan, Longfei Wu, Keke Gai*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Federated learning (FL) offers a promising solution for effectively leveraging the data scattered across a distributed cloud system. Despite its potential, the huge communication overhead of FL heavily burdens the distributed cloud system. Federated distillation (FD) is a novel distributed learning technique with low communication cost, in which clients communicate only the model logits rather than the model parameters. However, FD faces challenges related to data heterogeneity and security. Additionally, the conventional aggregation method in FD is vulnerable to malicious uploads. In this article, we discuss the limitations of FL and the challenges of FD in the context of distributed cloud systems. To address these issues, we propose a blockchain-based framework for secure and robust FD. Specifically, we develop a pre-training data preparation method to reduce data distribution heterogeneity and an aggregation method to enhance the robustness of the aggregation process. Moreover, a committee/workers selection strategy is devised to optimize task allocation among clients. Experiments are conducted to evaluate the effectiveness of the proposed framework.
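
To illustrate the communication pattern the abstract describes, below is a minimal sketch of a federated distillation round in which clients upload logits computed on a shared public dataset and the server aggregates them with a coordinate-wise median. This is an illustrative assumption, not the framework proposed in the article: the function names, the public-dataset setup, and the median rule are generic placeholders for the blockchain-based, robust aggregation the authors design.

```python
import numpy as np

# Sketch of one federated distillation (FD) exchange.
# Assumption: clients share a small public dataset and upload per-sample
# soft logits instead of model parameters; the coordinate-wise median is a
# generic robust aggregation choice, not the paper's method.

def client_update(model_predict, public_inputs):
    """Run the client's local model on the public dataset and return
    its logits as an array of shape (num_samples, num_classes)."""
    return np.asarray([model_predict(x) for x in public_inputs])

def robust_aggregate(client_logits):
    """Server-side aggregation of uploaded logits.
    The coordinate-wise median limits the influence of a few malicious
    uploads compared with a plain mean."""
    stacked = np.stack(client_logits, axis=0)   # (num_clients, N, C)
    return np.median(stacked, axis=0)           # (N, C) consensus logits
```

Each client would then distill the consensus logits back into its local model, typically with a KL-divergence loss against its own predictions, so only logits (not parameters) ever cross the network.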

Original language: English
Pages (from-to): 151-157
Number of pages: 7
Journal: IEEE Network
Volume: 38
Issue number: 4
DOIs
Publication status: Published - 2024
Externally published: Yes
