Anchor Model-Based Hybrid Hierarchical Federated Learning with Overlap SGD

Ousman Manjang, Yanlong Zhai*, Jun Shen, Jude Tchaye-Kondi, Liehuang Zhu

*Corresponding author of this work

Research output: Contribution to journal › Article › peer-review

Abstract

Federated learning (FL) is a distributed machine learning framework where multiple clients collaboratively train a model without sharing their data. Despite advancements, traditional FL methods encounter challenges including communication overhead, extended latency, and slow convergence. To address these issues, this paper introduces Anchor-HHFL, a novel approach that combines the strengths of synchronous and asynchronous FL. Anchor-HHFL employs multi-tier edge servers which conduct partial model aggregation and reduce the frequency of communication with the central server. Anchor-HHFL implements a novel divergence control method through hierarchical pullback. It orchestrates the sequence of each client's stochastic gradient descent (SGD) updates to pull the locally trained models towards an anchor model, ensuring alignment and minimizing divergence. Simultaneously, a secondary process collects client models without disrupting their ongoing local computations and transmits them to edge servers, thereby overlapping computation with communication, substantially enhancing the training speed. Additionally, to effectively handle asynchronous updates across clusters, Anchor-HHFL uses a heuristic weight assignment for global aggregation, weighting clients' updates based on the degree of their divergence from the global model. Extensive experiments on MNIST and CIFAR-10 datasets demonstrate Anchor-HHFL's superiority, achieving up to 3× faster convergence and higher test accuracy compared to the baselines.
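The abstract's two core mechanisms, pulling each client's SGD update toward an anchor model and weighting global aggregation by divergence from the global model, can be sketched as follows. This is a minimal illustration with assumed forms (a proximal-style pull term `mu * (w - anchor)` and inverse-divergence weights), not the paper's exact algorithm:

```python
import numpy as np

def pullback_sgd_step(local_w, grad, anchor_w, lr=0.1, mu=0.01):
    """One local SGD step with a pull toward the anchor model.

    The pull term mu * (local_w - anchor_w) is an assumed proximal-style
    form; the paper's hierarchical pullback rule may differ in detail.
    """
    return local_w - lr * (grad + mu * (local_w - anchor_w))

def divergence_weighted_aggregate(client_ws, global_w, eps=1e-8):
    """Heuristic aggregation: clients whose models diverge less from the
    global model receive larger weights (assumed inverse-divergence rule).
    """
    divs = np.array([np.linalg.norm(w - global_w) for w in client_ws])
    weights = 1.0 / (divs + eps)
    weights /= weights.sum()
    return sum(wt * cw for wt, cw in zip(weights, client_ws))
```

For example, with a zero anchor and zero gradient, the pullback step alone shrinks the local weights toward the anchor, which is the divergence-control behavior the abstract describes.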

Original language: English
Pages (from-to): 12540-12557
Number of pages: 18
Journal: IEEE Transactions on Mobile Computing
Volume: 23
Issue number: 12
DOI
Publication status: Published - 2024
