Accelerating Gossip-Based Deep Learning in Heterogeneous Edge Computing Platforms

Rui Han, Shilin Li, Xiangwei Wang, Chi Harold Liu*, Gaofeng Xin, Lydia Y. Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

23 Citations (Scopus)

Abstract

With the exponential growth of data created at the network edge, decentralized and Gossip-based training of deep learning (DL) models on edge computing (EC) platforms is gaining tremendous research momentum, owing to its capability to learn from resource-constrained edge nodes with limited network connectivity. Today's edge devices are extremely heterogeneous, e.g., in their hardware and software stacks, which results in high variation in training time and induces extra delay to synchronize and converge. The large body of prior art accelerates DL training, whether through data or model parallelization, via a centralized server, e.g., the parameter-server scheme, which can easily become a system bottleneck or single point of failure. In this article, we propose EdgeGossip, a framework specifically designed to accelerate decentralized and Gossip-based DL training for heterogeneous EC platforms. EdgeGossip features: (i) low performance variation among multiple EC platforms during iterative training, and (ii) accuracy-aware training to quickly obtain the best possible model accuracy. We implement EdgeGossip on top of popular Gossip algorithms and demonstrate its effectiveness on real-world DL workloads, reducing model training time by an average of 2.70 times while incurring accuracy losses of only 0.78 percent.
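
As a rough illustration of the serverless scheme the abstract contrasts with the parameter server, below is a minimal sketch of randomized Gossip averaging in Python. It assumes a toy setup in which each edge node holds a plain parameter vector and one random pair of nodes averages its parameters per step; the node count, mixing weight, and step count are illustrative choices, not EdgeGossip's actual algorithm.

# A minimal sketch of randomized Gossip averaging, assuming a toy setup:
# each "edge node" holds a parameter vector, and at every step one random
# pair of nodes averages its parameters. This only illustrates the
# decentralized, serverless scheme described in the abstract; it is not
# EdgeGossip's actual algorithm.
import numpy as np

rng = np.random.default_rng(0)
num_nodes, dim = 8, 4  # illustrative sizes

# Each node starts from its own locally trained parameters.
params = [rng.normal(size=dim) for _ in range(num_nodes)]

def gossip_step(params):
    """One randomized gossip step: a random node pair averages its parameters."""
    i, j = rng.choice(len(params), size=2, replace=False)
    avg = 0.5 * (params[i] + params[j])
    params[i], params[j] = avg, avg.copy()

for _ in range(200):
    gossip_step(params)

# All nodes drift toward the global average without any central server,
# avoiding the parameter-server bottleneck and single point of failure.
print(np.std(np.stack(params), axis=0))  # per-dimension spread shrinks

Because each exchange involves only two peers rather than a global barrier, heterogeneous node speeds show up as slow individual exchanges instead of stalling every node, which is the performance-variation problem EdgeGossip targets.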

Original language: English
Article number: 9303468
Pages (from-to): 1591-1602
Number of pages: 12
Journal: IEEE Transactions on Parallel and Distributed Systems
Volume: 32
Issue number: 7
DOI
Publication status: Published - 1 Jul 2021
