TY - JOUR
T1 - CEFL: Online Admission Control, Data Scheduling, and Accuracy Tuning for Cost-Efficient Federated Learning across Edge Nodes
AU - Zhou, Zhi
AU - Yang, Song
AU - Pu, Lingjun
AU - Yu, Shuai
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2020/10
Y1 - 2020/10
N2 - With the proliferation of the Internet of Things (IoT), massive volumes of data are generated at the network edge, creating an urgent need to push the frontiers of artificial intelligence (AI) to the network edge so as to fully unleash the potential of IoT big data. To materialize this vision, known as edge intelligence, federated learning is emerging as a promising solution that enables edge nodes to collaboratively learn a shared model in a privacy-preserving and communication-efficient manner by keeping the data at the edge nodes. While pilot efforts on federated learning have mostly focused on reducing the communication overhead, the computation efficiency of resource-constrained edge nodes has been largely overlooked. To bridge this gap, in this article, we investigate how to coordinate the edge and the cloud to optimize the system-wide cost efficiency of federated learning. Leveraging Lyapunov optimization theory, we design and analyze a cost-efficient optimization framework, CEFL, that makes online yet near-optimal control decisions on admission control, load balancing, data scheduling, and accuracy tuning for dynamically arriving training data samples, reducing both computation and communication costs. In particular, our control framework CEFL can be flexibly extended to incorporate various design choices and practical requirements of federated learning, such as exploiting cheaper cloud resources for model training with better cost efficiency while still facilitating on-demand privacy preservation. Via both rigorous theoretical analysis and extensive trace-driven evaluations, we verify the cost efficiency of the proposed CEFL framework.
AB - With the proliferation of the Internet of Things (IoT), massive volumes of data are generated at the network edge, creating an urgent need to push the frontiers of artificial intelligence (AI) to the network edge so as to fully unleash the potential of IoT big data. To materialize this vision, known as edge intelligence, federated learning is emerging as a promising solution that enables edge nodes to collaboratively learn a shared model in a privacy-preserving and communication-efficient manner by keeping the data at the edge nodes. While pilot efforts on federated learning have mostly focused on reducing the communication overhead, the computation efficiency of resource-constrained edge nodes has been largely overlooked. To bridge this gap, in this article, we investigate how to coordinate the edge and the cloud to optimize the system-wide cost efficiency of federated learning. Leveraging Lyapunov optimization theory, we design and analyze a cost-efficient optimization framework, CEFL, that makes online yet near-optimal control decisions on admission control, load balancing, data scheduling, and accuracy tuning for dynamically arriving training data samples, reducing both computation and communication costs. In particular, our control framework CEFL can be flexibly extended to incorporate various design choices and practical requirements of federated learning, such as exploiting cheaper cloud resources for model training with better cost efficiency while still facilitating on-demand privacy preservation. Via both rigorous theoretical analysis and extensive trace-driven evaluations, we verify the cost efficiency of the proposed CEFL framework.
KW - Distributed learning
KW - edge computing
KW - edge intelligence
KW - federated learning
KW - online scheduling
UR - http://www.scopus.com/inward/record.url?scp=85089312230&partnerID=8YFLogxK
U2 - 10.1109/JIOT.2020.2984332
DO - 10.1109/JIOT.2020.2984332
M3 - Article
AN - SCOPUS:85089312230
SN - 2327-4662
VL - 7
SP - 9341
EP - 9356
JO - IEEE Internet of Things Journal
JF - IEEE Internet of Things Journal
IS - 10
M1 - 9051991
ER -