Making Serverless Not so Cold in Edge Clouds: A Cost-Effective Online Approach

Ke Xiao, Song Yang*, Fan Li, Liehuang Zhu, Xu Chen, Xiaoming Fu

*Corresponding author of this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

Applying the serverless paradigm to edge computing improves edge resource utilization while bringing the benefits of flexible scaling and pay-as-you-go to latency-sensitive applications. This extends the boundaries of serverless computing and improves the quality of service for Function-as-a-Service users. However, as an emerging cloud computing paradigm, serverless edge computing faces pressing challenges, one of the biggest being the delay caused by excessively long container cold starts. Cold start delay is defined as the time between when a serverless function is triggered and when it begins to execute, and it seriously degrades resource utilization and Quality of Service (QoS). In this article, we study how to minimize the total system cost by caching function containers and selecting routes for neighboring functions via edge or public clouds. We prove that the proposed problem is NP-hard even in the special case where the user request contains only one function, and that the unpredictability of user requests and the coupling between decisions in adjacent time slots require the problem to be solved in an online fashion. We then design the Online Lazy Caching algorithm, an online algorithm with a worst-case competitive ratio guarantee that uses randomized dependent rounding to solve the problem. Extensive simulation results show that the proposed online algorithm achieves close-to-optimal performance in terms of both total cost and cold start cost compared to other existing algorithms, with average improvements of 31.6% and 51.7%, respectively.
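The abstract refers to a randomized dependent rounding step for turning fractional caching decisions into integral ones. The sketch below is only a generic illustration of the standard pairwise dependent rounding primitive, not the paper's Online Lazy Caching procedure; the function name `dependent_round`, the tolerance `eps`, and the example input are all illustrative assumptions. Each pairing step keeps the expected value of every entry equal to its fractional value and preserves the pair's sum, which is the property such rounding schemes rely on.

```python
import random

def dependent_round(x, eps=1e-9):
    """Pairwise randomized dependent rounding (illustrative sketch only).

    Rounds fractional values x[i] in [0, 1] to {0, 1} so that
    E[output[i]] = x[i] and the total sum is preserved at each pairing step.
    """
    x = list(x)
    frac = [i for i, v in enumerate(x) if eps < v < 1 - eps]
    while len(frac) >= 2:
        i, j = frac[0], frac[1]
        alpha = min(1 - x[i], x[j])   # how far we can push mass toward x[i]
        beta = min(x[i], 1 - x[j])    # how far we can push mass toward x[j]
        if random.random() < beta / (alpha + beta):
            x[i], x[j] = x[i] + alpha, x[j] - alpha
        else:
            x[i], x[j] = x[i] - beta, x[j] + beta
        # at least one of the pair became integral; refresh the fractional set
        frac = [k for k in frac if eps < x[k] < 1 - eps]
    # at most one fractional entry can remain; round it on its own marginal
    return [1 if random.random() < v else 0 for v in x]

# Example: fractional caching decisions for four function containers
print(dependent_round([0.3, 0.7, 0.5, 0.5]))
```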

Original language: English
Pages (from-to): 8789-8802
Number of pages: 14
Journal: IEEE Transactions on Mobile Computing
Volume: 23
Issue number: 9
DOI
Publication status: Published - 2024
