TY - GEN
T1 - Interless
T2 - 36th Chinese Control and Decision Conference, CCDC 2024
AU - Ma, Ruifeng
AU - Zhan, Yufeng
AU - Yan, Tijin
AU - Xia, Yuanqing
AU - Ali, Yasir
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Serverless is an emerging cloud computing paradigm that allows functions to share resources. However, resource sharing among functions introduces interference, which degrades performance. Existing resource prediction approaches ignore function instance placement and the interference between functions, so they cannot predict resource usage at a fine granularity. This paper proposes Interless, an interference-aware resource prediction system for serverless computing built on a sequence-to-sequence neural network. Interless's encoder learns function instance interference directly through a TPA-LSTM module, which also captures historical request queuing for better prediction. Interless's decoder contains a GRU module for long-horizon time-series prediction, which is essential for reserving time for function scheduling and warm-up. Moreover, long-horizon prediction helps Interless identify system anomalies and cyber threats by comparing monitored and predicted resource consumption. We implement Interless on top of Docker Swarm as a serverless system with resource prediction. Experimental results demonstrate that Interless reduces the MAPE, RSE, and SMAPE of prediction by 64%, 58%, and 65%, respectively, compared to state-of-the-art approaches.
AB - Serverless is an emerging cloud computing paradigm that allows functions to share resources. However, resource sharing among functions introduces interference, which degrades performance. Existing resource prediction approaches ignore function instance placement and the interference between functions, so they cannot predict resource usage at a fine granularity. This paper proposes Interless, an interference-aware resource prediction system for serverless computing built on a sequence-to-sequence neural network. Interless's encoder learns function instance interference directly through a TPA-LSTM module, which also captures historical request queuing for better prediction. Interless's decoder contains a GRU module for long-horizon time-series prediction, which is essential for reserving time for function scheduling and warm-up. Moreover, long-horizon prediction helps Interless identify system anomalies and cyber threats by comparing monitored and predicted resource consumption. We implement Interless on top of Docker Swarm as a serverless system with resource prediction. Experimental results demonstrate that Interless reduces the MAPE, RSE, and SMAPE of prediction by 64%, 58%, and 65%, respectively, compared to state-of-the-art approaches.
KW - anomaly detection
KW - deep learning
KW - resource prediction
KW - serverless computing
KW - time-series
UR - http://www.scopus.com/inward/record.url?scp=85200322943&partnerID=8YFLogxK
U2 - 10.1109/CCDC62350.2024.10588201
DO - 10.1109/CCDC62350.2024.10588201
M3 - Conference contribution
AN - SCOPUS:85200322943
T3 - Proceedings of the 36th Chinese Control and Decision Conference, CCDC 2024
SP - 3783
EP - 3788
BT - Proceedings of the 36th Chinese Control and Decision Conference, CCDC 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 25 May 2024 through 27 May 2024
ER -