TY - JOUR
T1 - Proactive Caching at the Wireless Edge
T2 - A Novel Predictive User Popularity-Aware Approach
AU - Wan, Yunye
AU - Chen, Peng
AU - Xia, Yunni
AU - Ma, Yong
AU - Zhu, Dongge
AU - Wang, Xu
AU - Liu, Hui
AU - Li, Weiling
AU - Niu, Xianhua
AU - Xu, Lei
AU - Dong, Yumin
N1 - Publisher Copyright:
© 2024 Tech Science Press. All rights reserved.
PY - 2024
Y1 - 2024
N2 - Mobile Edge Computing (MEC) is a promising technology that provides on-demand computing and efficient storage services as close to end users as possible. In an MEC environment, servers are deployed closer to mobile terminals to exploit storage infrastructure, improve content delivery efficiency, and enhance user experience. However, due to the limited capacity of edge servers, it remains a significant challenge to meet users’ time-varying and customized demands for highly diversified content. Recently, caching content at the edge has become a popular means of addressing this challenge, as it fills the communication gap between users and content providers while relieving pressure on remote cloud servers. However, existing static caching strategies remain inefficient at handling the time-varying popularity of content and at meeting users’ demands for highly diversified entity data. To address this challenge, we introduce a novel method for content caching over MEC, named PRIME. It combines a content popularity prediction model, which takes users’ stay time and request traces as inputs, with a deep reinforcement learning model that yields dynamic caching schedules. Experimental results demonstrate that PRIME, when tested on the MovieLens 1M dataset for user request patterns and the Shanghai Telecom dataset for user mobility, outperforms its peers in terms of cache hit rates, transmission latency, and system cost.
AB - Mobile Edge Computing (MEC) is a promising technology that provides on-demand computing and efficient storage services as close to end users as possible. In an MEC environment, servers are deployed closer to mobile terminals to exploit storage infrastructure, improve content delivery efficiency, and enhance user experience. However, due to the limited capacity of edge servers, it remains a significant challenge to meet users’ time-varying and customized demands for highly diversified content. Recently, caching content at the edge has become a popular means of addressing this challenge, as it fills the communication gap between users and content providers while relieving pressure on remote cloud servers. However, existing static caching strategies remain inefficient at handling the time-varying popularity of content and at meeting users’ demands for highly diversified entity data. To address this challenge, we introduce a novel method for content caching over MEC, named PRIME. It combines a content popularity prediction model, which takes users’ stay time and request traces as inputs, with a deep reinforcement learning model that yields dynamic caching schedules. Experimental results demonstrate that PRIME, when tested on the MovieLens 1M dataset for user request patterns and the Shanghai Telecom dataset for user mobility, outperforms its peers in terms of cache hit rates, transmission latency, and system cost.
KW - Mobile edge computing
KW - collaborative caching
KW - content caching
KW - deep reinforcement learning
KW - system average cost
UR - http://www.scopus.com/inward/record.url?scp=85195067978&partnerID=8YFLogxK
U2 - 10.32604/cmes.2024.048723
DO - 10.32604/cmes.2024.048723
M3 - Article
AN - SCOPUS:85195067978
SN - 1526-1492
VL - 140
SP - 1997
EP - 2017
JO - CMES - Computer Modeling in Engineering and Sciences
JF - CMES - Computer Modeling in Engineering and Sciences
IS - 2
ER -