TY - GEN
T1 - Fed-AttGRU Privacy-preserving Federated Interest Recommendation
AU - Wan, Jun
AU - Chi, Cheng
AU - Yu, Haoyuan
AU - Liu, Yang
AU - Xu, Xiangrui
AU - Lyu, Hongmei
AU - Wang, Wei
N1 - Publisher Copyright:
© 2024 ACM.
PY - 2024/7/5
Y1 - 2024/7/5
N2 - Accurately predicting the next point of interest (NPOI) for trains in railway transportation is crucial for optimizing train schedules and routes. However, the check-in data used for modeling is sparse, making it difficult to model and predict preferences effectively. In addition, railway location data is highly sensitive, rendering traditional centralized training methods unsuitable. We therefore introduce Fed-AttGRU, a privacy-preserving recommendation method designed for highly sparse data. Specifically, Fed-AttGRU uses a Gated Recurrent Unit (GRU) together with attention mechanisms to build a trajectory prediction model that integrates both short-term and long-term preferences, allowing the sequence model to learn effectively from sparse check-in data. At the same time, Fed-AttGRU combines federated learning with differential privacy, enabling collaborative modeling without trajectory data leaving local devices and thereby avoiding the privacy leakage risks of centralized storage. On top of federated learning, the differential privacy mechanism adds noise to model parameters, defending against inference attacks by a malicious server while balancing privacy protection and recommendation performance. Experiments on the Foursquare-NYC and Foursquare-TKY datasets demonstrate the effectiveness of the method in balancing privacy and recommendation performance.
AB - Accurately predicting the next point of interest (NPOI) for trains in railway transportation is crucial for optimizing train schedules and routes. However, the check-in data used for modeling is sparse, making it difficult to model and predict preferences effectively. In addition, railway location data is highly sensitive, rendering traditional centralized training methods unsuitable. We therefore introduce Fed-AttGRU, a privacy-preserving recommendation method designed for highly sparse data. Specifically, Fed-AttGRU uses a Gated Recurrent Unit (GRU) together with attention mechanisms to build a trajectory prediction model that integrates both short-term and long-term preferences, allowing the sequence model to learn effectively from sparse check-in data. At the same time, Fed-AttGRU combines federated learning with differential privacy, enabling collaborative modeling without trajectory data leaving local devices and thereby avoiding the privacy leakage risks of centralized storage. On top of federated learning, the differential privacy mechanism adds noise to model parameters, defending against inference attacks by a malicious server while balancing privacy protection and recommendation performance. Experiments on the Foursquare-NYC and Foursquare-TKY datasets demonstrate the effectiveness of the method in balancing privacy and recommendation performance.
KW - Differential privacy
KW - Federated learning
KW - Next point of interest recommendation
UR - http://www.scopus.com/inward/record.url?scp=85200830671&partnerID=8YFLogxK
U2 - 10.1145/3674399.3674450
DO - 10.1145/3674399.3674450
M3 - Conference contribution
AN - SCOPUS:85200830671
T3 - ACM International Conference Proceeding Series
SP - 138
EP - 143
BT - Proceedings of ACM Turing Award Celebration Conference - CHINA 2024, TURC 2024
PB - Association for Computing Machinery
T2 - 2024 ACM Turing Award Celebration Conference China, TURC 2024
Y2 - 5 July 2024 through 7 July 2024
ER -
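
Note: the abstract above describes two mechanisms, a GRU-with-attention sequence model for next-POI prediction and a federated learning setup in which clients perturb their model parameters with differential-privacy noise before aggregation. The sketch below illustrates only that second step in a generic way; it is not the authors' implementation, and the clip norm, noise multiplier, update shapes, and function names are illustrative assumptions.

```python
# Illustrative sketch (not the paper's code) of differentially private
# federated aggregation: each client clips its local model update to a
# bounded L2 norm and adds Gaussian noise before the server averages.
import numpy as np

def privatize_update(update: np.ndarray, clip_norm: float = 1.0,
                     noise_multiplier: float = 0.5) -> np.ndarray:
    """Clip the update to L2 norm <= clip_norm, then add Gaussian noise."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    noise = np.random.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

def federated_average(noised_updates: list[np.ndarray]) -> np.ndarray:
    """Server-side aggregation over already-noised client updates."""
    return np.mean(noised_updates, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-ins for local GRU parameter updates from four clients.
    client_updates = [rng.normal(size=8) for _ in range(4)]
    noised = [privatize_update(u) for u in client_updates]
    print(federated_average(noised))
```

Clipping bounds each client's contribution so the Gaussian noise scale corresponds to a well-defined privacy level, and the server only ever sees noised updates, which is the property the abstract attributes to Fed-AttGRU's defense against a malicious server.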