TY - GEN
T1 - Achieving Adaptive Privacy-Preserving Graph Neural Networks Training in Cloud Environment
AU - Yuan, Yanli
AU - Lei, Dian
AU - Fan, Qing
AU - Zhao, Keli
AU - Zhu, Liehuang
AU - Zhang, Chuan
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - With the widespread adoption of Graph Neural Network (GNN) technology in industry, concerns regarding graph data privacy have become increasingly prominent. Differential privacy has been demonstrated to be an effective method for ensuring privacy in graph learning. However, existing differential privacy-based GNN methods often overlook the individual privacy protection needs of users, offering uniform privacy guarantees to all users. This approach can result in either over-protection or insufficient protection for certain users. To address this issue, we propose APPGNN, an adaptive privacy-preserving GNN training method that accommodates the varying privacy requirements of nodes while achieving high model training accuracy. Specifically, APPGNN allocates adaptive privacy budgets based on individual user privacy needs. Additionally, to mitigate the impact of noise on data utility, APPGNN incorporates a weighted neighborhood aggregation mechanism to enhance GNN model accuracy. Theoretical analysis indicates that APPGNN provides adaptive privacy protection while ensuring ϵ-differential privacy on node data. Experimental evaluations on four real-world graph datasets validate the effectiveness of APPGNN.
AB - With the widespread adoption of Graph Neural Network (GNN) technology in industry, concerns regarding graph data privacy have become increasingly prominent. Differential privacy has been demonstrated to be an effective method for ensuring privacy in graph learning. However, existing differential privacy-based GNN methods often overlook the individual privacy protection needs of users, offering uniform privacy guarantees to all users. This approach can result in either over-protection or insufficient protection for certain users. To address this issue, we propose APPGNN, an adaptive privacy-preserving GNN training method that accommodates the varying privacy requirements of nodes while achieving high model training accuracy. Specifically, APPGNN allocates adaptive privacy budgets based on individual user privacy needs. Additionally, to mitigate the impact of noise on data utility, APPGNN incorporates a weighted neighborhood aggregation mechanism to enhance GNN model accuracy. Theoretical analysis indicates that APPGNN provides adaptive privacy protection while ensuring ϵ-differential privacy on node data. Experimental evaluations on four real-world graph datasets validate the effectiveness of APPGNN.
KW - adaptive
KW - cloud computing
KW - differential privacy
KW - graph neural networks
KW - privacy-preserving
UR - http://www.scopus.com/inward/record.url?scp=85214527426&partnerID=8YFLogxK
U2 - 10.1109/ICICN62625.2024.10761771
DO - 10.1109/ICICN62625.2024.10761771
M3 - Conference contribution
AN - SCOPUS:85214527426
T3 - 2024 IEEE 12th International Conference on Information and Communication Networks, ICICN 2024
SP - 181
EP - 186
BT - 2024 IEEE 12th International Conference on Information and Communication Networks, ICICN 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 12th IEEE International Conference on Information and Communication Networks, ICICN 2024
Y2 - 21 August 2024 through 24 August 2024
ER -