Personalized differential privacy graph neural network

Yanli Yuan, Dian Lei, Chuan Zhang, Zehui Xiong, Chunhai Li, Liehuang Zhu

Research output: Contribution to journal › Letter › peer-review

Abstract

Dear Editor, This letter addresses the critical challenge of preserving privacy in graph learning without compromising data utility. Differential privacy (DP) is emerging as an effective method for privacy-preserving graph learning. However, its application often diminishes data utility, especially for nodes with fewer neighbors in graph neural networks (GNNs). Given that most real-world graph data follow a power-law distribution dominated by low-degree nodes, we propose PDPGNN, a novel GNN training method. The novelty of PDPGNN lies in offering personalized differential privacy by allocating privacy budgets based on node degrees, effectively improving data utility for nodes with fewer connections. Additionally, PDPGNN integrates a weighted aggregation mechanism to enhance model accuracy. Theoretical analysis shows that PDPGNN achieves ε-differential privacy for graph data, striking a balance between privacy protection and data utility. Experimental results on four real-world graph datasets demonstrate the effectiveness of PDPGNN.
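The core idea of degree-based budget allocation can be illustrated with a minimal sketch. This is a hypothetical illustration only, not the authors' actual PDPGNN implementation: the linear mapping from degree to budget and the Laplace mechanism for aggregation noise are assumptions chosen to show the general pattern of giving low-degree nodes a larger per-node budget (hence less noise).

```python
# Illustrative sketch (NOT the paper's actual algorithm): personalized
# DP budgets assigned by node degree, then Laplace noise on aggregation.
import numpy as np

def allocate_budgets(degrees, eps_min=0.5, eps_max=2.0):
    """Assign larger privacy budgets (less noise) to low-degree nodes.

    The linear degree-to-budget mapping here is an assumption for
    illustration; the paper's actual allocation rule may differ.
    """
    degrees = np.asarray(degrees, dtype=float)
    max_deg = degrees.max()
    # Low degree -> epsilon near eps_max; high degree -> near eps_min.
    return eps_min + (eps_max - eps_min) * (1.0 - degrees / max_deg)

def noisy_aggregate(features, adjacency, degrees, sensitivity=1.0, rng=None):
    """Mean-aggregate neighbor features, then perturb each node's result
    with Laplace noise scaled by that node's personal budget."""
    rng = np.random.default_rng(rng)
    eps = allocate_budgets(degrees)
    # Simple mean aggregation over neighbors.
    agg = adjacency @ features / np.maximum(degrees, 1.0)[:, None]
    # Laplace mechanism: per-node scale b_i = sensitivity / eps_i.
    scale = sensitivity / eps
    noise = rng.laplace(0.0, scale[:, None], size=agg.shape)
    return agg + noise
```

In this sketch, a node of degree 1 receives a budget near `eps_max` while the highest-degree node receives `eps_min`, so noise magnitude shrinks exactly where aggregation has the fewest neighbors to average over.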

Original language: English
Journal: IEEE/CAA Journal of Automatica Sinica
DOIs
Publication status: Accepted/In press - 2025

