NeuronDP: Neuron-grained Differential Privacy of Deep Neural Networks in Edge-based Federated Learning

Wending Liu, Rui Han*, Xinyu Guo, Junyan Ouyang, Xiaojiang Zuo, Chi Harold Liu

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution · peer-review

Abstract

Introducing differential privacy (DP) into federated learning (FL) has become an effective way to protect user privacy. The DP guarantee is enforced by adding noise to the deep neural network (DNN). Traditional approaches add noise uniformly across the entire network, overlooking the fact that different parts of the network contribute differently to the loss function, which leads to significant accuracy degradation. Noise addition therefore needs to be controlled at a finer granularity. However, in differentially private FL it is difficult both to quantify the importance of each part of the network to the loss function and to allocate the privacy budget reasonably and effectively. Moreover, FL often involves heterogeneous models, which further complicates fine-grained differentially private FL. To address these challenges, we propose NeuronDP, a differentially private FL framework that adds noise at the neuron level. NeuronDP defines a novel importance metric for each neuron and allocates the privacy budget to each neuron according to its importance. Finally, NeuronDP introduces a model distillation-based approach to support FL across heterogeneous models. We implement NeuronDP in PyTorch and extensively evaluate it against state-of-the-art techniques on popular FL benchmarks. The results show that, while preserving privacy protection and computational efficiency, NeuronDP improves accuracy by 81.8% on average, and by up to 411.8% when the privacy protection level is high.
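The abstract describes neuron-level noise addition driven by a per-neuron importance metric. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: it assumes importance is approximated by the mean absolute gradient of each output neuron's weights, that each neuron's share of a total privacy budget grows with its importance (so more important neurons receive less noise), and that a Laplace mechanism is used. Names such as `add_neuron_grained_noise` and `total_epsilon` are illustrative only.

```python
# Hypothetical sketch of neuron-grained noise addition (not the paper's code).
import torch
import torch.nn as nn

def add_neuron_grained_noise(layer: nn.Linear, total_epsilon: float,
                             sensitivity: float = 1.0) -> None:
    """Perturb each output neuron of a linear layer with its own noise scale."""
    with torch.no_grad():
        grad = layer.weight.grad
        if grad is None:
            # No gradient available: fall back to uniform importance.
            importance = torch.ones(layer.out_features)
        else:
            # One importance score per output neuron (assumption: mean |grad|).
            importance = grad.abs().mean(dim=1)
        # Allocate the total budget proportionally to importance (illustrative rule).
        eps_per_neuron = total_epsilon * importance / importance.sum()
        eps_per_neuron = eps_per_neuron.clamp(min=1e-6)
        # Laplace mechanism per neuron: noise scale = sensitivity / epsilon_i,
        # so a larger per-neuron budget yields smaller noise.
        scale = sensitivity / eps_per_neuron  # shape: [out_features]
        noise = torch.distributions.Laplace(
            torch.zeros_like(layer.weight),
            scale.unsqueeze(1).expand_as(layer.weight),
        ).sample()
        layer.weight.add_(noise)
```

In this sketch the allocation rule and the choice of mechanism are placeholders; the paper's actual importance metric, budget-allocation scheme, and distillation-based handling of heterogeneous models are defined in the full text.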

Original language: English
Title of host publication: 2024 IEEE 4th International Conference on Electronic Technology, Communication and Information, ICETCI 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 166-171
Number of pages: 6
ISBN (Electronic): 9798350361643
DOIs
Publication status: Published - 2024
Event: 4th IEEE International Conference on Electronic Technology, Communication and Information, ICETCI 2024 - Changchun, China
Duration: 24 May 2024 - 26 May 2024

Publication series

Name: 2024 IEEE 4th International Conference on Electronic Technology, Communication and Information, ICETCI 2024

Conference

Conference: 4th IEEE International Conference on Electronic Technology, Communication and Information, ICETCI 2024
Country/Territory: China
City: Changchun
Period: 24/05/24 - 26/05/24

Keywords

  • Differential Privacy
  • Federated Learning
  • Fine-grained
  • Privacy Security
