Achieving Efficient and Privacy-Preserving Neural Network Training and Prediction in Cloud Environments

Chuan Zhang, Chenfei Hu, Tong Wu*, Liehuang Zhu*, Ximeng Liu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

57 Citations (Scopus)

Abstract

Neural networks have been widely used to train predictive models for applications such as image processing, disease prediction, and face recognition. To produce more accurate models, powerful third parties (e.g., clouds) are usually employed to collect data from a large number of users, which, however, may raise concerns about user privacy. In this paper, we propose an Efficient and Privacy-preserving Neural Network scheme, named EPNN, to deal with the privacy issues in cloud-based neural networks. EPNN is designed based on a two-cloud model and the techniques of data perturbation and an additively homomorphic cryptosystem. This scheme enables two clouds to cooperatively perform neural network training and prediction in a privacy-preserving manner and significantly reduces the computation and communication overhead among participating entities. Through a detailed analysis, we demonstrate the security of EPNN. Extensive experiments based on real-world datasets show that EPNN is more efficient than existing schemes in terms of computational costs and communication overhead.
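The abstract's building block of an additively homomorphic cryptosystem can be illustrated with a toy Paillier instance, a standard additively homomorphic scheme; note this is a general sketch of the additive-homomorphism property, not the paper's actual EPNN protocol, and the tiny primes are for readability only (real deployments use 2048-bit moduli):

```python
import math
import random

# Toy Paillier cryptosystem (illustration of additive homomorphism only;
# NOT the EPNN construction, and insecure at this key size).
p, q = 17, 19
n = p * q                         # public modulus
n2 = n * n
g = n + 1                         # standard generator choice g = n + 1
lam = math.lcm(p - 1, q - 1)      # private key: lambda = lcm(p-1, q-1)
mu = pow(lam, -1, n)              # private key: mu = lambda^{-1} mod n

def encrypt(m, rng=random.Random(0)):
    """Enc(m) = g^m * r^n mod n^2, with random r coprime to n."""
    while True:
        r = rng.randrange(2, n)
        if math.gcd(r, n) == 1:
            break
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(c):
    """Dec(c) = L(c^lambda mod n^2) * mu mod n, where L(x) = (x-1)//n."""
    return (pow(c, lam, n2) - 1) // n * mu % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts,
# so a cloud can aggregate encrypted user inputs without decrypting them.
c1, c2 = encrypt(5), encrypt(7)
assert decrypt(c1 * c2 % n2) == (5 + 7) % n
```

This property is what lets an untrusted server sum encrypted gradients or model updates; only the key holder can recover the aggregate.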

Original language: English
Pages (from-to): 4245-4257
Number of pages: 13
Journal: IEEE Transactions on Dependable and Secure Computing
Volume: 20
Issue number: 5
DOIs
Publication status: Published - 1 Sept 2023

Keywords

  • Privacy-preserving
  • additively homomorphic cryptosystem
  • cloud environments
  • data perturbation
  • neural network

