Enabling Efficient and Strong Privacy-Preserving Truth Discovery in Mobile Crowdsensing

Chuan Zhang, Mingyang Zhao, Liehuang Zhu*, Tong Wu*, Ximeng Liu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

32 Citations (Scopus)

Abstract

Mobile crowdsensing has emerged as a popular platform to solve many challenging problems by utilizing users' wisdom and resources. Due to user diversity, the data provided by different individuals may vary significantly, and thus it is important to analyze data quality during data aggregation. Truth discovery is effective in capturing data quality and obtaining accurate mobile crowdsensing results. Existing works on truth discovery either cannot protect both task privacy and data privacy, or introduce tremendous computational costs. In this paper, we propose an efficient and strong privacy-preserving truth discovery scheme, named EPTD, to protect users' task privacy and data privacy simultaneously in the truth discovery procedure. In EPTD, we first exploit the randomizable matrix to express users' tasks and sensory data. Then, based on the matrix computation properties, we design key derivation and (re-)encryption mechanisms to enable truth discovery to be performed in an efficient and privacy-preserving manner. Through a detailed security analysis, we demonstrate that data privacy and task privacy are well preserved. Extensive experiments based on real-world and simulated mobile crowdsensing applications show EPTD has practical efficiency in terms of computational cost and communication overhead.
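The truth discovery the abstract refers to is the standard iterative procedure: alternately estimate each user's reliability weight from how far their data lies from the current truth estimates, then re-estimate the truths as a reliability-weighted aggregate. As context, a minimal plaintext sketch of this baseline (CRH-style weighting; no privacy protection, which is what EPTD adds on top) might look as follows. The function name, the squared-distance measure, and the log-ratio weight are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def truth_discovery(obs, iters=10, eps=1e-12):
    """Plaintext iterative truth discovery (CRH-style sketch).
    obs[k][m] = user k's sensory value for task m."""
    obs = np.asarray(obs, dtype=float)
    truths = obs.mean(axis=0)                 # initialise truths with per-task mean
    for _ in range(iters):
        # weight step: users whose data is closer to the truths get larger weights
        dist = ((obs - truths) ** 2).sum(axis=1) + eps
        weights = np.log(dist.sum() / dist)
        # truth step: weighted average of all users' observations
        truths = (weights[:, None] * obs).sum(axis=0) / weights.sum()
    return truths, weights

# Two consistent users and one outlier: the outlier is down-weighted,
# so the truths track the reliable users' values.
obs = [[1.0, 2.0], [1.1, 2.1], [5.0, 9.0]]
truths, weights = truth_discovery(obs)
```

A privacy-preserving scheme such as EPTD must carry out both the weight step and the truth step without revealing `obs` (data privacy) or which tasks each user contributed to (task privacy), which is why the paper replaces these plaintext operations with randomizable-matrix encodings and (re-)encryption.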

Original language: English
Pages (from-to): 3569-3581
Number of pages: 13
Journal: IEEE Transactions on Information Forensics and Security
Volume: 17
DOIs
Publication status: Published - 2022

Keywords

  • Mobile crowdsensing
  • efficiency
  • privacy preservation
  • truth discovery
