TY - GEN
T1 - Shuffled Differentially Private Federated Learning for Time Series Data Analytics
AU - Huang, Chenxi
AU - Jiang, Chaoyang
AU - Chen, Zhenghua
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
AB - Trustworthy federated learning aims to achieve optimal performance while ensuring clients' privacy. Existing privacy-preserving federated learning approaches are mostly tailored for image data and are rarely applied to time series data, which underpin many important applications such as machine health monitoring and human activity recognition. Furthermore, protective noise added to a time series analytics model can significantly interfere with temporal-dependent learning, leading to a greater decline in accuracy. To address these issues, we develop a privacy-preserving federated learning algorithm for time series data. Specifically, we employ local differential privacy to extend the privacy protection trust boundary to the clients. We also incorporate shuffling techniques to achieve privacy amplification, mitigating the accuracy decline caused by local differential privacy. Extensive experiments were conducted on five time series datasets. The results show that our algorithm incurs minimal accuracy loss compared with non-private federated learning in both small- and large-client scenarios. Under the same level of privacy protection, it also achieves higher accuracy than centralized differentially private federated learning in both scenarios.
UR - http://www.scopus.com/inward/record.url?scp=85173618112&partnerID=8YFLogxK
U2 - 10.1109/ICIEA58696.2023.10241529
DO - 10.1109/ICIEA58696.2023.10241529
M3 - Conference contribution
AN - SCOPUS:85173618112
T3 - Proceedings of the 18th IEEE Conference on Industrial Electronics and Applications, ICIEA 2023
SP - 1023
EP - 1028
BT - Proceedings of the 18th IEEE Conference on Industrial Electronics and Applications, ICIEA 2023
A2 - Cai, Wenjian
A2 - Yang, Guilin
A2 - Qiu, Jun
A2 - Gao, Tingting
A2 - Jiang, Lijun
A2 - Zheng, Tianjiang
A2 - Wang, Xinli
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 18th IEEE Conference on Industrial Electronics and Applications, ICIEA 2023
Y2 - 18 August 2023 through 22 August 2023
ER -