TY - GEN
T1 - A Dropout-Resilient and Privacy-Preserving Framework for Federated Learning via Lightweight Masking
AU - Jiang, Yufeng
AU - Liu, Jianghua
AU - Xu, Chenhao
AU - Zuo, Cong
AU - Xu, Lei
AU - Lei, Jian
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2026.
PY - 2026
Y1 - 2026
N2 - Federated Learning (FL) allows models to be trained across decentralized clients without exchanging raw data, enhancing privacy. Despite this advantage, privacy concerns persist because local gradients shared with an untrusted aggregation server can leak sensitive information. To address this, we introduce a dropout-resilient and privacy-preserving framework for federated learning via lightweight masking (DRPFed), which leverages secure masking and a trusted third party (TTP). The approach uses the Diffie-Hellman key exchange protocol to establish shared secret keys between clients and the TTP, which are then used to generate masks that obscure local gradients before they are sent to the server. To guarantee accurate aggregation, the TTP provides the final client with a compensatory mask, ensuring that the combined masks cancel out. Additionally, if a client disconnects, the TTP reallocates the missing mask among the remaining active clients to preserve aggregation correctness. Experimental evaluations demonstrate that, unlike standard FedAvg, our method maintains model accuracy while effectively handling client dropouts. The proposed solution protects gradient privacy against honest-but-curious servers and malicious clients while upholding the reliability of federated model training.
AB - Federated Learning (FL) allows models to be trained across decentralized clients without exchanging raw data, enhancing privacy. Despite this advantage, privacy concerns persist because local gradients shared with an untrusted aggregation server can leak sensitive information. To address this, we introduce a dropout-resilient and privacy-preserving framework for federated learning via lightweight masking (DRPFed), which leverages secure masking and a trusted third party (TTP). The approach uses the Diffie-Hellman key exchange protocol to establish shared secret keys between clients and the TTP, which are then used to generate masks that obscure local gradients before they are sent to the server. To guarantee accurate aggregation, the TTP provides the final client with a compensatory mask, ensuring that the combined masks cancel out. Additionally, if a client disconnects, the TTP reallocates the missing mask among the remaining active clients to preserve aggregation correctness. Experimental evaluations demonstrate that, unlike standard FedAvg, our method maintains model accuracy while effectively handling client dropouts. The proposed solution protects gradient privacy against honest-but-curious servers and malicious clients while upholding the reliability of federated model training.
KW - Federated Learning
KW - Privacy-Preserving
KW - Single-Mask Encryption
UR - https://www.scopus.com/pages/publications/105022070705
U2 - 10.1007/978-981-95-3543-9_16
DO - 10.1007/978-981-95-3543-9_16
M3 - Conference contribution
AN - SCOPUS:105022070705
SN - 9789819535422
T3 - Lecture Notes in Computer Science
SP - 293
EP - 312
BT - Information and Communications Security - 27th International Conference, ICICS 2025, Proceedings
A2 - Han, Jinguang
A2 - Chen, Liquan
A2 - Cheng, Guang
A2 - Xiang, Yang
A2 - Susilo, Willy
PB - Springer Science and Business Media Deutschland GmbH
T2 - 27th International Conference on Information and Communications Security, ICICS 2025
Y2 - 29 October 2025 through 31 October 2025
ER -
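
Editor's note: the Python sketch below is not the authors' DRPFed implementation; it is a minimal illustration of the single-mask aggregation idea as described in the abstract, under assumptions chosen for brevity: a toy Diffie-Hellman group (2**127 - 1 is a Mersenne prime, not a cryptographically vetted modulus), SHA-256 as the mask-expanding PRG, gradients quantized into Z_{2^32} so masks cancel exactly, and the dropout-recovery shares folded into the server's sum directly rather than routed through client follow-up messages.

# Illustrative sketch only -- not the authors' code. All parameter choices
# below are assumptions for readability, not the paper's concrete protocol.
import hashlib
import secrets

P = 2**127 - 1   # toy DH prime; a real deployment would use e.g. an RFC 3526 group
G = 3            # toy generator
MOD = 2**32      # quantized gradients and masks live in Z_{2^32}
DIM = 4          # toy gradient dimension
ROUND = 7        # current round id, binding each mask to this round

def prg(seed, dim):
    # Expand a DH shared secret into a fresh per-round additive mask.
    return [int.from_bytes(
                hashlib.sha256(f"{seed}|{ROUND}|{j}".encode()).digest()[:4],
                "big") % MOD
            for j in range(dim)]

def add(u, v):
    return [(a + b) % MOD for a, b in zip(u, v)]

def neg(u):
    return [(-a) % MOD for a in u]

n = 5  # number of clients

# Key agreement: each client runs Diffie-Hellman with the TTP, so both ends
# derive the same secret k_i = G**(x_i * y_i) mod P.
client_priv = [secrets.randbelow(P - 2) + 1 for _ in range(n)]
ttp_priv = [secrets.randbelow(P - 2) + 1 for _ in range(n)]
client_pub = [pow(G, x, P) for x in client_priv]
ttp_pub = [pow(G, y, P) for y in ttp_priv]
keys_client = [pow(ttp_pub[i], client_priv[i], P) for i in range(n)]
keys_ttp = [pow(client_pub[i], ttp_priv[i], P) for i in range(n)]
assert keys_client == keys_ttp

# The TTP can recompute every client's mask from the shared keys.
masks = [prg(k, DIM) for k in keys_ttp]

# Compensatory mask for the final client: comp = -(m_0 + ... + m_{n-2}),
# so that all masks cancel in the server's sum.
comp = [0] * DIM
for m in masks[:n - 1]:
    comp = add(comp, neg(m))

grads = [[secrets.randbelow(1000) for _ in range(DIM)] for _ in range(n)]
uploads = [add(grads[i], masks[i]) for i in range(n - 1)]
uploads.append(add(grads[n - 1], comp))  # final client uses comp, not its own mask

agg = [0] * DIM
for u in uploads:
    agg = add(agg, u)
assert agg == [sum(g[j] for g in grads) % MOD for j in range(DIM)]
# The server only ever sees masked vectors, yet recovers the exact sum.

# Dropout handling: suppose client 1 disconnects before uploading. The sum of
# the remaining uploads is then off by exactly -m_1, because m_1 was already
# baked into the compensatory mask. The TTP recomputes m_1 from k_1, splits it
# into random shares, one per active client, and the active clients' follow-up
# contributions restore correctness (folded in directly here for brevity).
active = [0, 2, 3, 4]
partial = [0] * DIM
for i in active:
    partial = add(partial, uploads[i])

shares = [[secrets.randbelow(MOD) for _ in range(DIM)] for _ in active[:-1]]
last = masks[1]
for s in shares:
    last = add(last, neg(s))
shares.append(last)  # the shares now sum to m_1 exactly

for s in shares:
    partial = add(partial, s)
assert partial == [sum(grads[i][j] for i in active) % MOD for j in range(DIM)]

Running the sketch end to end, both asserts pass: the masked uploads aggregate to the true gradient sum, and after the reallocated shares are applied the sum over the surviving clients is exact despite the dropout, which is the correctness property the abstract claims.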