TY - GEN
T1 - Efficient Model Quality Evaluation in Federated Learning via Functional Encryption
AU - Xu, Ruichen
AU - He, Yang
AU - Wu, Yi
AU - Hu, Chenfei
AU - Pan, Zijie
AU - Zhu, Liehuang
AU - Zhang, Chuan
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Federated Learning (FL) is a distributed machine learning paradigm that exchanges model information among multiple parties without directly sharing the original data. However, FL faces the inherent issue of statistical heterogeneity. Recently, some privacy-preserving federated learning schemes have considered this issue, but they rely on homomorphic encryption, which imposes significant computational and communication overhead on the clients. To address this issue, we propose an efficient model quality evaluation scheme in FL via functional encryption. Specifically, we first use inner product functional encryption (IPFE) to efficiently and securely compute the cosine similarity between the global update and each local update on the server, then use a clustering algorithm to assign a weight to each client based on its cosine similarity, and finally update the global model through weighted aggregation. Experimental results show that, compared with other model quality evaluation schemes, our approach improves computational efficiency by up to 25% and reduces communication cost by up to 70%.
AB - Federated Learning (FL) is a distributed machine learning paradigm that exchanges model information among multiple parties without directly sharing the original data. However, FL faces the inherent issue of statistical heterogeneity. Recently, some privacy-preserving federated learning schemes have considered this issue, but they rely on homomorphic encryption, which imposes significant computational and communication overhead on the clients. To address this issue, we propose an efficient model quality evaluation scheme in FL via functional encryption. Specifically, we first use inner product functional encryption (IPFE) to efficiently and securely compute the cosine similarity between the global update and each local update on the server, then use a clustering algorithm to assign a weight to each client based on its cosine similarity, and finally update the global model through weighted aggregation. Experimental results show that, compared with other model quality evaluation schemes, our approach improves computational efficiency by up to 25% and reduces communication cost by up to 70%.
KW - Federated Learning
KW - Functional Encryption
KW - Model Quality Evaluation
KW - Privacy-Preserving
UR - http://www.scopus.com/inward/record.url?scp=85216543255&partnerID=8YFLogxK
U2 - 10.1109/CSRSWTC64338.2024.10811496
DO - 10.1109/CSRSWTC64338.2024.10811496
M3 - Conference contribution
AN - SCOPUS:85216543255
T3 - Proceedings - 2024 Cross Strait Radio Science and Wireless Technology Conference, CSRSWTC 2024
BT - Proceedings - 2024 Cross Strait Radio Science and Wireless Technology Conference, CSRSWTC 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 Cross Strait Radio Science and Wireless Technology Conference, CSRSWTC 2024
Y2 - 4 November 2024 through 7 November 2024
ER -