Evaluating Differential Privacy in Federated Continual Learning

Junyan Ouyang*, Rui Han, Chi Harold Liu

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In recent years, the privacy-preserving framework of Differential Privacy (DP) has achieved remarkable success and has been widely studied. However, there is little work on DP in the area of Federated Continual Learning (FCL), which combines Federated Learning (FL) and Continual Learning (CL). This paper presents a formal definition of DP-FCL and evaluates several DP-FCL methods based on Gaussian DP (GDP) and Individual DP (IDP). The experimental results indicate that CL strategies based on gradient modification are not practical in DP-FCL. To the best of our knowledge, this is the first work to experimentally study DP-FCL, and it can serve as a reference for future research in this area.
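To make the setting concrete, the sketch below illustrates the Gaussian-mechanism gradient perturbation that GDP-based DP training (and hence GDP-based DP-FCL methods) typically builds on: per-example gradients are clipped to a fixed L2 norm and Gaussian noise scaled to that bound is added before averaging. This is a minimal illustrative assumption, not the paper's implementation; the function name, clip_norm, and noise_multiplier are hypothetical, and the noise swamping the per-example gradient directions is what makes gradient-modification-based CL strategies hard to apply.

```python
import numpy as np

def gaussian_dp_update(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip each per-example gradient to L2 norm `clip_norm`, sum the clipped
    gradients, add Gaussian noise with std noise_multiplier * clip_norm
    (the Gaussian mechanism), and return the averaged noisy gradient.
    Illustrative sketch only; not the method evaluated in the paper."""
    rng = np.random.default_rng() if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))  # scale down if too large
    summed = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=summed.shape)
    return (summed + noise) / len(per_example_grads)

# Toy usage: 32 per-example gradients of a 10-parameter model.
rng = np.random.default_rng(0)
grads = [rng.normal(size=10) for _ in range(32)]
noisy_grad = gaussian_dp_update(grads, clip_norm=1.0, noise_multiplier=1.1, rng=rng)
print(noisy_grad)
```

In a federated continual setting, each client would apply an update of this form locally on its current task before sending the result to the server, so any CL strategy that inspects or rewrites gradients only ever sees these clipped, noised versions.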

Original language: English
Title of host publication: 2023 IEEE 98th Vehicular Technology Conference, VTC 2023-Fall - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350329285
DOIs
Publication status: Published - 2023
Event: 98th IEEE Vehicular Technology Conference, VTC 2023-Fall - Hong Kong, China
Duration: 10 Oct 2023 - 13 Oct 2023

Publication series

Name: IEEE Vehicular Technology Conference
ISSN (Print): 1550-2252

Conference

Conference: 98th IEEE Vehicular Technology Conference, VTC 2023-Fall
Country/Territory: China
City: Hong Kong
Period: 10/10/23 - 13/10/23

Keywords

  • continual learning
  • deep learning
  • differential privacy
  • federated continual learning
  • federated learning
  • privacy
