Differential Private Stochastic Optimization with Heavy-tailed Data: Towards Optimal Rates

Puning Zhao, Jiafei Wu*, Zhe Liu, Chong Wang, Rongfei Fan, Qingming Li

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

We study convex optimization problems under differential privacy (DP). With heavy-tailed gradients, existing works achieve suboptimal rates. The main obstacle is that existing gradient estimators have suboptimal tail properties, resulting in a superfluous factor of d in the union bound. In this paper, we explore algorithms achieving optimal rates of DP optimization with heavy-tailed gradients. Our first method is a simple clipping approach. Under bounded p-th order moments of gradients, with n samples, it achieves Õ(√(d/n) + √d·(d/(nϵ))^(1−1/p)) population risk with ϵ ≤ 1/d. We then propose an iterative updating method, which is more complex but achieves this rate for all ϵ ≤ 1. The results significantly improve over existing methods. Such improvement relies on a careful treatment of the tail behavior of gradient estimators. Our results match the minimax lower bound, indicating that the theoretical limit of stochastic convex optimization under DP is achievable.
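As a rough illustration of the clipping idea described above (not the authors' exact algorithm or analysis), the following NumPy sketch runs DP-SGD with per-sample gradient clipping and Gaussian noise on a toy linear-regression problem with heavy-tailed residuals. The function name dp_sgd_clipped, the single-iteration noise calibration, and all constants are illustrative assumptions.

    import numpy as np

    def dp_sgd_clipped(X, y, epsilon, delta, clip_C, lr=0.1, epochs=5, rng=None):
        """Gradient-clipping DP-SGD sketch for linear regression (squared loss).

        Per-sample gradients are clipped to L2 norm clip_C, averaged, and
        perturbed with Gaussian noise calibrated to (epsilon, delta)-DP per
        iteration (no composition accounting; illustration only).
        """
        rng = rng or np.random.default_rng(0)
        n, d = X.shape
        w = np.zeros(d)
        # Sensitivity of the average of clipped gradients is 2*clip_C/n,
        # so the Gaussian mechanism uses the noise scale below.
        sigma = (2 * clip_C / n) * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
        for _ in range(epochs):
            grads = (X @ w - y)[:, None] * X          # per-sample gradients, shape (n, d)
            norms = np.linalg.norm(grads, axis=1, keepdims=True)
            grads = grads * np.minimum(1.0, clip_C / np.maximum(norms, 1e-12))  # clip
            noisy_grad = grads.mean(axis=0) + rng.normal(0.0, sigma, size=d)    # privatize
            w -= lr * noisy_grad
        return w

    # Toy usage: heavy-tailed (Student-t) residuals on the labels.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 5))
    w_true = np.ones(5)
    y = X @ w_true + rng.standard_t(df=3, size=2000)
    w_hat = dp_sgd_clipped(X, y, epsilon=1.0, delta=1e-5, clip_C=5.0)
    print(np.round(w_hat, 2))

In this kind of sketch, the clipping threshold clip_C plays the role of a truncation level trading off bias against noise; choosing it as a function of n, d, ϵ and the moment order p is exactly the kind of tuning the paper's analysis is concerned with.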

Original language: English
Pages (from-to): 22795-22803
Number of pages: 9
Journal: Proceedings of the AAAI Conference on Artificial Intelligence
Volume: 39
Issue number: 21
Publication status: Published - 11 Apr 2025
Event: 39th Annual AAAI Conference on Artificial Intelligence, AAAI 2025 - Philadelphia, United States
Duration: 25 Feb 2025 - 4 Mar 2025
