Abstract
We consider stochastic convex optimization for heavy-tailed data with the guarantee of being differentially private (DP). Most prior works on differentially private stochastic convex optimization for heavy-tailed data are either restricted to gradient descent (GD) or perform clipping multiple times within stochastic gradient descent (SGD), which is inefficient for large-scale problems. In this paper, we consider a one-time clipping strategy and provide principled analyses of its bias and private mean estimation. We establish new convergence results and improved complexity bounds for the proposed algorithm, called AClipped-dpSGD, for constrained and unconstrained convex problems. We also extend our convergence analysis to the strongly convex case and the non-smooth case (which covers generalized smooth objectives with Hölder-continuous gradients). All of the above results hold with high probability for heavy-tailed data. Numerical experiments are conducted to justify the theoretical improvement.
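For readers unfamiliar with clipped, noised SGD, the sketch below illustrates one generic clipped DP-SGD step: per-sample gradients are clipped to bound sensitivity (and tame heavy tails), averaged, and perturbed with Gaussian noise before the update. The function name, clipping threshold, and noise calibration here are illustrative placeholders only; this is not the paper's AClipped-dpSGD procedure or its calibrated parameter choices.

```python
import numpy as np

def clipped_dp_sgd_step(w, per_sample_grads, clip_threshold, noise_std, lr):
    """One generic clipped DP-SGD step (illustrative sketch, not AClipped-dpSGD).

    w                : current iterate, shape (d,)
    per_sample_grads : per-sample gradients, shape (n, d)
    clip_threshold   : clipping norm bound (placeholder value)
    noise_std        : Gaussian noise scale (placeholder calibration)
    lr               : step size
    """
    # Clip each per-sample gradient so its Euclidean norm is at most the threshold.
    norms = np.linalg.norm(per_sample_grads, axis=1, keepdims=True)
    clipped = per_sample_grads * np.minimum(1.0, clip_threshold / np.maximum(norms, 1e-12))

    # Average the clipped gradients and add Gaussian noise to privatize the mean.
    noisy_mean = clipped.mean(axis=0) + np.random.normal(0.0, noise_std, size=w.shape)

    # Plain SGD update with the privatized gradient estimate.
    return w - lr * noisy_mean
```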
| Original language | English |
| --- | --- |
| Number of pages | 46 |
| Journal | Machine Learning |
| DOIs | |
| Publication status | E-pub ahead of print - 30 Sept 2024 |
Scopus Subject Areas
- Software
- Artificial Intelligence
User-Defined Keywords
- Convergence analysis
- Convex optimization
- Differential privacy
- Gradient clipping
- Heavy-tailed data
- Stochastic gradient descent