Efficient private SCO for heavy-tailed data via averaged clipping

Chenhan Jin, Kaiwen Zhou, Bo Han, James Cheng*, Tieyong Zeng*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

Abstract

We consider stochastic convex optimization for heavy-tailed data with the guarantee of being differentially private (DP). Most prior works on differentially private stochastic convex optimization for heavy-tailed data are either restricted to gradient descent (GD) or perform clipping multiple times per iteration of stochastic gradient descent (SGD), which is inefficient for large-scale problems. In this paper, we consider a one-time clipping strategy and provide principled analyses of its bias and private mean estimation. We establish new convergence results and improved complexity bounds for the proposed algorithm, called AClipped-dpSGD, for constrained and unconstrained convex problems. We also extend our convergence analysis to the strongly convex case and the non-smooth case (which covers generalized smooth objectives with Hölder-continuous gradients). All of the above results hold with high probability for heavy-tailed data. Numerical experiments are conducted to justify the theoretical improvement.
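For intuition, the sketch below illustrates the kind of one-time (averaged) clipping update the abstract describes: the stochastic gradients in a minibatch are averaged first, the average is clipped once, and Gaussian noise is added before the step. This is a minimal illustration under assumed names and parameters (`aclipped_dpsgd_step`, `clip_threshold`, `noise_std` are hypothetical placeholders), not the paper's exact AClipped-dpSGD procedure or its privacy calibration.

```python
import numpy as np

def aclipped_dpsgd_step(w, grad_fn, data_batch, lr=0.1,
                        clip_threshold=1.0, noise_std=0.5,
                        rng=np.random.default_rng()):
    """One illustrative update: average the minibatch gradients,
    clip the *average* once, add Gaussian noise, then step.
    (Assumed sketch; not the paper's exact algorithm.)"""
    # Average the per-sample stochastic gradients (no per-sample clipping).
    grads = np.stack([grad_fn(w, x) for x in data_batch])
    g_bar = grads.mean(axis=0)

    # One-time clipping of the averaged gradient to norm <= clip_threshold.
    norm = np.linalg.norm(g_bar)
    g_clip = g_bar * min(1.0, clip_threshold / max(norm, 1e-12))

    # Gaussian noise for differential privacy (scale chosen illustratively,
    # not calibrated to a specific (epsilon, delta) budget here).
    noise = rng.normal(scale=noise_std, size=g_clip.shape)

    # Plain gradient step on the clipped, noised average.
    return w - lr * (g_clip + noise)
```

Contrast this with per-sample clipping, where every gradient in the batch is clipped before averaging; the averaged-clipping variant performs only one clipping operation per iteration, which is the efficiency point the abstract emphasizes.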

Original language: English
Number of pages: 46
Journal: Machine Learning
Publication status: E-pub ahead of print - 30 Sept 2024

Scopus Subject Areas

  • Software
  • Artificial Intelligence

User-Defined Keywords

  • Convergence analysis
  • Convex optimization
  • Differential privacy
  • Gradient clipping
  • Heavy-tailed data
  • Stochastic gradient descent
