Adaptive primal-dual splitting methods for statistical learning and image processing

Thomas Goldstein, Min Li, Xiaoming Yuan

Research output: Contribution to journal › Conference article › peer-review

30 Citations (Scopus)

Abstract

The alternating direction method of multipliers (ADMM) is an important tool for solving complex optimization problems, but it involves minimization sub-steps that are often difficult to solve efficiently. The Primal-Dual Hybrid Gradient (PDHG) method is a powerful alternative that often has simpler sub-steps than ADMM, thus producing lower-complexity solvers. Despite the flexibility of this method, PDHG is often impractical because it requires the careful choice of multiple stepsize parameters. There is often no intuitive way to choose these parameters to maximize efficiency, or even to achieve convergence. We propose self-adaptive stepsize rules that automatically tune PDHG parameters for optimal convergence. We rigorously analyze our methods and identify convergence rates. Numerical experiments show that adaptive PDHG has strong advantages over non-adaptive methods in terms of both efficiency and simplicity for the user.
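As a rough illustration of the kind of scheme the abstract describes, the sketch below runs PDHG with a residual-balancing stepsize rule on a 1-D total-variation denoising problem. This is a hypothetical minimal example, not the paper's actual algorithm: the function name adaptive_pdhg, the balancing threshold, and the adaptivity decay factor are illustrative assumptions. The update keeps the product tau*sigma fixed, so the standard stepsize condition tau*sigma*||D||^2 < 1 continues to hold as the parameters adapt.

```python
import numpy as np

def adaptive_pdhg(b, mu=1.0, iters=500, adapt=0.5, balance=2.0):
    """Sketch of PDHG with residual-balancing stepsizes (illustrative only).

    Solves min_x 0.5*||x - b||^2 + mu*||D x||_1, i.e. the saddle point
    min_x max_{|y|<=mu} 0.5*||x - b||^2 + <D x, y>.
    """
    n = b.size
    D = np.diff(np.eye(n), axis=0)   # (n-1) x n forward-difference operator
    L = 2.0                          # upper bound on ||D||
    tau = sigma = 0.9 / L            # tau*sigma*L**2 < 1 initially
    x, y = b.astype(float).copy(), np.zeros(n - 1)
    for _ in range(iters):
        x_old, y_old = x.copy(), y.copy()
        # Primal step: prox of tau*f with f(x) = 0.5*||x - b||^2.
        x = (x - tau * (D.T @ y) + tau * b) / (1.0 + tau)
        # Dual step at the extrapolated point; prox of sigma*g* is the
        # projection onto the box [-mu, mu].
        y = np.clip(y + sigma * (D @ (2 * x - x_old)), -mu, mu)
        # Primal and dual residuals of the optimality system.
        p = np.linalg.norm((x_old - x) / tau - D.T @ (y_old - y))
        d = np.linalg.norm((y_old - y) / sigma - D @ (x_old - x))
        # Residual balancing (assumed rule): grow the stepsize on the side
        # whose residual lags, keeping tau*sigma fixed; shrink the amount
        # of adaptivity over time.
        if p > balance * d:
            tau *= 1.0 + adapt
            sigma /= 1.0 + adapt
            adapt *= 0.95
        elif d > balance * p:
            sigma *= 1.0 + adapt
            tau /= 1.0 + adapt
            adapt *= 0.95
    return x
```

For instance, adaptive_pdhg(np.repeat([0.0, 1.0, 0.0], 50) + 0.1 * np.random.randn(150), mu=0.5) recovers a noisy piecewise-constant signal without any manual stepsize tuning, which is the practical benefit the abstract claims for adaptive PDHG.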

Original language: English
Pages (from-to): 2089-2097
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Volume: 2015-January
Publication status: Published - 2015
Event: 29th Annual Conference on Neural Information Processing Systems, NIPS 2015 - Montreal, Canada
Duration: 7 Dec 2015 - 12 Dec 2015

Scopus Subject Areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

