Abstract
The alternating direction method of multipliers (ADMM) is an important tool for solving complex optimization problems, but it involves minimization sub-steps that are often difficult to solve efficiently. The Primal-Dual Hybrid Gradient (PDHG) method is a powerful alternative that often has simpler sub-steps than ADMM, thus producing lower-complexity solvers. Despite its flexibility, PDHG is often impractical because it requires the careful choice of multiple stepsize parameters. There is often no intuitive way to choose these parameters to maximize efficiency, or even to achieve convergence. We propose self-adaptive stepsize rules that automatically tune PDHG parameters for optimal convergence. We rigorously analyze our methods and identify their convergence rates. Numerical experiments show that adaptive PDHG has strong advantages over non-adaptive methods in terms of both efficiency and simplicity for the user.
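To make the abstract concrete, below is a minimal sketch of a PDHG iteration with a residual-balancing stepsize heuristic in the spirit the abstract describes, applied to 1D total-variation denoising, min_x ½‖x − b‖² + λ‖Dx‖₁. The problem instance, the constants `delta` and `alpha`, and the exact form of the stepsize updates are illustrative assumptions, not the paper's published rules; safeguards such as backtracking are omitted. See the proceedings link below for the actual algorithm and analysis.

```python
# Sketch: PDHG with a residual-balancing stepsize heuristic, applied to
# 1D total-variation denoising:  min_x 0.5*||x - b||^2 + lam*||D x||_1.
# The adaptive rule below is an illustrative assumption, not a verbatim
# transcription of the paper's update rules.
import numpy as np

def pdhg_tv_denoise(b, lam=1.0, iters=500, tau=0.4, sigma=0.4):
    n = b.size
    # D: forward-difference operator; ||D||^2 < 4, so tau*sigma <= 0.16
    # keeps the standard PDHG condition tau*sigma*||D||^2 < 1.
    D = np.eye(n - 1, n, 1) - np.eye(n - 1, n)
    x, y = b.copy(), np.zeros(n - 1)
    alpha = 0.5   # adaptivity level (shrinks whenever a stepsize update fires)
    delta = 1.5   # residual-balancing tolerance
    for _ in range(iters):
        x_old, y_old = x, y
        # primal step: prox of 0.5*||x - b||^2 with stepsize tau
        x = (x_old - tau * (D.T @ y_old) + tau * b) / (1.0 + tau)
        # dual step on the extrapolated point; prox of the l-inf ball of radius lam
        y = np.clip(y_old + sigma * (D @ (2 * x - x_old)), -lam, lam)
        # primal and dual residuals of the PDHG optimality conditions
        p = np.linalg.norm((x_old - x) / tau - D.T @ (y_old - y))
        d = np.linalg.norm((y_old - y) / sigma - D @ (x_old - x))
        # residual balancing: grow the stepsize of the lagging variable while
        # keeping the product tau*sigma fixed, preserving the convergence condition
        if p > delta * d:
            tau, sigma, alpha = tau / (1 - alpha), sigma * (1 - alpha), 0.9 * alpha
        elif d > delta * p:
            tau, sigma, alpha = tau * (1 - alpha), sigma / (1 - alpha), 0.9 * alpha
    return x

# usage: denoise a noisy piecewise-constant signal
rng = np.random.default_rng(0)
signal = np.repeat([0.0, 1.0, -0.5], 50)
x_hat = pdhg_tv_denoise(signal + 0.1 * rng.standard_normal(signal.size), lam=0.5)
```

The point of the balancing rule is that the user only supplies rough initial stepsizes: whichever residual lags is given a larger stepsize, with the product τσ held constant so the usual convergence condition is never violated.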
Original language | English |
---|---|
Title of host publication | 29th Annual Conference on Neural Information Processing Systems, NIPS 2015 |
Editors | C. Cortes, N. Lawrence, D. Lee, M. Sugiyama, R. Garnett |
Publisher | Neural Information Processing Systems Foundation |
Pages | 2089-2097 |
Number of pages | 9 |
ISBN (Print) | 9781510825024 |
Publication status | Published - Dec 2015 |
Event | 29th Annual Conference on Neural Information Processing Systems, NIPS 2015 - Montreal, Canada |
Duration | 7 Dec 2015 → 12 Dec 2015 |
Publication series
Name | Advances in Neural Information Processing Systems |
---|---|
Volume | 28 |
ISSN (Print) | 1049-5258 |
Name | NeurIPS Proceedings |
Conference
Conference | 29th Annual Conference on Neural Information Processing Systems, NIPS 2015 |
---|---|
Country/Territory | Canada |
City | Montreal |
Period | 7/12/15 → 12/12/15 |
Internet address | https://neurips.cc/Conferences/2015, https://proceedings.neurips.cc/paper/2015 |
Scopus Subject Areas
- Computer Networks and Communications
- Information Systems
- Signal Processing