TY - JOUR
T1 - Adaptive primal-dual splitting methods for statistical learning and image processing
AU - Goldstein, Thomas
AU - Li, Min
AU - Yuan, Xiaoming
N1 - Funding Information:
This work was supported by the National Science Foundation (535902), the Office of Naval Research (#N00014-15-1-2676), and the Hong Kong Research Grants Council's General Research Fund (HKBU 12300515). The second author was supported in part by the Program for New Century Excellent University Talents under Grant No. NCET-12-0111, and the Qing Lan Project.
PY - 2015
Y1 - 2015
N2 - The alternating direction method of multipliers (ADMM) is an important tool for solving complex optimization problems, but it involves minimization sub-steps that are often difficult to solve efficiently. The Primal-Dual Hybrid Gradient (PDHG) method is a powerful alternative that often has simpler sub-steps than ADMM, thus producing lower-complexity solvers. Despite the flexibility of this method, PDHG is often impractical because it requires the careful choice of multiple stepsize parameters. There is often no intuitive way to choose these parameters to maximize efficiency, or even to achieve convergence. We propose self-adaptive stepsize rules that automatically tune PDHG parameters for optimal convergence. We rigorously analyze our methods and identify convergence rates. Numerical experiments show that adaptive PDHG has strong advantages over non-adaptive methods in terms of both efficiency and simplicity for the user.
UR - http://www.scopus.com/inward/record.url?scp=84965146424&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:84965146424
VL - 2015-January
SP - 2089
EP - 2097
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
SN - 1049-5258
T2 - 29th Annual Conference on Neural Information Processing Systems, NIPS 2015
Y2 - 7 December 2015 through 12 December 2015
ER -