TY - GEN
T1 - Adaptive relaxed ADMM
T2 - 30th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017
AU - Xu, Zheng
AU - Figueiredo, Mário T.
AU - Yuan, Xiaoming
AU - Studer, Christoph
AU - Goldstein, Tom
N1 - Funding Information:
TG and ZX were supported by the US Office of Naval Research under grant N00014-17-1-2078 and by the US National Science Foundation (NSF) under grant CCF-1535902. MF was partially supported by the Fundação para a Ciência e Tecnologia, grant UID/EEA/5008/2013. XY was supported by the General Research Fund from Hong Kong Research Grants Council under grant HKBU-12313516. CS was supported in part by Xilinx Inc., and by the US NSF under grants ECCS-1408006, CCF-1535897, and CAREER CCF-1652065.
PY - 2017/11/6
Y1 - 2017/11/6
N2 - Many modern computer vision and machine learning applications rely on solving difficult optimization problems that involve non-differentiable objective functions and constraints. The alternating direction method of multipliers (ADMM) is a widely used approach to solve such problems. Relaxed ADMM is a generalization of ADMM that often achieves better performance, but its efficiency depends strongly on algorithm parameters that must be chosen by an expert user. We propose an adaptive method that automatically tunes the key algorithm parameters to achieve optimal performance without user oversight. Inspired by recent work on adaptivity, the proposed adaptive relaxed ADMM (ARADMM) is derived by assuming a Barzilai-Borwein style linear gradient. A detailed convergence analysis of ARADMM is provided, and numerical results on several applications demonstrate fast practical convergence.
AB - Many modern computer vision and machine learning applications rely on solving difficult optimization problems that involve non-differentiable objective functions and constraints. The alternating direction method of multipliers (ADMM) is a widely used approach to solve such problems. Relaxed ADMM is a generalization of ADMM that often achieves better performance, but its efficiency depends strongly on algorithm parameters that must be chosen by an expert user. We propose an adaptive method that automatically tunes the key algorithm parameters to achieve optimal performance without user oversight. Inspired by recent work on adaptivity, the proposed adaptive relaxed ADMM (ARADMM) is derived by assuming a Barzilai-Borwein style linear gradient. A detailed convergence analysis of ARADMM is provided, and numerical results on several applications demonstrate fast practical convergence.
UR - http://www.scopus.com/inward/record.url?scp=85044504783&partnerID=8YFLogxK
U2 - 10.1109/CVPR.2017.765
DO - 10.1109/CVPR.2017.765
M3 - Conference proceeding
AN - SCOPUS:85044504783
T3 - Proceedings - 30th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017
SP - 7234
EP - 7243
BT - Proceedings - 30th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017
PB - IEEE
Y2 - 21 July 2017 through 26 July 2017
ER -