TY - JOUR
T1 - Efficient and convergent preconditioned ADMM for the Potts models
AU - Sun, Hongpeng
AU - Tai, Xue-Cheng
AU - Yuan, Jing
N1 - Funding Information:
* Submitted to the journal's Computational Methods in Science and Engineering section June 8, 2020; accepted for publication (in revised form) December 15, 2020; published electronically March 29, 2021. https://doi.org/10.1137/20M1343956 Funding: The work of the first author was supported by the National Natural Science Foundation of China under grant 11701563, by the Alexander von Humboldt Foundation, and by the China Scholarship Council (CSC) under grant 201906365017. The work of the third author was supported by the National Natural Science Foundation of China under grant 61877047. † Institute for Mathematical Sciences, Renmin University of China, 100872, Beijing, China ([email protected]). ‡ Department of Mathematics, Hong Kong Baptist University, Kowloon Tong, Hong Kong ([email protected]). § School of Mathematics and Statistics, Xidian University, Xi'an, 710076, Shaanxi, China ([email protected]).
Publisher Copyright:
© 2021 Society for Industrial and Applied Mathematics Publications. All rights reserved.
PY - 2021/4
Y1 - 2021/4
N2 - The Potts model is a key model for a broad spectrum of applications in image processing and computer vision; it can be mathematically formulated as a min-cut problem and, from the primal-dual perspective, solved by maximizing flow. Developing efficient methods with better algorithmic structures and proved convergence is of great interest, yet such convergence guarantees are still missing for the classical augmented Lagrangian method (ALM)-based approaches. In this work, we propose two novel preconditioned and overrelaxed alternating direction methods of multipliers (ADMMs) with guaranteed convergence, based on the classical Eckstein-Bertsekas and Fortin-Glowinski splitting techniques. In particular, the two new algorithms are substantially accelerated by the proposed preconditioners and overrelaxation schemes. We apply the proposed preconditioned overrelaxed ADMM methods to image segmentation; experimental results demonstrate that they significantly outperform the classical ALM-based algorithms in numerical efficiency while enjoying proved convergence.
AB - The Potts model is a key model for a broad spectrum of applications in image processing and computer vision; it can be mathematically formulated as a min-cut problem and, from the primal-dual perspective, solved by maximizing flow. Developing efficient methods with better algorithmic structures and proved convergence is of great interest, yet such convergence guarantees are still missing for the classical augmented Lagrangian method (ALM)-based approaches. In this work, we propose two novel preconditioned and overrelaxed alternating direction methods of multipliers (ADMMs) with guaranteed convergence, based on the classical Eckstein-Bertsekas and Fortin-Glowinski splitting techniques. In particular, the two new algorithms are substantially accelerated by the proposed preconditioners and overrelaxation schemes. We apply the proposed preconditioned overrelaxed ADMM methods to image segmentation; experimental results demonstrate that they significantly outperform the classical ALM-based algorithms in numerical efficiency while enjoying proved convergence.
KW - ADMM
KW - Block preconditioner
KW - Convex relaxation
KW - Douglas-Rachford splitting
KW - Image segmentation
KW - Potts model
UR - http://www.scopus.com/inward/record.url?scp=85105888371&partnerID=8YFLogxK
U2 - 10.1137/20M1343956
DO - 10.1137/20M1343956
M3 - Journal article
AN - SCOPUS:85105888371
SN - 1064-8275
VL - 43
SP - B455
EP - B478
JO - SIAM Journal on Scientific Computing
JF - SIAM Journal on Scientific Computing
IS - 2
ER -