Efficient and Convergent Preconditioned ADMM for the Potts Models

Hongpeng Sun, Xue-Cheng Tai, Jing Yuan

Research output: Contribution to journal › Journal article › peer-review

13 Citations (Scopus)
165 Downloads (Pure)

Abstract

The Potts model is a key model in a broad spectrum of applications in image processing and computer vision. Mathematically, it can be formulated as a min-cut problem and, from the primal-dual perspective, solved by maximizing flow. Developing efficient methods with sound algorithmic structure and proved convergence is of great interest, but such guarantees are still missing for the classical augmented Lagrangian method (ALM)-based approaches. In this work, we propose two novel preconditioned and overrelaxed alternating direction methods of multipliers (ADMM) with guaranteed convergence, built on the classical Eckstein-Bertsekas and Fortin-Glowinski splitting techniques. The two new algorithms are substantially accelerated by the proposed preconditioners and overrelaxation schemes. We apply the proposed preconditioned overrelaxed ADMM methods to image segmentation; experimental results demonstrate that they significantly outperform the classical ALM-based algorithms in numerical efficiency while providing proved convergence.
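The paper's own algorithms are not reproduced on this page. Purely as an illustration of the over-relaxation idea the abstract refers to, the following minimal Python sketch applies an Eckstein-Bertsekas-style over-relaxed ADMM (relaxation parameter rho in (0, 2)) to a simple 1D total-variation denoising problem, a common toy analogue of convex-relaxed segmentation. The function names, the choice of test problem, and all parameter values are illustrative assumptions, not the authors' preconditioned method.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the proximal map of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def overrelaxed_admm_tv(f, alpha=1.0, sigma=1.0, rho=1.8, iters=300):
    """Over-relaxed ADMM sketch for 1D total-variation denoising:

        min_u  0.5 * ||u - f||^2 + alpha * ||D u||_1,

    split as z = D u. The relaxation step with rho in (0, 2) is the
    Eckstein-Bertsekas over-relaxation; rho = 1 recovers plain ADMM
    (equivalently, Douglas-Rachford splitting on the dual problem).
    """
    n = f.size
    D = np.diff(np.eye(n), axis=0)        # forward-difference operator, (n-1) x n
    A = np.eye(n) + sigma * D.T @ D       # fixed u-step system matrix
    z = np.zeros(n - 1)                   # auxiliary variable, z ~ D u
    lam = np.zeros(n - 1)                 # Lagrange multiplier

    for _ in range(iters):
        # u-step: minimize the augmented Lagrangian in u (a linear solve;
        # a real implementation would prefactor A or use a fast solver).
        u = np.linalg.solve(A, f + D.T @ (sigma * z - lam))
        # over-relaxation of the coupling term D u before the z-step
        h = rho * (D @ u) + (1.0 - rho) * z
        # z-step: proximal map of (alpha/sigma) * ||.||_1
        z = soft_threshold(h + lam / sigma, alpha / sigma)
        # multiplier ascent
        lam = lam + sigma * (h - z)
    return u

# Tiny usage example: denoise a noisy piecewise-constant signal.
rng = np.random.default_rng(0)
f = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * rng.standard_normal(100)
u = overrelaxed_admm_tv(f, alpha=0.5)
```

With rho chosen close to 2, this kind of relaxation typically reduces the iteration count noticeably over plain ADMM (rho = 1), which is the acceleration effect the abstract describes alongside preconditioning.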

Original language: English
Pages (from-to): B455-B478
Number of pages: 24
Journal: SIAM Journal on Scientific Computing
Volume: 43
Issue number: 2
DOI:
Publication status: Published - Apr 2021

Scopus Subject Areas

  • Computational Mathematics
  • Applied Mathematics

User-Defined Keywords

  • ADMM
  • Block preconditioner
  • Convex relaxation
  • Douglas-Rachford splitting
  • Image segmentation
  • Potts model
