A Correlatively Sparse Lagrange Multiplier Expression Relaxation for Polynomial Optimization

Zheng Qu, Xindong Tang*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

1 Citation (Scopus)

Abstract

In this paper, we consider polynomial optimization with correlative sparsity. We construct correlatively sparse Lagrange multiplier expressions (CS-LMEs) and propose CS-LME reformulations for polynomial optimization problems using the Karush–Kuhn–Tucker optimality conditions. Correlatively sparse sum-of-squares (CS-SOS) relaxations are applied to solve the CS-LME reformulation. We show that the CS-LME reformulation inherits the original correlative sparsity pattern, and the CS-SOS relaxation provides sharper lower bounds when applied to the CS-LME reformulation, compared with when it is applied to the original problem. Moreover, the convergence of our approach is guaranteed under mild conditions. In numerical experiments, our new approach usually finds the global optimal value (up to a negligible error) with a low relaxation order for cases where directly solving the problem fails to get an accurate approximation. Also, by properly exploiting the correlative sparsity, our CS-LME approach requires less computational time than the original LME approach to reach the same accuracy level.
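The correlative sparsity the abstract refers to can be pictured as a graph on the decision variables: two variables are linked when they appear together in a monomial of the objective or in the same constraint, and the variable cliques of (a chordal extension of) this graph determine the smaller SOS blocks used by CS-SOS relaxations. The following is a minimal illustrative sketch of building that graph for a toy problem; the helper function and the example data are hypothetical and not taken from the paper.

```python
from itertools import combinations

def correlative_sparsity_graph(n, objective_supports, constraint_supports):
    """Return adjacency sets over variables 0..n-1.

    objective_supports: variable-index sets, one per monomial of the objective
    constraint_supports: variable-index sets, one per constraint
    (Hypothetical helper for illustration only.)
    """
    adj = {i: set() for i in range(n)}
    # Variables sharing a monomial of the objective are linked,
    # as are all variables appearing in a single constraint.
    for support in objective_supports + constraint_supports:
        for i, j in combinations(sorted(support), 2):
            adj[i].add(j)
            adj[j].add(i)
    return adj

# Toy problem in x0..x3: objective monomials x0^2*x1, x1*x2, x2^2*x3,
# with one constraint in (x0, x1) and one in (x2, x3).
adj = correlative_sparsity_graph(
    4,
    objective_supports=[{0, 1}, {1, 2}, {2, 3}],
    constraint_supports=[{0, 1}, {2, 3}],
)
print(adj)  # x0 and x3 never interact, so they end up in different cliques
```

Here the graph is a path x0-x1-x2-x3, so a sparse relaxation can work with the small variable cliques {x0, x1}, {x1, x2}, {x2, x3} instead of one dense block in all four variables, which is the source of the computational savings the abstract reports.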
Original language: English
Pages (from-to): 127-162
Number of pages: 36
Journal: SIAM Journal on Optimization
Volume: 34
Issue number: 1
Early online date: 5 Jan 2024
DOIs
Publication status: Published - Mar 2024

Scopus Subject Areas

  • Software
  • Theoretical Computer Science
  • Applied Mathematics

User-Defined Keywords

  • polynomial optimization
  • correlative sparsity
  • Lagrange multiplier expressions
  • moment-SOS relaxations

