TY - JOUR
T1 - A Correlatively Sparse Lagrange Multiplier Expression Relaxation for Polynomial Optimization
AU - Qu, Zheng
AU - Tang, Xindong
N1 - The first author was partially supported by NSFC Young Scientist Fund grant 12001458 and Hong Kong Research Grants Council General Research Fund grant 17317122. The second author was partially supported by the Start-up Fund P0038976/BD7L from The Hong Kong Polytechnic University.
PY - 2024/3
Y1 - 2024/3
AB - In this paper, we consider polynomial optimization with correlative sparsity. We construct correlatively sparse Lagrange multiplier expressions (CS-LMEs) and propose CS-LME reformulations for polynomial optimization problems using the Karush–Kuhn–Tucker optimality conditions. Correlatively sparse sum-of-squares (CS-SOS) relaxations are applied to solve the CS-LME reformulation. We show that the CS-LME reformulation inherits the original correlative sparsity pattern and that the CS-SOS relaxation provides sharper lower bounds when applied to the CS-LME reformulation than when applied to the original problem. Moreover, the convergence of our approach is guaranteed under mild conditions. In numerical experiments, our new approach usually finds the global optimal value (up to a negligible error) at a low relaxation order in cases where solving the problem directly fails to yield an accurate approximation. Also, by properly exploiting correlative sparsity, our CS-LME approach requires less computational time than the original LME approach to reach the same accuracy.
KW - polynomial optimization
KW - correlative sparsity
KW - Lagrange multiplier expressions
KW - moment-SOS relaxations
UR - http://www.scopus.com/inward/record.url?scp=85182021556&partnerID=8YFLogxK
U2 - 10.1137/22M1515689
DO - 10.1137/22M1515689
M3 - Journal article
SN - 1052-6234
VL - 34
SP - 127
EP - 162
JO - SIAM Journal on Optimization
JF - SIAM Journal on Optimization
IS - 1
ER -