A Correlatively Sparse Lagrange Multiplier Expression Relaxation for Polynomial Optimization

From MaRDI portal
Publication:6136658

DOI: 10.1137/22M1515689
arXiv: 2208.03979
OpenAlex: W4390613906
MaRDI QID: Q6136658
FDO: Q6136658


Authors: Zheng Qu, Xindong Tang


Publication date: 17 January 2024

Published in: SIAM Journal on Optimization

Abstract: In this paper, we consider polynomial optimization with correlative sparsity. For such problems, one may apply correlative sparse sum-of-squares (CS-SOS) relaxations to exploit the correlative sparsity, which produces semidefinite programs of smaller size than the classical moment-SOS relaxations. The Lagrange multiplier expression (LME) is also a useful tool for constructing tight SOS relaxations of polynomial optimization problems, leading to convergence at a lower relaxation order. However, the correlative sparsity is usually corrupted by the LME-based reformulation. We propose a novel parametric LME approach, called the correlative sparse LME (CS-LME), which inherits the original correlative sparsity pattern. The reformulation using CS-LMEs can be solved with the CS-SOS relaxation to obtain sharper lower bounds, and convergence is guaranteed under mild conditions. Numerical examples are presented to show the efficiency of our new approach.
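The correlative sparsity pattern mentioned in the abstract can be illustrated with a small sketch (this is not the paper's CS-LME algorithm, only the standard preprocessing step it builds on): form the correlation graph whose vertices are the variables and whose edges join variables that appear together in some monomial, then compute its maximal cliques. CS-SOS relaxations restrict SOS multipliers to these variable blocks. The example polynomial and all names below are illustrative assumptions.

```python
from itertools import combinations

def correlation_cliques(terms):
    """Maximal cliques of the correlation graph: vertices are variable
    indices, and two variables are adjacent if they occur in a common term.
    Uses a plain Bron-Kerbosch search (fine for small illustrative inputs)."""
    vertices = set().union(*terms)
    adj = {v: set() for v in vertices}
    for term in terms:
        for u, v in combinations(term, 2):
            adj[u].add(v)
            adj[v].add(u)

    cliques = []
    def expand(grown, candidates, excluded):
        if not candidates and not excluded:
            cliques.append(frozenset(grown))  # maximal clique found
        for v in list(candidates):
            expand(grown | {v}, candidates & adj[v], excluded & adj[v])
            candidates.remove(v)
            excluded.add(v)
    expand(set(), set(vertices), set())
    return sorted(sorted(c) for c in cliques)

# Chain-structured example f = x1^2*x2 + x2*x3^2 + x3*x4: each term is
# recorded as the set of variable indices it involves.
terms = [{1, 2}, {2, 3}, {3, 4}]
print(correlation_cliques(terms))  # -> [[1, 2], [2, 3], [3, 4]]
```

For this chain-structured polynomial the relaxation works with three 2-variable blocks rather than one dense 4-variable block, which is the size reduction the abstract refers to. (For non-chordal correlation graphs one would first take a chordal extension, which this sketch omits.)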


Full work available at URL: https://arxiv.org/abs/2208.03979









Cited In (2)





