A Correlatively Sparse Lagrange Multiplier Expression Relaxation for Polynomial Optimization

From MaRDI portal
Publication:6136658




Abstract: In this paper, we consider polynomial optimization with correlative sparsity. For such problems, one may apply correlative sparse sum of squares (CS-SOS) relaxations to exploit the correlative sparsity, which produces semidefinite programs of smaller size than the classical moment-SOS relaxations. The Lagrange multiplier expression (LME) is also a useful tool for constructing tight SOS relaxations of polynomial optimization problems, leading to convergence at a lower relaxation order. However, the LME-based reformulation usually corrupts the correlative sparsity. We propose a novel parametric LME, which we call the correlative sparse LME (CS-LME), that inherits the original correlative sparsity pattern. The reformulation using CS-LMEs can be solved with the CS-SOS relaxation to obtain sharper lower bounds, and convergence is guaranteed under mild conditions. Numerical examples are presented to show the efficiency of the new approach.
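The correlative sparsity pattern the abstract refers to is typically read off a graph whose vertices are the decision variables, with an edge whenever two variables appear together in a monomial of the objective or a constraint; the variable cliques of (a chordal extension of) this graph determine the small SOS blocks. The following is a minimal, hypothetical sketch of this clique-extraction step, using a greedy clique cover as a simple stand-in for the chordal-extension routines used in actual CS-SOS implementations.

```python
from itertools import combinations

def correlative_cliques(monomial_supports, n_vars):
    """Group variables into overlapping cliques of the correlative
    sparsity graph: variables are adjacent when they appear together
    in some monomial of the problem data. (Hypothetical helper; a
    greedy stand-in for a proper chordal extension.)"""
    # Build the adjacency of the correlative sparsity graph.
    adj = {i: set() for i in range(n_vars)}
    for support in monomial_supports:
        for i, j in combinations(sorted(support), 2):
            adj[i].add(j)
            adj[j].add(i)
    # Greedily grow a maximal clique around each variable, then
    # discard cliques contained in an earlier one.
    cliques = []
    for v in range(n_vars):
        clique = {v}
        for u in sorted(adj[v]):
            if all(u in adj[w] for w in clique):
                clique.add(u)
        if not any(clique <= c for c in cliques):
            cliques.append(clique)
    return [sorted(c) for c in cliques]

# Example: a chain-like objective such as x0*x1 + x1*x2 + x2*x3
# has monomial supports {0,1}, {1,2}, {2,3}, giving three small
# overlapping cliques instead of one dense block on all 4 variables.
supports = [{0, 1}, {1, 2}, {2, 3}]
print(correlative_cliques(supports, 4))  # → [[0, 1], [1, 2], [2, 3]]
```

The point of the paper's CS-LME construction is that the Lagrange multiplier expressions are built so that this clique structure of the original problem is preserved in the reformulation, so the semidefinite blocks stay small.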


