A Correlatively Sparse Lagrange Multiplier Expression Relaxation for Polynomial Optimization
From MaRDI portal
Publication:6136658
DOI: 10.1137/22M1515689 · arXiv: 2208.03979 · OpenAlex: W4390613906 · MaRDI QID: Q6136658 · FDO: Q6136658
Authors: Zheng Qu, Xindong Tang
Publication date: 17 January 2024
Published in: SIAM Journal on Optimization
Abstract: In this paper, we consider polynomial optimization with correlative sparsity. For such problems, one may apply the correlative sparse sum of squares (CS-SOS) relaxations to exploit the correlative sparsity, which produces smaller semidefinite programs than the classical moment-SOS relaxations. The Lagrange multiplier expression (LME) is also a useful tool for constructing tight SOS relaxations of polynomial optimization problems, leading to convergence at a lower relaxation order. However, the correlative sparsity is usually corrupted by the LME-based reformulation. We propose a novel parametric LME approach, which we call the correlative sparse LME (CS-LME), that inherits the original correlative sparsity pattern. The reformulation using CS-LMEs can be solved with the CS-SOS relaxation to obtain sharper lower bounds, and convergence is guaranteed under mild conditions. Numerical examples are presented to show the efficiency of our new approach.
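As a hedged illustration of the Lagrange multiplier expression idea referenced in the abstract (a standard sphere-constraint example, not the paper's CS-LME construction): when minimizing \(f(x)\) over the unit sphere, the multiplier admits a polynomial expression in \(x\):

```latex
\min_{x}\; f(x) \quad \text{s.t.} \quad x^\top x = 1,
\qquad \text{KKT:}\quad \nabla f(x) = 2\lambda x .
% Multiplying the KKT condition on the left by x^T and using x^T x = 1:
\lambda = \tfrac{1}{2}\, x^\top \nabla f(x).
% Substituting this polynomial expression for the multiplier back into
% the KKT system yields extra polynomial identities that tighten the
% SOS relaxation, which is the general mechanism behind LMEs.
```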
Full work available at URL: https://arxiv.org/abs/2208.03979
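The correlative sparsity pattern mentioned in the abstract can be sketched with a small example (not from the paper; a simplified version of the variable-clique decomposition behind CS-SOS relaxations, omitting the chordal-extension step):

```python
def correlative_cliques(term_supports):
    """Given the variable index set (support) of each term in a
    polynomial, return the maximal supports in sorted order.

    Correlative sparse SOS relaxations build one small SOS block per
    variable clique instead of a single block over all variables,
    which is what shrinks the resulting semidefinite programs.
    """
    supports = {frozenset(s) for s in term_supports}
    # keep only supports not strictly contained in another support
    maximal = [s for s in supports if not any(s < t for t in supports)]
    return sorted(sorted(s) for s in maximal)


# f(x) = x1^2*x2 + x2*x3 + x3^2*x4 + x3 has term supports:
print(correlative_cliques([{1, 2}, {2, 3}, {3, 4}, {3}]))
# → [[1, 2], [2, 3], [3, 4]]
```

Here a CS-SOS relaxation would use three blocks on two variables each rather than one block on all four, at the cost of possibly weaker bounds, which is the gap the paper's CS-LME reformulation aims to close.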
Large-scale problems in mathematical programming (90C06) · Semidefinite programming (90C22) · Polynomial optimization (90C23)
Cites Work
- Detecting Global Optimality and Extracting Solutions in GloptiPoly
- Global optimization with polynomials and the problem of moments
- Minimizing polynomials via sum of squares over the gradient ideal
- GloptiPoly 3: moments, optimization and semidefinite programming
- Sums of squares, moment matrices and optimization over polynomials
- Sparse SOS Relaxations for Minimizing Functions that are Summations of Small Polynomials
- Sums of Squares and Semidefinite Program Relaxations for Polynomial Optimization Problems with Structured Sparsity
- Convergent SDP‐Relaxations in Polynomial Optimization with Sparsity
- A note on sparse SOS and SDP relaxations for polynomial optimization problems over symmetric cones
- Optimality conditions and finite convergence of Lasserre's hierarchy
- Representations of positive polynomials on noncompact semialgebraic sets via KKT ideals
- The \(\mathcal A\)-truncated \(K\)-moment problem
- Linear optimization with cones of moments and nonnegative polynomials
- An introduction to polynomial and semi-algebraic optimization
- A note on the representation of positive polynomials with structured sparsity
- Certifying convergence of Lasserre's hierarchy via flat truncation
- Polynomial optimization with real varieties
- The saddle point problem of polynomials
- On the Lasserre hierarchy of semidefinite programming relaxations of convex polynomial optimization problems
- The geometry of SDP-exactness in quadratic optimization
- Tight relaxations for polynomial optimization and Lagrange multiplier expressions
- A complete semidefinite algorithm for detecting copositive matrices and tensors
- THE MOMENT-SOS HIERARCHY
- TSSOS: A Moment-SOS Hierarchy That Exploits Term Sparsity
- Sparse Polynomial Optimization
- Convex generalized Nash equilibrium problems and polynomial optimization
- A Lagrange Multiplier Expression Method for Bilevel Polynomial Optimization
- Moment and Polynomial Optimization
- Sparse polynomial optimisation for neural network verification
Cited In (2)