A Correlatively Sparse Lagrange Multiplier Expression Relaxation for Polynomial Optimization
From MaRDI portal
Publication:6136658
Abstract: In this paper, we consider polynomial optimization with correlative sparsity. For such problems, one may apply correlative sparse sum of squares (CS-SOS) relaxations to exploit the correlative sparsity, which produce semidefinite programs of smaller size than the classical moment-SOS relaxations. The Lagrange multiplier expression (LME) is also a useful tool for constructing tight SOS relaxations of polynomial optimization problems, leading to convergence at a lower relaxation order. However, the correlative sparsity is usually corrupted by the LME-based reformulation. We propose a novel parametric LME, which we call the correlative sparse LME (CS-LME), that inherits the original correlative sparsity pattern. The reformulation using CS-LMEs can be solved with the CS-SOS relaxation to obtain sharper lower bounds, and convergence is guaranteed under mild conditions. Numerical examples are presented to show the efficiency of the new approach.
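The correlative sparsity exploited by CS-SOS relaxations comes from a "csp graph" whose vertices are the variables and whose edges connect variables that appear together in some monomial of the problem data; the small SDP blocks correspond to cliques of (a chordal extension of) this graph. The following is a minimal sketch of that pattern extraction, not the paper's implementation: the polynomial, its encoding as monomial supports, and the function name `csp_edges` are all illustrative assumptions.

```python
from itertools import combinations

# Illustrative example (not from the paper): the polynomial
#   f(x) = x1^2*x2 + x2*x3^2 + x3*x4
# encoded only by the set of variables appearing in each monomial;
# exponents are irrelevant for the correlative sparsity pattern.
monomials = [
    frozenset({"x1", "x2"}),
    frozenset({"x2", "x3"}),
    frozenset({"x3", "x4"}),
]

def csp_edges(monomials):
    """Edges of the correlative sparsity graph: one edge per pair of
    variables that co-occur in some monomial."""
    edges = set()
    for support in monomials:
        for u, v in combinations(sorted(support), 2):
            edges.add((u, v))
    return edges

print(sorted(csp_edges(monomials)))
# → [('x1', 'x2'), ('x2', 'x3'), ('x3', 'x4')]
```

For this chain-shaped graph the maximal cliques are exactly {x1, x2}, {x2, x3}, {x3, x4}, so a CS-SOS relaxation would work with three small variable blocks instead of one block over all four variables. The point of the paper's CS-LME construction is that the Lagrange multiplier expressions are built so that this clique structure is preserved by the reformulation.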
Recommendations
- Sums of Squares and Semidefinite Program Relaxations for Polynomial Optimization Problems with Structured Sparsity
- Tight relaxations for polynomial optimization and Lagrange multiplier expressions
- Generalized Lagrangian Duals and Sums of Squares Relaxations of Sparse Polynomial Optimization Problems
- Convergent SDP‐Relaxations in Polynomial Optimization with Sparsity
- Convergent SDP-relaxations for polynomial optimization with sparsity
Cites work
- scientific article; zbMATH DE number 527343
- scientific article; zbMATH DE number 554762
- scientific article; zbMATH DE number 1490041
- A Lagrange multiplier expression method for bilevel polynomial optimization
- A complete semidefinite algorithm for detecting copositive matrices and tensors
- A note on sparse SOS and SDP relaxations for polynomial optimization problems over symmetric cones
- A note on the representation of positive polynomials with structured sparsity
- An introduction to polynomial and semi-algebraic optimization
- Certifying convergence of Lasserre's hierarchy via flat truncation
- Convergent SDP‐Relaxations in Polynomial Optimization with Sparsity
- Convex generalized Nash equilibrium problems and polynomial optimization
- Detecting Global Optimality and Extracting Solutions in GloptiPoly
- Global optimization with polynomials and the problem of moments
- GloptiPoly 3: moments, optimization and semidefinite programming
- Linear optimization with cones of moments and nonnegative polynomials
- Minimizing polynomials via sum of squares over the gradient ideal
- Moment and Polynomial Optimization
- On the Lasserre hierarchy of semidefinite programming relaxations of convex polynomial optimization problems
- Optimality conditions and finite convergence of Lasserre's hierarchy
- Polynomial optimization with real varieties
- Representations of positive polynomials on noncompact semialgebraic sets via KKT ideals
- Sparse Polynomial Optimization
- Sparse SOS Relaxations for Minimizing Functions that are Summations of Small Polynomials
- Sparse polynomial optimisation for neural network verification
- Sums of Squares and Semidefinite Program Relaxations for Polynomial Optimization Problems with Structured Sparsity
- Sums of squares, moment matrices and optimization over polynomials
- TSSOS: A Moment-SOS Hierarchy That Exploits Term Sparsity
- The \(\mathcal A\)-truncated \(K\)-moment problem
- The geometry of SDP-exactness in quadratic optimization
- The moment-SOS hierarchy
- The saddle point problem of polynomials
- Tight relaxations for polynomial optimization and Lagrange multiplier expressions
Cited in (2)