An approach to constrained polynomial optimization via nonnegative circuit polynomials and geometric programming (Q1994127)

scientific article

    Statements

    An approach to constrained polynomial optimization via nonnegative circuit polynomials and geometric programming (English)
    1 November 2018
    The authors use the theory of sums of nonnegative circuit polynomials (`sonc-theory') developed in [\textit{S. Iliman} and \textit{T. de Wolff}, SIAM J. Optim. 26, No. 2, 1128--1146 (2016; Zbl 1380.12001)] to address the problem of constrained polynomial optimization. This time they carry ideas of \textit{M. Ghasemi} and \textit{M. Marshall} [``Lower bounds for a polynomial on a basic closed semialgebraic set using geometric programming'', Preprint, \url{arXiv 1311.3726}], developed for constrained optimization with geometric programming in the sum-of-squares setting, over into the sonc-setting. \par In the cited review the concepts of ST-polynomial and circuit polynomial are defined and the ideas of sonc-theory for finding lower bounds for $f^*$ are described; we need them in the following. \par Let $f(\mathbf x)$ and $g_1(\mathbf x),\ldots, g_s(\mathbf x)$ be elements of the polynomial ring $\mathbb R[\mathbf x]$, and let $K= \{{\mathbf x}: g_1({\mathbf x})\geq 0,\ldots, g_s({\mathbf x})\geq 0\}$ be a basic semi-algebraic set. Let $f_K^*:=\inf_{\mathbf x\in K} f(\mathbf x).$ The problem is to find good lower bounds for $f_K^*.$ \par The present setup is the following: consider for $\boldsymbol{\mu} \in [0,\infty[^s$ the polynomial \newline $G(\boldsymbol{\mu})({\mathbf x})= f-\sum_{i=1}^s \mu_i g_i= -\sum_{i=0}^s\mu_i g_i$ (with $\mu_0=1$, $g_0=-f$) and rewrite it as an ST-polynomial whenever the given $\boldsymbol{\mu} \in [0,\infty[^s$ makes this possible: \[ G(\boldsymbol{\mu})({\mathbf x})= \sum_{j=0}^r G(\boldsymbol{\mu})_{\alpha(j)} {\mathbf x}^{\alpha(j)} + \sum_{\beta \in \Delta} G(\boldsymbol{\mu})_{\beta} {\mathbf x}^{\beta}. \] Sonc-theory allows one, given $\boldsymbol{\mu}$, to find via the fast method of geometric programming a maximal real value $G(\boldsymbol{\mu})_{\text{sonc}}$ such that the polynomial $G(\boldsymbol{\mu})({\mathbf x})-G(\boldsymbol{\mu})_{\text{sonc}}{\mathbf x}^{\alpha(0)}$ is a sum of nonnegative circuit polynomials and hence nonnegative.
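The nonnegativity criterion for a single circuit polynomial that underlies this machinery can be made concrete: by Iliman and de Wolff [loc. cit.], a circuit polynomial $\sum_{j} c_j {\mathbf x}^{\alpha(j)} + d\,{\mathbf x}^{\beta}$ (vertex terms monomial squares, $\beta$ in the interior of their Newton polytope) is nonnegative if and only if $|d| \le \Theta = \prod_j (c_j/\lambda_j)^{\lambda_j}$, where $\lambda$ are the barycentric coordinates of $\beta$. A minimal numerical sketch by the reviewer for the Motzkin polynomial (the variable names are illustrative, not from the paper):

```python
import numpy as np

# Circuit data for the Motzkin polynomial
#   f = 1 + x^4 y^2 + x^2 y^4 - 3 x^2 y^2
vertices = np.array([[0, 0], [4, 2], [2, 4]], dtype=float)  # alpha(j)
coeffs = np.array([1.0, 1.0, 1.0])                          # c_j > 0
beta = np.array([2.0, 2.0])                                 # inner exponent
d = -3.0                                                    # inner coefficient

# Barycentric coordinates lambda of beta w.r.t. the vertices:
# solve  sum_j lambda_j * alpha(j) = beta,  sum_j lambda_j = 1.
A = np.vstack([vertices.T, np.ones(3)])
lam = np.linalg.solve(A, np.append(beta, 1.0))  # [1/3, 1/3, 1/3]

# Circuit number Theta = prod_j (c_j / lambda_j)^lambda_j
theta = np.prod((coeffs / lam) ** lam)
print(theta)  # 3.0, and |d| = 3 <= Theta, so f is nonnegative

# Numerical sanity check on a grid (the minimum 0 is attained at |x|=|y|=1)
x, y = np.meshgrid(np.linspace(-2, 2, 201), np.linspace(-2, 2, 201))
f = 1 + x**4 * y**2 + x**2 * y**4 - 3 * x**2 * y**2
print(f.min() >= -1e-9)  # True
```

The boundary case $|d| = \Theta$ seen here is exactly why the Motzkin polynomial is nonnegative but has minimum $0$.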
The most important case for finding a lower bound for $f_K^*$ is the case $\alpha(0)=0\in\mathbb Z_{\geq 0}^n.$ (If $G(\boldsymbol{\mu})({\mathbf x})$ cannot be written as an ST-polynomial, put $G(\boldsymbol{\mu})_{\text{sonc}}=-\infty.$) \par Let $s(f,{\mathbf g})=\sup\{G(\boldsymbol{\mu})_{\text{sonc}}: \boldsymbol{\mu} \in \mathbb{R}_{\geq 0}^s \}.$ Note that if $\alpha(0)=0,$ then for every $\boldsymbol{\mu}$ we have $f_K^*\geq G(\boldsymbol{\mu})_{\text{sonc}},$ and hence $s(f,{\mathbf g})$ can be expected to be a particularly good lower bound for $f_K^*$; but unfortunately the fact that the values $G(\boldsymbol{\mu})_{\text{sonc}}$ can be found fast by geometric programming does not mean that $s(f,{\mathbf g})$ can be found fast. \par In the last section of Iliman and de Wolff [loc. cit.] first inroads towards finding good lower bounds for $s(f,{\mathbf g})$ are made. There the authors developed a program which under specific conditions is signomial (i.e., geometric but with signs) and under much more stringent conditions is geometric. In the present paper they modify their previous program (which here carries the number (2.6)) into a program (3.2) which allows them to weaken the assumptions for geometricity. They also present a further program (3.3) which is again signomial and which works almost without additional assumptions. The programs are small modifications of each other, but the context and notation for each of them are far too technical to be presented here. As far as the reviewer knows, signomial programs are unfortunately much slower than geometric ones. The authors also indicate conditions under which the programs (3.2) and (3.3) yield the exact value $s(f,{\mathbf g})$. \par In Section 4 examples of constrained optimization via geometric programming and comparisons to Lasserre's relaxations are given. Let $f_{\text{sos}}^{(d)}$ be the $d$-th Lasserre relaxation of the constrained problem.
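The interplay of $\boldsymbol{\mu}$ and $G(\boldsymbol{\mu})_{\text{sonc}}$ can be seen in the simplest possible situation. Consider the reviewer's toy problem (not from the paper) of minimizing $f = x^4 - x^2$ over $K=\{x: 1-x^2\ge 0\}$, so that $f_K^* = -1/4$. Here $G(\mu)-\gamma = x^4 + (\mu-1)x^2 - \mu - \gamma$ is a single univariate circuit with vertices $0$ and $4$, inner exponent $2$, and barycentric weights $(1/2,1/2)$, so the circuit-number condition gives $G(\mu)_{\text{sonc}}$ in closed form; maximizing over a grid of $\mu$ approximates $s(f,\mathbf g)$:

```python
import numpy as np

# Toy problem (illustrative only): minimize f = x^4 - x^2
# over K = {x : g(x) = 1 - x^2 >= 0};  f_K^* = -1/4 at x^2 = 1/2.
# G(mu) - gamma = x^4 + (mu - 1) x^2 - mu - gamma is a single circuit
# with vertices {0, 4}, inner exponent 2, weights (1/2, 1/2),
# hence circuit number Theta = 2 * sqrt(constant * leading coefficient).

def g_sonc(mu):
    """Largest gamma with G(mu) - gamma a nonnegative circuit polynomial."""
    if mu >= 1.0:
        # inner coefficient mu - 1 >= 0: only the constant term must be >= 0
        return -mu
    # circuit-number condition: (1 - mu)^2 <= 4 * (-(mu + gamma))
    return -mu - (1.0 - mu) ** 2 / 4.0

mus = np.linspace(0.0, 2.0, 2001)
bounds = np.array([g_sonc(m) for m in mus])
s = bounds.max()  # grid approximation of s(f, g)
print(s)          # -0.25, attained at mu = 0
```

In this toy case the optimal bound $-1/4$ is already reached at $\mu=0$, since $x^4 - x^2 + 1/4 = (x^2 - 1/2)^2$ is itself a nonnegative circuit polynomial; in general a genuine search over $\boldsymbol{\mu}$ is needed, which is exactly the difficulty described above.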
These relaxations yield better and better lower bounds for $f_K^*$ as $d$ increases (provided the sequence converges, which is guaranteed, e.g., if $K$ is compact). \par The authors report the following results: 1. The famous Motzkin polynomial is optimized over a constrained but unbounded domain. Lasserre's 3rd relaxation yields $-\infty$, while sonc-theory and $f_{\text{sos}}^{(7)}$ yield the correct value $f_K^*$. 2. The polynomial $f=1+x^4y^2+xy$ is minimized over the variety defined by $\frac{1}{2}+x^2y^4-x^2y^6=0$. The first few Lasserre relaxations yield $-\infty$, while the eighth relaxation yields the correct value $0.4474$. The program (3.2) is geometric in this case and yields the correct value via the Matlab CVX solver immediately. If one modifies $f$ by multiplying the exponents by a factor of 10, one gets with GloptiPoly (i.e., Lasserre's relaxation) the correct result only after 18 minutes, while the GP/SONC approach is insensitive to the degree augmentation and yields the result again after a second. 3. A 5-term 3-variable degree 4 polynomial $f$ subject to a condition $g=0$ with another such polynomial $g$ is minimized. GloptiPoly gives the answer after 10 hours in the 20th relaxation without providing a certificate of optimality, while the GP/SONC approach yields the answer in a second. 4. A homogeneous Motzkin form on the unit sphere is minimized; on the grounds of sonc-theory one can say immediately that the minimum must be 0, although the problem is infeasible for program (3.2). \par In Section 5 the authors initiate a generalization of their theory to non-ST-polynomials. Consider unconstrained optimization and assume $f$ is a non-ST-polynomial. Then, by `triangulating' the set of exponents $\{\alpha(0),\ldots,\alpha(d)\}$ pertaining to monomial squares in $f$, it is still possible to write $f$ as a sum $f=\sum_{i=1}^k h_i$ where the $h_i$ are ST-polynomials.
For suitable triangulations, certain quantities $m_i^*$ can be extracted by applying program (2.6) to the $h_i$; these can then be used to obtain a lower bound for $f^*=\inf f({\mathbf x}).$ A first theoretical result is presented with proof, and an example with constrained optimization is also given. The authors promise a follow-up paper on the topic.
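The logic behind the splitting can be sketched as follows (a schematic reading by the reviewer; the precise definition of the $m_i^*$ is in the paper). If each $m_i^*$ obtained from program (2.6) satisfies $m_i^* \le \inf_{\mathbf x} h_i(\mathbf x)$, then
\[
f^* \;=\; \inf_{\mathbf x} \sum_{i=1}^k h_i(\mathbf x) \;\geq\; \sum_{i=1}^k \inf_{\mathbf x} h_i(\mathbf x) \;\geq\; \sum_{i=1}^k m_i^*,
\]
so the quality of the resulting bound depends on how much is lost in the first inequality, i.e., on how well the triangulation separates the interacting terms of $f$.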
    semidefinite programming
    sum of squares
    triangulation
    geometric programming
    nonnegative polynomial
    sum of nonnegative circuit polynomials
    certificate
