Ghost penalties in nonconvex constrained optimization: diminishing stepsizes and iteration complexity


DOI: 10.1287/MOOR.2020.1079
zbMATH Open: 1471.90137
arXiv: 1709.03384
OpenAlex: W3128430476
MaRDI QID: Q5000647

Lorenzo Lampariello, Gesualdo Scutari, Francisco Facchinei, Vyacheslav Kungurtsev

Publication date: 15 July 2021

Published in: Mathematics of Operations Research

Abstract: We consider nonconvex constrained optimization problems and propose a new approach to the convergence analysis based on penalty functions. We use classical penalty functions in an unconventional way: the penalty functions enter only the theoretical analysis of convergence, while the algorithm itself is penalty-free. Based on this idea, we establish several new results, including the first general analysis of diminishing stepsize methods in nonconvex constrained optimization, showing convergence to generalized stationary points, and a complexity study for SQP-type algorithms.
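To make the notion of a diminishing-stepsize method concrete, the sketch below runs projected gradient descent with stepsizes gamma_k = gamma0 / (k + 1), which satisfy the classical conditions sum gamma_k = infinity and sum gamma_k^2 < infinity. This is only an illustrative toy, not the paper's penalty-free SQP-type method: the box-constrained problem, the objective, and all parameter values are assumptions chosen so the example is self-contained and runnable.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi], componentwise."""
    return np.clip(x, lo, hi)

def diminishing_step_pg(grad, x0, lo, hi, gamma0=0.1, iters=5000):
    """Projected gradient with diminishing stepsizes gamma_k = gamma0/(k+1).

    These stepsizes are square-summable but not summable, the standard
    condition under which diminishing-stepsize schemes are analyzed.
    This is a generic textbook scheme, not the algorithm of the paper.
    """
    x = x0.copy()
    for k in range(iters):
        gamma = gamma0 / (k + 1)
        x = project_box(x - gamma * grad(x), lo, hi)
    return x

# Toy nonconvex objective f(x) = sum(x_i^4 - x_i^2) over the box [-2, 2]^2.
# Its unconstrained stationary points lie at 0 and +/- 1/sqrt(2) per coordinate.
grad = lambda x: 4.0 * x**3 - 2.0 * x
x_star = diminishing_step_pg(grad, np.array([1.5, -0.3]), -2.0, 2.0)
```

Started from (1.5, -0.3), the iterates drift toward the nearby stationary values +1/sqrt(2) and -1/sqrt(2); convergence is slow precisely because the stepsizes vanish, which is the regime whose general analysis the paper provides.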


Full work available at URL: https://arxiv.org/abs/1709.03384




Cited In (6)


