Gradient-Type Methods for Optimization Problems with Polyak-Łojasiewicz Condition: Early Stopping and Adaptivity to Inexactness Parameter
Publication: 6090757
DOI: 10.1007/978-3-031-22990-9_2
arXiv: 2212.04226
OpenAlex: W4313343023
MaRDI QID: Q6090757
Fedor S. Stonyakin, Ilya A. Kuruzov, Mohammad S. Alkousa
Publication date: 17 November 2023
Published in: Communications in Computer and Information Science
Full work available at URL: https://arxiv.org/abs/2212.04226
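For orientation, the sketch below illustrates the general setting named in the title: a gradient-type method run with an inexact (additively noisy) gradient oracle and a norm-based early-stopping rule, as is standard for L-smooth objectives satisfying the Polyak-Łojasiewicz condition. It is a minimal, assumed illustration, not the authors' algorithm from the paper; the function name `gradient_descent_inexact`, the stopping threshold `2 * delta`, and the toy quadratic objective are all choices made here for demonstration only.

```python
# Minimal sketch (assumption, not the paper's method): gradient descent with
# an inexact gradient oracle and early stopping at the noise level.
import numpy as np

def gradient_descent_inexact(grad, x0, L, delta, max_iter=10_000):
    """Iterate x_{k+1} = x_k - (1/L) * g(x_k), where g is an inexact gradient
    with additive error ||g(x) - grad f(x)|| <= delta.  Stop early once the
    observed gradient norm drops to the noise level (threshold 2*delta here,
    an illustrative choice)."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)                      # call the inexact gradient oracle
        if np.linalg.norm(g) <= 2.0 * delta:
            break                        # early stopping: signal ~ noise
        x = x - g / L                    # standard 1/L step size
    return x, k

# Toy usage: an L-smooth quadratic (which satisfies the PL condition) with
# simulated additive gradient noise of magnitude at most delta.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.diag([1.0, 10.0])             # f(x) = 0.5 * x^T A x, so L = 10
    delta = 1e-3
    noisy_grad = lambda x: A @ x + delta * rng.uniform(-1.0, 1.0, size=2)
    x_final, iters = gradient_descent_inexact(noisy_grad, [5.0, -3.0],
                                              L=10.0, delta=delta)
    print(iters, x_final)
```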
Cites Work
- First-order methods of smooth convex optimization with inexact oracle
- Universal gradient methods for convex optimization problems
- Lectures on convex optimization
- Global optimization with non-convex constraints. Sequential and parallel algorithms
- A geometric analysis of phase retrieval
- Safe global optimization of expensive noisy black-box functions in the \(\delta \)-Lipschitz framework
- Inverse and ill-posed problems. Theory and applications.
- Smooth Optimization with Approximate Gradient
- First-Order Methods in Optimization
- Sequential Subspace Optimization for Quasar-Convex Optimization Problems with Inexact Gradient
- Inexact model: a framework for optimization and variational inequalities
- Stopping rules for gradient methods for non-convex problems with additive noise in gradient