The effect of perturbations on the convergence rates of optimization algorithms
Publication:1051402
DOI: 10.1007/BF01448383 · zbMath: 0514.65050 · OpenAlex: W2018333764 · MaRDI QID: Q1051402
Publication date: 1983
Published in: Applied Mathematics and Optimization
Full work available at URL: https://doi.org/10.1007/bf01448383
Related Items (5)
- Convergence of algorithms for perturbed optimization problems
- Approximate quasi-Newton methods
- Newton-Goldstein convergence rates for convex constrained minimization problems with singular solutions
- Newton's method for singular constrained optimization problems
- Rates of convergence for adaptive Newton methods
Cites Work
- Extremal types for certain \(L^p\) minimization problems and associated large scale nonlinear programs
- Convergence Rates for Conditional Gradient Sequences Generated by Implicit Step Length Rules
- Global and Asymptotic Convergence Rate Estimates for a Class of Projected Gradient Processes
- On the Goldstein-Levitin-Polyak gradient projection method
- Rates of Convergence for Conditional Gradient Algorithms Near Singular and Nonsingular Extremals