New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
From MaRDI portal
Publication: 1659678
DOI: 10.1007/s10107-017-1164-1 · zbMath: 1403.90549 · arXiv: 1511.02974 · OpenAlex: W2964164027 · MaRDI QID: Q1659678
Publication date: 22 August 2018
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1511.02974
Classification (MSC)
- Analysis of algorithms and problem complexity (68Q25)
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
Related Items
- Faster first-order primal-dual methods for linear programming using restarts and sharpness
- Accelerated first-order methods for hyperbolic programming
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- Sharpness, Restart, and Acceleration
- Convergence rates of subgradient methods for quasi-convex optimization problems
- “Efficient” Subgradient Methods for General Convex Optimization
Cites Work
- Smooth minimization of non-smooth functions
- First-order algorithm with \(\mathcal{O}(\ln(1/\epsilon))\) convergence for \(\epsilon\)-equilibrium in two-person zero-sum games
- Smoothing technique and its applications in semidefinite optimization
- Weak sharp minima revisited. III: Error bounds for differentiable convex inclusions
- Perturbation theory of nonlinear programs when the set of optimal solutions is not a singleton
- Finite termination of the proximal point algorithm
- Introductory lectures on convex optimization. A basic course.
- A Gauss-Newton method for convex composite optimization
- Adaptive restart for accelerated gradient schemes
- Weak sharp minima revisited. II: Application to linear regularity and error bounds
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Smoothing and First Order Methods: A Unified Framework
- Weak Sharp Minima in Mathematical Programming
- Perturbation analysis of optimization problems in Banach spaces
- Hoffman's Error Bound, Local Controllability, and Sensitivity Analysis
- “Efficient” Subgradient Methods for General Convex Optimization
- Optimal stability and eigenvalue multiplicity