“Efficient” Subgradient Methods for General Convex Optimization
Publication: 5506689
DOI: 10.1137/15M1027371
zbMath: 1351.90129
arXiv: 1605.08712
MaRDI QID: Q5506689
Publication date: 13 December 2016
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1605.08712
Mathematics Subject Classification:
- Semidefinite programming (90C22)
- Convex programming (90C25)
- Numerical optimization and variational techniques (65K10)
Related Items (14)
- Exact penalties for decomposable convex optimization problems
- A Level-Set Method for Convex Optimization with a Feasible Solution Path
- New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
- A bound on the Carathéodory number
- Radial duality. I: Foundations
- Radial duality. II: Applications and algorithms
- Radial Subgradient Method
- A simple nearly optimal restart scheme for speeding up first-order methods
- Accelerated first-order methods for hyperbolic programming
- Amenable cones: error bounds without constraint qualifications
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- Certifying Polynomial Nonnegativity via Hyperbolic Optimization
- Faster subgradient methods for functions with Hölderian growth
- Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity
Cites Work
- Smooth minimization of non-smooth functions
- First-order algorithm with \({\mathcal{O}(\ln(1/\epsilon))}\) convergence for \({\epsilon}\)-equilibrium in two-person zero-sum games
- New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
- Accelerated first-order methods for hyperbolic programming
- Convex Analysis