Faster subgradient methods for functions with Hölderian growth
Publication: 2297653
DOI: 10.1007/s10107-018-01361-0
zbMATH Open: 1461.65161
arXiv: 1704.00196
OpenAlex: W3104046271
MaRDI QID: Q2297653
FDO: Q2297653
Authors: Patrick R. Johnstone, Pierre Moulin
Publication date: 20 February 2020
Published in: Mathematical Programming. Series A. Series B
Abstract: The purpose of this manuscript is to derive new convergence results for several subgradient methods applied to minimizing nonsmooth convex functions with Hölderian growth. The growth condition is satisfied in many applications and includes functions with quadratic growth and weakly sharp minima as special cases. To this end, there are three main contributions. First, for a constant and sufficiently small stepsize, we show that the subgradient method achieves linear convergence up to a certain region including the optimal set, with error of the order of the stepsize. Second, if appropriate problem parameters are known, we derive a decaying stepsize which obtains a much faster convergence rate than is suggested by the classical result for the subgradient method. Third, we develop a novel "descending stairs" stepsize which obtains this faster convergence rate and also obtains linear convergence for the special case of weakly sharp functions. We also develop an adaptive variant of the "descending stairs" stepsize which achieves the same convergence rate without requiring an error bound constant which is difficult to estimate in practice.
Full work available at URL: https://arxiv.org/abs/1704.00196
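The abstract's first contribution concerns the subgradient method with a constant stepsize. As a purely illustrative aid (a minimal Python sketch, not the authors' code; the function names and the test problem are assumptions chosen for illustration), the iteration \(x_{k+1} = x_k - \alpha g_k\) applied to \(f(x) = \|x\|_1\), a function with weakly sharp minima (Hölderian growth with exponent 1), behaves as the abstract describes:

import numpy as np

def subgradient_method(subgrad, x0, stepsize=0.05, iters=200):
    # Constant-stepsize subgradient iteration x_{k+1} = x_k - stepsize * g_k,
    # where g_k is any subgradient of the objective at x_k.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - stepsize * subgrad(x)
    return x

# f(x) = ||x||_1 has a weakly sharp minimum at 0; np.sign(x) is a valid subgradient.
x_final = subgradient_method(np.sign, x0=np.array([2.0, -3.0, 1.5]))
print(np.linalg.norm(x_final, 1))  # stalls at a value of the order of the stepsize

Consistent with the abstract, the iterates approach the solution set quickly and then stagnate in a region whose size scales with the stepsize; the decaying and "descending stairs" stepsizes proposed in the paper are designed to shrink this region while retaining the fast rate.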
Recommendations
- On Convergence Properties of a Subgradient Method
- Subgradient methods for sharp weakly convex functions
- A subgradient method with non-monotone line search
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- Convergence rates for deterministic and stochastic subgradient methods without Lipschitz continuity
Mathematics Subject Classification
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Nonlinear programming (90C30)
Cites Work
- Coordinate descent algorithms for lasso penalized regression
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Convex analysis and monotone operator theory in Hilbert spaces
- Convergence rate of incremental subgradient algorithms
- Robust Stochastic Approximation Approach to Stochastic Programming
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- On convergence rates of subgradient optimization methods
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Convergence of non-smooth descent methods using the Kurdyka-Łojasiewicz inequality
- Title not available
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Error bounds in mathematical programming
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Global error bounds for piecewise convex polynomials
- Title not available
- Finite termination of the proximal point algorithm
- Weak Sharp Minima in Mathematical Programming
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Online Learning with Kernels
- The effect of deterministic noise in subgradient methods
- First-order algorithm with \({\mathcal{O}(\ln(1/\epsilon))}\) convergence for \({\epsilon}\)-equilibrium in two-person zero-sum games
- Title not available
- An \(O(n)\) algorithm for projecting a vector on the intersection of a hyperplane and a box in \(\mathbb R^n\)
- Weak sharp minima revisited. I: Basic theory
- A three-operator splitting scheme and its optimization applications
- Asymptotic analysis of high-dimensional LAD regression with Lasso smoother
- Nonlinear programming methods in the presence of noise
- Linearly convergent away-step conditional gradient for non-strongly convex functions
- A unified approach to error bounds for structured convex optimization problems
- Faster subgradient methods for functions with Hölderian growth
- Activity Identification and Local Linear Convergence of Forward–Backward-type Methods
- “Efficient” Subgradient Methods for General Convex Optimization
- On the convergence rate for stochastic approximation in the nonsmooth setting
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
- A Geometrically Convergent Subgradient Optimization Method for Nonlinearly Constrained Convex Programs
Cited In (14)
- A subgradient method with constant step-size for \(\ell_1\)-composite optimization
- Exact penalties for decomposable convex optimization problems
- Stochastic algorithms with geometric step decay converge linearly on sharp functions
- A strict complementarity approach to error bound and sensitivity of solution of conic programs
- On Convergence Properties of a Subgradient Method
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- A simple nearly optimal restart scheme for speeding up first-order methods
- An Optimal-Storage Approach to Semidefinite Programming Using Approximate Complementarity
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
- Variable sample-size optimistic mirror descent algorithm for stochastic mixed variational inequalities
- Convergence rates of subgradient methods for quasi-convex optimization problems
- Faster subgradient methods for functions with Hölderian growth
- On the Simplicity and Conditioning of Low Rank Semidefinite Programs
- Radial duality. II: Applications and algorithms