New analysis of linear convergence of gradient-type methods via unifying error bound conditions
DOI: 10.1007/s10107-018-01360-1 · zbMath: 1435.90111 · arXiv: 1606.00269 · OpenAlex: W2963154066 · MaRDI QID: Q2297652
Publication date: 20 February 2020
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1606.00269
Keywords: error bound; linear convergence; gradient descent; proximal point algorithm; forward-backward splitting algorithm; proximal alternating linearized minimization; Nesterov's acceleration; dual objective function; residual measure operator
MSC: Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Numerical methods involving duality (49M29); Numerical optimization and variational techniques (65K10)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- On the rate of convergence of the proximal alternating linearized minimization algorithm for convex problems
- Optimized first-order methods for smooth convex minimization
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- The restricted strong convexity revisited: analysis of equivalence to error bound and quadratic growth
- Iteration complexity analysis of block coordinate descent methods
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- Asymptotic convergence of nonlinear contraction semigroups in Hilbert space
- On equiwellset minimum problems
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Introductory lectures on convex optimization. A basic course.
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- From error bounds to the complexity of first-order descent methods for convex functions
- A unified approach to error bounds for structured convex optimization problems
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- Performance of first-order methods for smooth convex minimization: a novel approach
- A dual algorithm for a class of augmented convex signal recovery models
- Restricted strong convexity and its applications to convergence analysis of gradient-type methods in convex optimization
- Projected shrinkage algorithm for box-constrained $\ell_1$-minimization
- Linear convergence of first order methods for non-strongly convex optimization
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
- Augmented $\ell_1$ and Nuclear-Norm Models with a Globally Linearly Convergent Algorithm
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Asymptotic Convergence Analysis of the Proximal Point Algorithm
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- Implicit Functions and Solution Mappings
- Some continuity properties of polyhedral multifunctions
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Variational Analysis
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
- On the Convergence of a Regularized Jacobi Algorithm for Convex Optimization
- On the Estimation Performance and Convergence Rate of the Generalized Power Method for Phase Synchronization
- Convergence Rates of Inertial Forward-Backward Algorithms
- Second-order growth, tilt stability, and metric regularity of the subdifferential
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Metric subregularity of the convex subdifferential in Banach spaces
- Convex Analysis
- An introduction to continuous optimization for imaging
- Convex analysis and monotone operator theory in Hilbert spaces