From error bounds to the complexity of first-order descent methods for convex functions
DOI: 10.1007/s10107-016-1091-6 · zbMath: 1373.90076 · arXiv: 1510.08234 · OpenAlex: W2962826994 · MaRDI QID: Q1675251
Trong Phong Nguyen, Juan Peypouquet, Jérôme Bolte, Bruce W. Suter
Publication date: 27 October 2017
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1510.08234
Keywords: error bounds, convex minimization, LASSO, compressed sensing, forward-backward method, complexity of first-order methods, KL inequality
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Abstract computational complexity for mathematical programming problems (90C60)
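The keywords above center on the forward-backward (proximal gradient) method and its complexity under error-bound/KL-type conditions, with LASSO as a flagship application. For orientation only, here is a minimal sketch of forward-backward splitting applied to LASSO, not the paper's own presentation; the problem data `A`, `b`, the regularization weight `lam`, and the iteration count are illustrative placeholders.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def forward_backward_lasso(A, b, lam, num_iters=500):
    """Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Forward step: gradient descent on the smooth term 0.5*||Ax - b||^2.
    Backward step: proximal map of the nonsmooth term lam*||x||_1.
    Step size 1/L, where L = ||A||_2^2 is the gradient's Lipschitz constant.
    """
    L = np.linalg.norm(A, 2) ** 2
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                          # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)   # backward (proximal) step
    return x

# Illustrative use on a random compressed-sensing-style instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200)
x_true[:5] = 1.0
b = A @ x_true
x_hat = forward_backward_lasso(A, b, lam=0.1)
```

Under error-bound or Kurdyka-Łojasiewicz conditions of the kind studied in the paper, iterations like these converge linearly rather than at the generic sublinear rate; the sketch above makes no attempt to exploit that structure.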
Related Items
- Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification
- Convergence of an asynchronous block-coordinate forward-backward algorithm for convex composite optimization
- Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
- First-order methods for convex optimization
- On optimal universal first-order methods for minimizing heterogeneous sums
- On continuous selections of polynomial functions
- Convergence Rate Analysis of a Dykstra-Type Projection Algorithm
- Efficient iterative method for SOAV minimization problem with linear equality and box constraints and its linear convergence
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
- Stochastic quasi-subgradient method for stochastic quasi-convex feasibility problems
- Primal necessary characterizations of transversality properties
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- Interior quasi-subgradient method with non-Euclidean distances for constrained quasi-convex optimization problems in Hilbert spaces
- Local convergence of the heavy-ball method and iPiano for non-convex optimization
- Implicit error bounds for Picard iterations on Hilbert spaces
- Scaled relative graphs: nonexpansive operators via 2D Euclidean geometry
- Tensor Canonical Correlation Analysis With Convergence and Statistical Guarantees
- A family of functional inequalities: Łojasiewicz inequalities and displacement convex functions
- Approaching nonsmooth nonconvex optimization problems through first order dynamical systems with hidden acceleration and Hessian driven damping terms
- Kurdyka-Łojasiewicz exponent via inf-projection
- Constant step stochastic approximations involving differential inclusions: stability, long-run convergence and applications
- Multiple-sets split quasi-convex feasibility problems: Adaptive subgradient methods with convergence guarantee
- Neural network for nonsmooth pseudoconvex optimization with general convex constraints
- Error bounds for parametric polynomial systems with applications to higher-order stability analysis and convergence rates
- Perturbation of error bounds
- Quadratic growth conditions and uniqueness of optimal solution to Lasso
- The equivalence of three types of error bounds for weakly and approximately convex functions
- A simple globally convergent algorithm for the nonsmooth nonconvex single source localization problem
- Newton acceleration on manifolds identified by proximal gradient methods
- Error bounds, facial residual functions and applications to the exponential cone
- Fast convergence of inertial dynamics with Hessian-driven damping under geometry assumptions
- Nonlocal error bounds for piecewise affine functions
- A speed restart scheme for a dynamics with Hessian-driven damping
- Optimal non-asymptotic analysis of the Ruppert-Polyak averaging stochastic algorithm
- FISTA is an automatic geometrically optimized algorithm for strongly convex functions
- Thresholding gradient methods in Hilbert spaces: support identification and linear convergence
- Radial duality. II: Applications and algorithms
- Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems
- Screening Rules and its Complexity for Active Set Identification
- On the relationship between the Kurdyka-Łojasiewicz property and error bounds on Hadamard manifolds
- Linear Convergence of a Proximal Alternating Minimization Method with Extrapolation for \(\boldsymbol{\ell_1}\)-Norm Principal Component Analysis
- Convergence rates of the heavy-ball method under the Łojasiewicz property
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
- Revisiting the approximate Carathéodory problem via the Frank-Wolfe algorithm
- Variance reduction for root-finding problems
- Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization
- Iterative regularization via dual diagonal descent
- Optimal convergence rates for damped inertial gradient dynamics with flat geometries
- Dual block-coordinate forward-backward algorithm with application to deconvolution and deinterlacing of video sequences
- A simple nearly optimal restart scheme for speeding up first-order methods
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- Extragradient method in optimization: convergence and complexity
- Active Set Complexity of the Away-Step Frank-Wolfe Algorithm
- Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria
- Random Function Iterations for Consistent Stochastic Feasibility
- On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
- Local Minimizers of Semi-Algebraic Functions from the Viewpoint of Tangencies
- Convergence Rates of Damped Inertial Dynamics under Geometric Conditions and Perturbations
- The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient
- Convergence of inertial dynamics and proximal algorithms governed by maximally monotone operators
- Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
- On the proximal gradient algorithm with alternated inertia
- The restricted strong convexity revisited: analysis of equivalence to error bound and quadratic growth
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems
- On the interplay between acceleration and identification for the proximal gradient algorithm
- Constraint qualifications for Karush-Kuhn-Tucker conditions in multiobjective optimization
- Kurdyka-Łojasiewicz property of zero-norm composite functions
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- On the linear convergence of forward-backward splitting method. I: Convergence analysis
- Optimal Convergence Rates for Nesterov Acceleration
- Sharpness, Restart, and Acceleration
- Transversality properties: primal sufficient conditions
- Level-set subdifferential error bounds and linear convergence of Bregman proximal gradient method
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- A sufficient condition for asymptotically well behaved property of convex polynomials
- The modified second APG method for DC optimization problems
- Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity
- Novel Reformulations and Efficient Algorithms for the Generalized Trust Region Subproblem
- Alternating projections with applications to Gerchberg-Saxton error reduction
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
- Moduli of regularity and rates of convergence for Fejér monotone sequences
- Convergence rates of forward-Douglas-Rachford splitting method
- Accelerate stochastic subgradient method by leveraging local growth condition
- Computational approaches to non-convex, sparsity-inducing multi-penalty regularization
- Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions
- On linear convergence of non-Euclidean gradient methods without strong convexity and Lipschitz gradient continuity
- Distributed block-diagonal approximation methods for regularized empirical risk minimization
- Convergence rates of subgradient methods for quasi-convex optimization problems
- Inertial proximal incremental aggregated gradient method with linear convergence guarantees
- Quadratic optimization with orthogonality constraint: explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods
- Curiosities and counterexamples in smooth convex optimization
- Convergence rates for the heavy-ball continuous dynamics for non-convex optimization, under Polyak-Łojasiewicz condition
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
- Convergence Rate Analysis of a Sequential Convex Programming Method with Line Search for a Class of Constrained Difference-of-Convex Optimization Problems
- Deep Neural Networks for Inverse Problems with Pseudodifferential Operators: An Application to Limited-Angle Tomography
- The Exact Modulus of the Generalized Concave Kurdyka-Łojasiewicz Property
- Hölderian Error Bounds and Kurdyka-Łojasiewicz Inequality for the Trust Region Subproblem
- Avoiding bad steps in Frank-Wolfe variants
- First-Order Algorithms for a Class of Fractional Optimization Problems
- Proximal Gradient Methods for Machine Learning and Imaging
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- There is no variational characterization of the cycles in the method of periodic projections
- New fractional error bounds for polynomial systems with applications to Hölderian stability in optimization and spectral theory of tensors
- Nonlinear error bounds for lower semicontinuous functions on metric spaces
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- Finite termination of the proximal point algorithm
- Asymptotic convergence of nonlinear contraction semigroups in Hilbert space
- On gradients of functions definable in o-minimal structures
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On semi- and subanalytic geometry
- Error bounds for analytic systems and their applications
- Error bounds in mathematical programming
- Conditioning and upper-Lipschitz inverse subdifferentials in nonsmooth optimization problems
- Global error bounds with fractional exponents
- A Frank--Wolfe type theorem for convex polynomial programs
- The value function approach to convergence analysis in composite optimization
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Global error bounds for piecewise convex polynomials
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- On damped second-order gradient systems
- Linearly convergent away-step conditional gradient for non-strongly convex functions
- Error bounds for parametric polynomial systems with applications to higher-order stability analysis and convergence rates
- Majorization-Minimization Procedures and Convergence of SQP Methods for Semi-Algebraic and Tame Programs
- Global Hölderian Error Bound for Nondegenerate Polynomials
- Proximal Splitting Methods in Signal Processing
- Convex Optimization in Normed Spaces
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- Weak Sharp Minima in Mathematical Programming
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Clarke Subgradients of Stratifiable Functions
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- A Condition Number for Differentiable Convex Inequalities
- Global Regularity Theorems
- An Application of Error Bounds for Convex Programming in a Linear Space
- Monotone Operators and the Proximal Point Algorithm
- Extension of Hoffman’s Error Bound to Polynomial Systems
- Penalty functions in subanalytic optimization
- Sharp Estimates for Hoffman's Constant for Systems of Linear Inequalities and Equalities
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Convergence rate analysis and error bounds for projection algorithms in convex feasibility problems
- On Projection Algorithms for Solving Convex Feasibility Problems
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- Signal Recovery by Proximal Forward-Backward Splitting
- State Constraints in Convex Control Problems of Bolza
- Sur le problème de la division
- Convex analysis and monotone operator theory in Hilbert spaces