Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
DOI: 10.1007/s10107-022-01809-4
MaRDI QID: Q2687067
Guillaume Garrigos, Lorenzo Rosasco, Silvia Villa
Publication date: 1 March 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1703.09477
Keywords: convergence rates; inverse problems; source condition; forward backward algorithm; Łojasiewicz property; conditioned functions
90C25: Convex programming
65K10: Numerical optimization and variational techniques
49M27: Decomposition methods
47J26: Fixed-point iterations
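The forward-backward algorithm named in the title and keywords can be sketched as follows. This is a generic illustration (not code from the paper itself) of the proximal-gradient iteration applied to the lasso problem; the matrix, vectors, and parameter values below are purely illustrative:

```python
import numpy as np

def forward_backward(A, b, lam, step, iters=2000):
    """Forward-backward splitting for the lasso problem
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.

    Forward step:  explicit gradient step on the smooth least-squares term.
    Backward step: proximal map of lam * ||.||_1, i.e. soft-thresholding.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * (A.T @ (A @ x - b))                       # forward (gradient) step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # backward (prox) step
    return x

# Tiny noiseless example: recover a sparse vector (illustrative data only).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, with L the gradient's Lipschitz constant
x_hat = forward_backward(A, b, lam=0.01, step=step)
```

The worst-case rate for this scheme on a general convex objective is sublinear; the paper's point is that geometric conditions such as the Łojasiewicz property or conditioning yield faster (e.g. linear) convergence, which the strongly convex example above would exhibit.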
Related Items
- Thresholding gradient methods in Hilbert spaces: support identification and linear convergence
- Optimal Convergence Rates for Nesterov Acceleration
- Constrained Consensus-Based Optimization
- Fast convergence of inertial dynamics with Hessian-driven damping under geometry assumptions
- An adaptive consensus based method for multi-objective optimization with uniform Pareto front approximation
- A forward-backward algorithm with different inertial terms for structured non-convex minimization problems
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A mathematical introduction to compressive sensing
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- The Łojasiewicz gradient inequality in the infinite-dimensional Hilbert space framework
- Well-posed optimization problems
- A block coordinate variable metric forward-backward algorithm
- Linear convergence of iterative soft-thresholding
- New fractional error bounds for polynomial systems with applications to Hölderian stability in optimization and spectral theory of tensors
- Quadratic growth and critical point stability of semi-algebraic functions
- Cauchy's method of minimization
- Model selection for regularized least-squares algorithm in learning theory
- The restricted isometry property and its implications for compressed sensing
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- Convergence to equilibrium for the backward Euler scheme and applications
- Metric subregularity and the proximal point method
- Finite termination of the proximal point algorithm
- On equiwellset minimum problems
- Un exemple concernant le comportement asymptotique de la solution du problème \(du/dt+\partial\varphi(u)\ni 0\)
- On the convergence of von Neumann's alternating projection algorithm for two sets
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Conditioning and upper-Lipschitz inverse subdifferentials in nonsmooth optimization problems
- From error bounds to the complexity of first-order descent methods for convex functions
- A unified approach to error bounds for structured convex optimization problems
- Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods
- A projection method for least-squares solutions to overdetermined systems of linear inequalities
- The convex geometry of linear inverse problems
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Global error bounds for piecewise convex polynomials
- Prox-regularity of rank constraint sets and implications for algorithms
- Nonlinear local error bounds via a change of metric
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- On damped second-order gradient systems
- On proximity of Rayleigh quotients for different vectors and Ritz values generated by different trial subspaces
- Linear convergence of first order methods for non-strongly convex optimization
- On flow-invariant sets
- Conditioning convex and nonconvex problems
- On early stopping in gradient descent learning
- Convex Optimization in Normed Spaces
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- Certifying the Restricted Isometry Property is Hard
- Identifiable Surfaces in Constrained Optimization
- Weak Sharp Minima in Mathematical Programming
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Curvature Measures
- Asymptotic Convergence Analysis of the Proximal Point Algorithm
- Clarke Subgradients of Stratifiable Functions
- Implicit Functions and Solution Mappings
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- Applications of the method of partial inverses to convex programming: Decomposition
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Monotone Operators and the Proximal Point Algorithm
- Upper-Lipschitz multifunctions and inverse subdifferentials
- Sensitivity Analysis for Mirror-Stratifiable Convex Functions
- Model Consistency of Partly Smooth Regularizers
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- The radius of metric regularity
- Active Sets, Nonsmoothness, and Sensitivity
- Error Bounds for Piecewise Convex Quadratic Programs and Applications
- Prox-regular functions in variational analysis
- Hölder Metric Subregularity with Applications to Proximal Point Method
- Thresholding gradient methods in Hilbert spaces: support identification and linear convergence
- Second-order growth, tilt stability, and metric regularity of the subdifferential
- Quantitative Stability of Variational Systems II. A Framework for Nonlinear Conditioning
- The Variable Metric Forward-Backward Splitting Algorithm Under Mild Differentiability Assumptions
- Alternating Projections on Manifolds
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- On a characterization of flow-invariant sets
- Accelerated Iterative Regularization via Dual Diagonal Descent