Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
Publication: 2128612
DOI: 10.1007/s00245-022-09859-y
zbMath: 1490.90223
OpenAlex: W4223897307
MaRDI QID: Q2128612
Publication date: 22 April 2022
Published in: Applied Mathematics and Optimization
Full work available at URL: https://doi.org/10.1007/s00245-022-09859-y
Keywords: optimization; rate of convergence; local Hölder error bound condition; monotone inertial forward-backward splitting algorithm
MSC classifications: Convex programming (90C25); Numerical optimization and variational techniques (65K10); Computing methodologies for image processing (68U10); Signal theory (characterization, reconstruction, filtering, etc.) (94A12); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
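For readers outside the area, the two technical ingredients in the title can be summarized briefly. The local Hölder error bound condition is, in the form standard in this literature (the paper's exact neighborhood and constants may differ), a Hölderian growth property bounding the distance to the solution set by a power of the objective gap:

```latex
% Local Hölder error bound for a proper convex F with minimum value
% F^* attained on X^* = argmin F: there exist kappa > 0 and
% theta in (0, 1] such that, for all x in a neighborhood of X^*,
\[
  \operatorname{dist}(x, X^{\ast}) \;\le\; \kappa \,\bigl(F(x) - F^{\ast}\bigr)^{\theta}.
\]
```

A monotone inertial forward-backward splitting method minimizes a composite objective F = f + g (f smooth, g with a computable proximal map) by a forward gradient step and a backward proximal step taken at an extrapolated (inertial) point, with a safeguard that keeps the objective values non-increasing. The sketch below, written for the LASSO instance min_x 0.5‖Ax − b‖² + λ‖x‖₁, follows the monotone FISTA pattern of Beck and Teboulle (cited below); all function names are illustrative, and it shows the algorithm class only, not the paper's specific scheme, whose step sizes and inertial parameters differ.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def monotone_inertial_fbs(A, b, lam, n_iter=500):
    """Illustrative monotone inertial forward-backward iteration for
    min_x 0.5*||A x - b||^2 + lam*||x||_1 (a sketch, not the paper's method)."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth part's gradient
    step = 1.0 / L
    def F(x):                                # full composite objective f + g
        return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))
    x = np.zeros(A.shape[1])                 # previous iterate x_{k-1}
    y = x.copy()                             # extrapolated (inertial) point y_k
    t = 1.0                                  # inertial parameter t_k
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                          # forward (gradient) step at y
        z = soft_threshold(y - step * grad, step * lam)   # backward (proximal) step
        x_new = z if F(z) <= F(x) else x                  # monotone safeguard: F never increases
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # inertial extrapolation in the monotone-FISTA style
        y = x_new + (t / t_new) * (z - x_new) + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

For example, with a random A of shape (40, 100) and b = A @ x_true for a sparse x_true, monotone_inertial_fbs(A, b, lam=0.1) returns a sparse estimate, and the monotone safeguard guarantees the objective values are non-increasing along the iterations.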
Cites Work
- Unnamed Item
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- The restricted strong convexity revisited: analysis of equivalence to error bound and quadratic growth
- Weak sharp minima revisited. III: Error bounds for differentiable convex inclusions
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the convergence of the coordinate descent method for convex differentiable minimization
- From error bounds to the complexity of first-order descent methods for convex functions
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Faster subgradient methods for functions with Hölderian growth
- Restricted strong convexity and its applications to convergence analysis of gradient-type methods in convex optimization
- Adaptive restart for accelerated gradient schemes
- Linear convergence of first order methods for non-strongly convex optimization
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Accelerated and Inexact Forward-Backward Algorithms
- iPiano: Inertial Proximal Algorithm for Nonconvex Optimization
- Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Convergence Rates of Inertial Forward-Backward Algorithms
- Inertial Variable Metric Techniques for the Inexact Forward--Backward Algorithm
- Convergence of Inexact Forward--Backward Algorithms Using the Forward--Backward Envelope
- Sharpness, Restart, and Acceleration
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- Signal Recovery by Proximal Forward-Backward Splitting
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- Convex analysis and monotone operator theory in Hilbert spaces