Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems
Publication: 2082554
DOI: 10.1007/s10589-022-00396-6 · zbMath: 1502.90130 · OpenAlex: W4288081391 · MaRDI QID: Q2082554
Publication date: 4 October 2022
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-022-00396-6
Keywords: convex optimization; convergence rate; FISTA; inertial forward-backward algorithms; adaptive non-monotone stepsize strategy; proximal-based method
MSC: Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Numerical optimization and variational techniques (65K10); Signal theory (characterization, reconstruction, filtering, etc.) (94A12); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
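The recorded paper modifies FISTA (Beck and Teboulle, cited below as "A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems"). For orientation, here is a minimal sketch of the classical FISTA iteration applied to the lasso problem min 0.5·||Ax − b||² + λ·||x||₁ with a fixed 1/L stepsize; the paper's adaptive non-monotone stepsize strategy is not reproduced here, and all names below are illustrative.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, n_iter=200):
    """Classical FISTA for min 0.5*||Ax - b||^2 + lam*||x||_1,
    using the fixed stepsize 1/L with L = ||A||_2^2 (Lipschitz
    constant of the gradient of the smooth part)."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    y = x.copy()          # extrapolated point
    t = 1.0               # momentum parameter
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                          # gradient of smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)     # forward-backward (shrinkage) step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # Nesterov momentum update
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)     # inertial extrapolation
        x, t = x_new, t_new
    return x
```

The extrapolation step is what distinguishes FISTA from plain ISTA and yields the O(1/k²) rate on function values; the cited works on backtracking and linesearches replace the fixed 1/L stepsize with adaptive rules of the kind the recorded paper develops.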
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Convergence rates with inexact non-expansive operators
- Gradient methods for minimizing composite functions
- Fast first-order methods for composite convex optimization with backtracking
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- An inertial forward-backward algorithm for monotone inclusions
- An algorithm for total variation minimization and applications
- New error bounds and their applications to convergence analysis of iterative algorithms
- Templates for convex cone problems with applications to sparse signal recovery
- Convergence results of a new monotone inertial forward-backward splitting algorithm under the local Hölder error bound condition
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Convergence rates of forward-Douglas-Rachford splitting method
- Adaptive restart for accelerated gradient schemes
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- On the convergence of the forward–backward splitting method with linesearches
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Proximal Splitting Methods in Signal Processing
- Local Linear Convergence of ISTA and FISTA on the LASSO Problem
- Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Error Bound and Convergence Analysis of Matrix Splitting Algorithms for the Affine Variational Inequality Problem
- Sparse Reconstruction by Separable Approximation
- Convergence Rates of Inertial Forward-Backward Algorithms
- A Linearly Convergent Dual-Based Gradient Projection Algorithm for Quadratically Constrained Convex Minimization
- Signal Recovery by Proximal Forward-Backward Splitting
- Compressed sensing