Parameter-free FISTA by adaptive restart and backtracking
Publication: 6622751
DOI: 10.1137/23M158961X · MaRDI QID: Q6622751
Aude Rondepierre, Hippolyte Labarrière, Charles Dossal, Luca Calatroni, Jean-François Aujol
Publication date: 22 October 2024
Published in: SIAM Journal on Optimization
Classification: Numerical optimization and variational techniques (65K10); Convex programming (90C25); Nonlinear programming (90C30); Applications of functional analysis in optimization, convex analysis, mathematical programming, economics (46N10)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- iPiasco: inertial proximal algorithm for strongly convex optimization
- Introductory lectures on convex optimization. A basic course.
- Adaptive restart for accelerated gradient schemes
- Gradient methods for minimizing composite functions
- First-Order Methods in Optimization
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- Signal Recovery by Proximal Forward-Backward Splitting
- Accelerated and inexact forward-backward algorithms
- This is SPIRAL-TAP: Sparse Poisson Intensity Reconstruction ALgorithms—Theory and Practice
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- Some methods of speeding up the convergence of iteration methods
- On the convergence of the iterates of the ``fast iterative shrinkage/thresholding algorithm''
- An introduction to continuous optimization for imaging
- Fast first-order methods for composite convex optimization with backtracking
- Backtracking Strategies for Accelerated Descent Methods with Smooth Composite Objectives
- From error bounds to the complexity of first-order descent methods for convex functions
- Restricted strong convexity and its applications to convergence analysis of gradient-type methods in convex optimization
- Linear convergence of first order methods for non-strongly convex optimization
- The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than \(1/k^2\)
- Inertial Variable Metric Techniques for the Inexact Forward--Backward Algorithm
- A scaled and adaptive FISTA algorithm for signal-dependent sparse image super-resolution problems
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- Convergence Rates of Inertial Forward-Backward Algorithms
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
- Adaptive restart of accelerated gradient methods under local quadratic growth condition
- An adaptive accelerated proximal gradient method and its homotopy continuation for sparse optimization
- Sharpness, Restart, and Acceleration
- Improving “Fast Iterative Shrinkage-Thresholding Algorithm”: Faster, Smarter, and Greedier
- Scaled, Inexact, and Adaptive Generalized FISTA for Strongly Convex Optimization
- Convergence rates of the heavy-ball method under the Łojasiewicz property
- Accelerated Iterative Regularization via Dual Diagonal Descent
- A Generalized Accelerated Composite Gradient Method: Uniting Nesterov's Fast Gradient Method and FISTA
- Restart of Accelerated First-Order Methods With Linear Convergence Under a Quadratic Functional Growth Condition
- FISTA is an automatic geometrically optimized algorithm for strongly convex functions