On the interplay between acceleration and identification for the proximal gradient algorithm
DOI: 10.1007/s10589-020-00218-7
zbMath: 1466.90097
arXiv: 1909.08944
OpenAlex: W3103865444
MaRDI QID: Q2023654
Franck Iutzeler, Gilles Bareilles
Publication date: 3 May 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1909.08944
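Since the page records only metadata, the following is a minimal, hypothetical sketch of the two notions in the title, assuming an ℓ1-regularized least-squares (lasso) objective: an accelerated (FISTA-type) proximal gradient iteration, and "identification" of the iterate's support via the soft-thresholding proximal step. It illustrates the topic only and is not the authors' implementation.

```python
# Hypothetical sketch: accelerated proximal gradient for
#   min_x 0.5*||A x - b||^2 + lam*||x||_1
# The support of the iterates (set of nonzero entries) eventually
# stabilizes; this is the "identification" property studied in the paper.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_proximal_gradient(A, b, lam, n_iter=500):
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)   # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # FISTA momentum sequence
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)     # inertial extrapolation
        x, t = x_new, t_new
    support = np.flatnonzero(x)              # identified active support
    return x, support
```

Calling `accelerated_proximal_gradient(A, b, lam)` on a random instance shows the returned support freezing after finitely many iterations, while the momentum term can transiently perturb it; the interplay between these two effects is the subject of the paper.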
Related Items (2)
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
- Newton acceleration on manifolds identified by proximal gradient methods
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Fast first-order methods for composite convex optimization with backtracking
- Optimality, identifiability, and sensitivity
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- From error bounds to the complexity of first-order descent methods for convex functions
- On the proximal gradient algorithm with alternated inertia
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Adaptive restart for accelerated gradient schemes
- Low Complexity Regularization of Linear Inverse Problems
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- Julia: A Fresh Approach to Numerical Computing
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- Activity Identification and Local Linear Convergence of Douglas–Rachford/ADMM under Partial Smoothness
- Geometrical interpretation of the predictor-corrector type algorithms in structured optimization problems
- On the Identification of Active Constraints
- On the Goldstein-Levitin-Polyak gradient projection method
- Sensitivity Analysis for Mirror-Stratifiable Convex Functions
- Model Consistency of Partly Smooth Regularizers
- First-Order Methods in Optimization
- A generic online acceleration scheme for optimization algorithms via relaxation and inertia
- Active Sets, Nonsmoothness, and Sensitivity
- De-noising by soft-thresholding
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Stable signal recovery from incomplete and inaccurate measurements
- Some methods of speeding up the convergence of iteration methods
- Convex analysis and monotone operator theory in Hilbert spaces
- An inertial proximal method for maximal monotone operators via discretization of a nonlinear oscillator with damping