Publication:2023654: On the interplay between acceleration and identification for the proximal gradient algorithm
From MaRDI portal
DOI: 10.1007/S10589-020-00218-7
zbMATH Open: 1466.90097
arXiv: 1909.08944
OpenAlex: W3103865444
MaRDI QID: Q2023654
FDO: Q2023654
Franck Iutzeler, Gilles Bareilles
Publication date: 3 May 2021
Published in: Computational Optimization and Applications
Abstract: In this paper, we study the interplay between acceleration and structure identification for the proximal gradient algorithm. We report and analyze several cases where this interplay has negative effects on the algorithm's behavior (oscillation of the iterates, loss of structure, etc.). We present a generic method that tames acceleration when structure identification may be at stake; it benefits from a convergence rate that matches that of the accelerated proximal gradient under some qualifying condition. We show empirically that the proposed method is much more stable in terms of subspace identification than the accelerated proximal gradient method, while keeping a similar functional decrease.
Full work available at URL: https://arxiv.org/abs/1909.08944
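The interplay the abstract describes can be observed on a small example. Below is a minimal sketch, not the paper's proposed method: it runs the plain proximal gradient and its accelerated variant (FISTA) on a LASSO problem and tracks the support (nonzero pattern) of the iterates, which is the "identified structure" in this setting. The problem sizes, the regularization weight lam, and the iteration counts are illustrative assumptions.

# Minimal sketch (illustrative, not the paper's algorithm): compare how the
# plain proximal gradient and FISTA identify the support on a LASSO problem.
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 50                            # assumed problem sizes
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:3] = [1.0, -2.0, 1.5]            # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(n)
lam = 0.1 * np.max(np.abs(A.T @ b))      # assumed regularization weight
L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of x -> A^T(Ax - b)

def soft_threshold(v, t):
    """Prox of t * ||.||_1 (soft-thresholding); this creates the sparsity."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def grad(x):
    return A.T @ (A @ x - b)             # gradient of the smooth part

x_pg = np.zeros(d)                       # plain proximal gradient iterate
x_fista = y = np.zeros(d)                # FISTA iterate and extrapolated point
t = 1.0
for k in range(1, 301):
    # Forward-backward step for the plain method.
    x_pg = soft_threshold(x_pg - grad(x_pg) / L, lam / L)
    # Same step from the extrapolated point, then Nesterov momentum update.
    x_new = soft_threshold(y - grad(y) / L, lam / L)
    t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
    y = x_new + ((t - 1) / t_new) * (x_new - x_fista)
    x_fista, t = x_new, t_new
    if k % 60 == 0:
        # Comparing the support sizes along the run shows whether the inertia
        # delays or destabilizes identification of the final sparsity pattern.
        print(f"k={k:3d}  |supp PG|={np.count_nonzero(x_pg):2d}  "
              f"|supp FISTA|={np.count_nonzero(x_fista):2d}")

On such instances, the accelerated iterates may leave and re-enter the final support several times before settling, which is the oscillation phenomenon the paper analyzes and tames.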
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Julia: A Fresh Approach to Numerical Computing
- Convex analysis and monotone operator theory in Hilbert spaces
- Adaptive restart for accelerated gradient schemes
- First-Order Methods in Optimization
- Stable signal recovery from incomplete and inaccurate measurements
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- De-noising by soft-thresholding
- An inertial proximal method for maximal monotone operators via discretization of a nonlinear oscillator with damping
- On the Identification of Active Constraints
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Geometrical interpretation of the predictor-corrector type algorithms in structured optimization problems
- On the Goldstein-Levitin-Polyak gradient projection method
- Active Sets, Nonsmoothness, and Sensitivity
- Activity Identification and Local Linear Convergence of Douglas–Rachford/ADMM under Partial Smoothness
- Some methods of speeding up the convergence of iteration methods
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Fast first-order methods for composite convex optimization with backtracking
- Optimality, identifiability, and sensitivity
- On the proximal gradient algorithm with alternated inertia
- From error bounds to the complexity of first-order descent methods for convex functions
- The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than \(1/k^2\)
- Low Complexity Regularization of Linear Inverse Problems
- Activity Identification and Local Linear Convergence of Forward–Backward-type Methods
- Sensitivity Analysis for Mirror-Stratifiable Convex Functions
- Model Consistency of Partly Smooth Regularizers
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- A generic online acceleration scheme for optimization algorithms via relaxation and inertia
Cited In (3)
Recommendations
- Accelerated, Parallel, and Proximal Coordinate Descent
- On the Identification Property of a Projected Gradient Method
- Modified stochastic gradient identification algorithms with fast convergence rates
- On the proximal gradient algorithm with alternated inertia
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Accelerating block coordinate descent methods with identification strategies
- Provable accelerated gradient method for nonconvex low rank optimization
- Convergence rates of accelerated proximal gradient algorithms under independent noise
This page was built for publication: On the interplay between acceleration and identification for the proximal gradient algorithm