Convergence rates of accelerated proximal gradient algorithms under independent noise
DOI: 10.1007/s11075-018-0565-4 · zbMATH Open: 1420.90069 · OpenAlex: W2811382114 · Wikidata: Q129647654 (Scholia: Q129647654) · MaRDI QID: Q2420162 · FDO: Q2420162
Authors: Tao Sun, Roberto Barrio, Hao Jiang, Li-zhi Cheng
Publication date: 5 June 2019
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-018-0565-4
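The recorded publication studies accelerated proximal gradient (FISTA-type) methods when each gradient evaluation is corrupted by independent noise. As a rough illustration of that setting only — not the paper's actual algorithm, assumptions, or rate analysis — the following minimal sketch runs an accelerated proximal gradient iteration on a toy \(\ell_1\)-regularized quadratic, with optional zero-mean Gaussian noise injected into every gradient call (the problem instance and noise model here are illustrative choices):

```python
import math
import random

def soft_threshold(v, tau):
    # Proximal operator of tau*||.||_1, applied componentwise.
    return [math.copysign(max(abs(vi) - tau, 0.0), vi) for vi in v]

def noisy_fista(b, lam, sigma, steps, seed=0):
    """Accelerated proximal gradient (FISTA-style) iteration for
        min_x 0.5*||x - b||^2 + lam*||x||_1,
    where independent zero-mean Gaussian noise with standard deviation
    `sigma` is added to each gradient evaluation. A toy sketch: the
    paper's precise noise assumptions and rates are not reproduced."""
    rng = random.Random(seed)
    n = len(b)
    x = [0.0] * n   # current iterate
    y = [0.0] * n   # extrapolated (momentum) point
    t = 1.0         # Nesterov momentum parameter
    L = 1.0         # Lipschitz constant of grad f for f(x) = 0.5*||x - b||^2
    for _ in range(steps):
        # Noisy gradient of the smooth part at the extrapolated point.
        grad = [(yi - bi) + rng.gauss(0.0, sigma) for yi, bi in zip(y, b)]
        # Proximal gradient step with step size 1/L.
        x_new = soft_threshold([yi - gi / L for yi, gi in zip(y, grad)],
                               lam / L)
        # Standard FISTA momentum update.
        t_new = (1.0 + math.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = [xn + ((t - 1.0) / t_new) * (xn - xo)
             for xn, xo in zip(x_new, x)]
        x, t = x_new, t_new
    return x

# With sigma = 0 the iteration recovers the closed-form minimizer
# soft_threshold(b, lam); with sigma > 0 the iterates hover around it.
x = noisy_fista([3.0, -0.2, 1.5], lam=0.5, sigma=0.0, steps=200)
```

For this particular quadratic, the noiseless minimizer is `soft_threshold(b, lam) = [2.5, 0.0, 1.0]`; experimenting with `sigma > 0` shows the accuracy floor induced by gradient noise that analyses of this kind quantify.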
Recommendations
- Convergence rates of proximal gradient methods via the convex conjugate
- On stochastic accelerated gradient with convergence rate
- Convergence of stochastic proximal gradient algorithm
- Global convergence rate of proximal incremental aggregated gradient methods
- On convergence rates of proximal alternating direction method of multipliers
- Convergence rate analysis of proximal gradient methods with applications to composite minimization problems
- Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\)
- A note on the accelerated proximal gradient method for nonconvex optimization
- Accelerated gradient methods with absolute and relative noise in the gradient
- On the rate of convergence of the proximal alternating linearized minimization algorithm for convex problems
MSC classification:
- Nonconvex programming, global optimization (90C26)
- Nonlinear programming (90C30)
- Applications of operator theory in optimization, convex analysis, mathematical programming, economics (47N10)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Adaptive subgradient methods for online learning and stochastic optimization
- Exact matrix completion via convex optimization
- Incremental majorization-minimization optimization with application to large-scale machine learning
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- Compressed sensing
- A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator
- Statistical inverse problems: discretization, model reduction and inverse crimes
- An accelerated hybrid proximal extragradient method for convex optimization and its implications to second-order methods
- Accelerated and inexact forward-backward algorithms
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- An EM algorithm for wavelet-based image restoration
- First-order methods of smooth convex optimization with inexact oracle
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP
- Coordinate descent algorithms
- Nonlinear wavelet image processing: variational problems, compression, and noise removal through wavelet shrinkage
- A proximal stochastic gradient method with progressive variance reduction
- Inexact and accelerated proximal point algorithms
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Convergence of a proximal point method in the presence of computational errors in Hilbert spaces
- Numerical methods for nondifferentiable convex optimization
- Coupling the proximal point algorithm with approximation methods
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- Minimizing finite sums with the stochastic average gradient
- A new convergence analysis and perturbation resilience of some accelerated proximal forward-backward algorithms with errors
- Sparse wavelet representations of spatially varying blurring operators
- Precondition techniques for accelerated linearized Bregman algorithms
Cited In (4)
- Catalyst acceleration for first-order convex optimization: from theory to practice
- On the interplay between acceleration and identification for the proximal gradient algorithm
- A new convergence analysis and perturbation resilience of some accelerated proximal forward-backward algorithms with errors
- Accelerated gradient methods with absolute and relative noise in the gradient