Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
DOI: 10.1007/s10589-021-00273-8 · zbMATH Open: 1473.90114 · arXiv: 1808.03045 · OpenAlex: W3140735157 · MaRDI QID: Q2044481
Authors: Filip Hanzely, Peter Richtárik, Lin Xiao
Publication date: 9 August 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1808.03045
Recommendations
- A dual Bregman proximal gradient method for relatively-strongly convex optimization
- Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\)
- A simple convergence analysis of Bregman proximal gradient algorithm
- A note on the (accelerated) proximal gradient method for composite convex optimization
- On the linear convergence of a Bregman proximal point algorithm
Keywords: convex optimization; Bregman divergence; proximal gradient methods; relative smoothness; accelerated gradient methods
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Smooth minimization of non-smooth functions
- Introductory lectures on convex optimization. A basic course.
- Adaptive restart for accelerated gradient schemes
- Title not available
- Gradient methods for minimizing composite functions
- First-order methods in optimization
- Title not available
- Title not available
- Convex Analysis
- Optimum Designs in Regression Problems
- Optimal and Efficient Designs of Experiments
- An iterative row-action method for interval convex programming
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
- Image deblurring with Poisson data: from cells to galaxies
- Title not available
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Proximal minimization algorithm with \(D\)-functions
- Universal gradient methods for convex optimization problems
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Title not available
- Joint and separate convexity of the Bregman distance.
- Relatively smooth convex optimization by first-order methods, and applications
- A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications
- A simplified view of first order methods for optimization
- Implementable tensor methods in unconstrained convex optimization
- A simple convergence analysis of Bregman proximal gradient algorithm
Cited In (27)
- Bregman proximal gradient algorithms for deep matrix factorization
- A telescopic Bregmanian proximal gradient method without the global Lipschitz continuity assumption
- The rate of convergence of Bregman proximal methods: local geometry versus regularity versus sharpness
- A Bregman proximal subgradient algorithm for nonconvex and nonsmooth fractional optimization problems
- First-order methods for convex optimization
- Data-Driven Mirror Descent with Input-Convex Neural Networks
- Accelerated Bregman operator splitting with backtracking
- Composite optimization by nonconvex majorization-minimization
- Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
- Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
- Dual space preconditioning for gradient descent
- Optimal complexity and certification of Bregman first-order methods
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Local convexity of the TAP free energy and AMP convergence for \(\mathbb{Z}_2\)-synchronization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Convergence Analysis for Bregman Iterations in Minimizing a Class of Landau Free Energy Functionals
- Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
- A proximal ADMM with the Broyden family for convex optimization problems
- A mirror inertial forward-reflected-backward splitting: convergence analysis beyond convexity and Lipschitz smoothness
- Bregman proximal point algorithm revisited: a new inexact version and its inertial variant
- Stochastic composition optimization of functions without Lipschitz continuous gradient
- Approximate Bregman proximal gradient algorithm for relatively smooth nonconvex optimization
- Perturbed Fenchel duality and first-order methods
- Bregman proximal mappings and Bregman-Moreau envelopes under relative prox-regularity
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Exact gradient methods with memory
- A dual Bregman proximal gradient method for relatively-strongly convex optimization