Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
Publication: 2044481
DOI: 10.1007/s10589-021-00273-8
zbMath: 1473.90114
arXiv: 1808.03045
OpenAlex: W3140735157
MaRDI QID: Q2044481
Peter Richtárik, Lin Xiao, Filip Hanzely
Publication date: 9 August 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1808.03045
Keywords: convex optimization; Bregman divergence; proximal gradient methods; relative smoothness; accelerated gradient methods
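The keywords above refer to the Bregman proximal gradient framework, in which the Euclidean proximal step is replaced by one built from a Bregman divergence \(D_h\). As a purely illustrative sketch (not taken from the paper): with the negative-entropy kernel on the probability simplex, the step has the closed form of exponentiated gradient. The objective, step size, and function names below are hypothetical choices for the demonstration.

```python
import numpy as np

def bregman_prox_grad_step(x, grad, step):
    """One Bregman proximal gradient step with h(x) = sum_i x_i log x_i:
    argmin_u <grad, u> + (1/step) * D_h(u, x) over the simplex.
    With this kernel the minimizer is the exponentiated-gradient update."""
    y = x * np.exp(-step * grad)
    return y / y.sum()  # normalization re-projects onto the simplex

def minimize_on_simplex(grad_f, x0, step=0.5, iters=200):
    """Run the (unaccelerated) Bregman proximal gradient iteration;
    step and iteration count are illustrative, not tuned."""
    x = x0
    for _ in range(iters):
        x = bregman_prox_grad_step(x, grad_f(x), step)
    return x

# Hypothetical example: minimize f(x) = 0.5 * ||x - c||^2 over the simplex;
# since c already lies on the simplex, the minimizer is c itself.
c = np.array([0.7, 0.2, 0.1])
x = minimize_on_simplex(lambda x: x - c, np.ones(3) / 3)
```

This shows only the basic (non-accelerated) iteration; the paper's contribution concerns accelerated variants under relative smoothness, whose momentum construction is not reproduced here.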
Related Items (15)
- Exact gradient methods with memory
- Optimal complexity and certification of Bregman first-order methods
- Bregman Proximal Point Algorithm Revisited: A New Inexact Version and Its Inertial Variant
- Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
- Stochastic composition optimization of functions without Lipschitz continuous gradient
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Perturbed Fenchel duality and first-order methods
- First-order methods for convex optimization
- Data-Driven Mirror Descent with Input-Convex Neural Networks
- Local convexity of the TAP free energy and AMP convergence for \(\mathbb{Z}_2\)-synchronization
- Convergence Analysis for Bregman Iterations in Minimizing a Class of Landau Free Energy Functionals
- Bregman proximal mappings and Bregman-Moreau envelopes under relative prox-regularity
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Composite Optimization by Nonconvex Majorization-Minimization
- A dual Bregman proximal gradient method for relatively-strongly convex optimization
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Universal gradient methods for convex optimization problems
- An iterative row-action method for interval convex programming
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
- Proximal minimization algorithm with \(D\)-functions
- Introductory lectures on convex optimization. A basic course.
- A simplified view of first order methods for optimization
- Implementable tensor methods in unconstrained convex optimization
- Adaptive restart for accelerated gradient schemes
- A simple convergence analysis of Bregman proximal gradient algorithm
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Optimum Designs in Regression Problems
- Image deblurring with Poisson data: from cells to galaxies
- First-Order Methods in Optimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Optimal and Efficient Designs of Experiments
- Convex Analysis
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications