Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
From MaRDI portal
Publication:2044481
Abstract: We consider the problem of minimizing the sum of two convex functions: one is differentiable and relatively smooth with respect to a reference convex function, and the other can be nondifferentiable but simple to optimize. We investigate a triangle scaling property of the Bregman distance generated by the reference convex function and present accelerated Bregman proximal gradient (ABPG) methods that attain an \(O(k^{-\gamma})\) convergence rate, where \(\gamma \in (0, 2]\) is the triangle scaling exponent (TSE) of the Bregman distance. For the Euclidean distance, we have \(\gamma = 2\) and recover the convergence rate of Nesterov's accelerated gradient methods. For non-Euclidean Bregman distances, the TSE can be much smaller (say \(\gamma \leq 1\)), but we show that a relaxed definition of intrinsic TSE is always equal to 2. We exploit the intrinsic TSE to develop adaptive ABPG methods that converge much faster in practice. Although theoretical guarantees on fast convergence rates seem to be out of reach in general, our methods obtain empirical \(O(k^{-2})\) rates in numerical experiments on several applications and provide posterior numerical certificates for the fast rates.
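The basic (non-accelerated) Bregman proximal gradient iteration underlying this framework can be illustrated with the negative-entropy reference function, whose Bregman distance is the KL divergence: over the probability simplex, the proximal step has a closed-form multiplicative update. A minimal sketch, assuming a quadratic test objective chosen here for illustration (the function names and test problem are not from the paper, and the adaptive ABPG acceleration itself is not implemented):

```python
import numpy as np

def bregman_prox_grad(grad_f, x0, L, steps=200):
    """Plain Bregman proximal gradient with negative-entropy reference h,
    so D_h is the KL divergence. On the probability simplex the step
    x+ = argmin { <grad, x> + L * KL(x, x_k) }  has the closed form
    x+ proportional to x_k * exp(-grad / L)."""
    x = x0.copy()
    for _ in range(steps):
        g = grad_f(x)
        x = x * np.exp(-g / L)  # mirror step induced by the entropy
        x /= x.sum()            # renormalize: exact prox on the simplex
    return x

# Illustrative problem: minimize f(x) = 0.5*||A x - b||^2 over the simplex.
# f is L-smooth with L = ||A||_2^2, which also bounds its relative
# smoothness constant w.r.t. the entropy on the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad = lambda x: A.T @ (A @ x - b)
x0 = np.full(5, 0.2)
x_star = bregman_prox_grad(grad, x0, L=np.linalg.norm(A, 2) ** 2)
```

Each iteration costs one gradient evaluation plus elementwise operations; the descent lemma for relatively smooth functions guarantees the objective is monotonically nonincreasing under this step size.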
Recommendations
- A dual Bregman proximal gradient method for relatively-strongly convex optimization
- Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\)
- A simple convergence analysis of Bregman proximal gradient algorithm
- A note on the (accelerated) proximal gradient method for composite convex optimization
- On the linear convergence of a Bregman proximal point algorithm
Cites work
- scientific article; zbMATH DE number 3850830 (title not available)
- scientific article; zbMATH DE number 4079168 (title not available)
- scientific article; zbMATH DE number 3790208 (title not available)
- scientific article; zbMATH DE number 3296905 (title not available)
- scientific article; zbMATH DE number 3073200 (title not available)
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications
- A simple convergence analysis of Bregman proximal gradient algorithm
- A simplified view of first order methods for optimization
- Adaptive restart for accelerated gradient schemes
- An iterative row-action method for interval convex programming
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Convex Analysis
- First-order methods in optimization
- Gradient methods for minimizing composite functions
- Image deblurring with Poisson data: from cells to galaxies
- Implementable tensor methods in unconstrained convex optimization
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Introductory lectures on convex optimization. A basic course.
- Joint and separate convexity of the Bregman distance.
- Optimal and Efficient Designs of Experiments
- Optimum Designs in Regression Problems
- Proximal minimization algorithm with \(D\)-functions
- Relatively smooth convex optimization by first-order methods, and applications
- Smooth minimization of non-smooth functions
- Universal gradient methods for convex optimization problems
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
Cited in (27)
- Bregman proximal gradient algorithms for deep matrix factorization
- A telescopic Bregmanian proximal gradient method without the global Lipschitz continuity assumption
- The rate of convergence of Bregman proximal methods: local geometry versus regularity versus sharpness
- A Bregman proximal subgradient algorithm for nonconvex and nonsmooth fractional optimization problems
- First-order methods for convex optimization
- Data-Driven Mirror Descent with Input-Convex Neural Networks
- Accelerated Bregman operator splitting with backtracking
- Composite optimization by nonconvex majorization-minimization
- Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
- Optimal complexity and certification of Bregman first-order methods
- Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian
- Dual space preconditioning for gradient descent
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Local convexity of the TAP free energy and AMP convergence for \(\mathbb{Z}_2\)-synchronization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
- Convergence Analysis for Bregman Iterations in Minimizing a Class of Landau Free Energy Functionals
- A proximal ADMM with the Broyden family for convex optimization problems
- A mirror inertial forward-reflected-backward splitting: convergence analysis beyond convexity and Lipschitz smoothness
- Bregman proximal point algorithm revisited: a new inexact version and its inertial variant
- Bregman proximal mappings and Bregman-Moreau envelopes under relative prox-regularity
- Perturbed Fenchel duality and first-order methods
- Stochastic composition optimization of functions without Lipschitz continuous gradient
- Approximate Bregman proximal gradient algorithm for relatively smooth nonconvex optimization
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- Exact gradient methods with memory
- A dual Bregman proximal gradient method for relatively-strongly convex optimization