Exact gradient methods with memory
Publication: 5058416
DOI: 10.1080/10556788.2022.2091559
OpenAlex: W4286008053
MaRDI QID: Q5058416
Publication date: 20 December 2022
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2022.2091559
Keywords: bundle, acceleration, gradient method, Bregman distance, relative smoothness, piece-wise linear model, composite problems
Cites Work
- First-order methods of smooth convex optimization with inexact oracle
- Fast first-order methods for composite convex optimization with backtracking
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Primal-dual first-order methods with O(1/ε) iteration-complexity for cone programming
- Introductory lectures on convex optimization. A basic course.
- Complexity bounds for primal-dual methods minimizing the model of objective function
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Linear convergence of first order methods for non-strongly convex optimization
- MM Optimization Algorithms
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- An Optimal First Order Method Based on Optimal Quadratic Averaging
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- An Accelerated Composite Gradient Method for Large-Scale Composite Objective Problems
- A Generalized Accelerated Composite Gradient Method: Uniting Nesterov's Fast Gradient Method and FISTA
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications