Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
DOI: 10.1137/22M1500496
zbMATH Open: 1522.90101
arXiv: 2206.01209
OpenAlex: W4386291915
MaRDI QID: Q6046830
FDO: Q6046830
Author: Zhaosong Lu
Publication date: 6 September 2023
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2206.01209
Recommendations
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- A note on the (accelerated) proximal gradient method for composite convex optimization
- A note on the accelerated proximal gradient method for nonconvex optimization
- Accelerated methods for nonconvex optimization
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
Keywords: convex optimization, proximal gradient method, iteration complexity, proximal augmented Lagrangian method, accelerated first-order methods, operation complexity, locally Lipschitz continuous gradient
MSC classification: Convex programming (90C25); Optimality conditions and duality in mathematical programming (90C46); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37)
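To illustrate the flavor of methods the keywords refer to, below is a minimal sketch (not the paper's algorithm) of a FISTA-style accelerated proximal gradient method with backtracking. Backtracking adapts the step size from a local Lipschitz estimate of the gradient, so no global Lipschitz constant is assumed; the function names, parameters, and the LASSO test problem are hypothetical choices for demonstration.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accel_prox_grad(f, grad_f, lam, x0, L0=1.0, max_iter=500, tol=1e-8):
    """Accelerated proximal gradient with backtracking for min f(x) + lam*||x||_1.

    The inner loop doubles the local Lipschitz estimate L until the standard
    quadratic upper-bound inequality holds at the trial point, so only local
    Lipschitz continuity of grad_f along the iterates is needed.
    """
    x, y, t, L = x0.copy(), x0.copy(), 1.0, L0
    for _ in range(max_iter):
        gy = grad_f(y)
        while True:
            x_new = soft_threshold(y - gy / L, lam / L)
            d = x_new - y
            # Descent condition: f(x_new) <= f(y) + <grad f(y), d> + (L/2)||d||^2.
            if f(x_new) <= f(y) + gy @ d + 0.5 * L * (d @ d):
                break
            L *= 2.0  # increase the local Lipschitz estimate
        # Nesterov momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x, t = x_new, t_new
    return x

# Hypothetical test problem: LASSO, min 0.5*||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
lam = 0.1
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
x_star = accel_prox_grad(f, grad_f, lam, np.zeros(100))
print("objective:", f(x_star) + lam * np.linalg.norm(x_star, 1))
```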
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
- Monotone Operators and the Proximal Point Algorithm
- Iteration-complexity of first-order penalty methods for convex programming
- Accelerated, parallel, and proximal coordinate descent
- A Modified Forward-Backward Splitting Method for Maximal Monotone Mappings
- A forward-backward splitting method for monotone inclusions without cocoercivity
- Iteration-complexity of first-order augmented Lagrangian methods for convex programming
- Complexity of first-order inexact Lagrangian and penalty methods for conic convex programming
- Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
- Relatively smooth convex optimization by first-order methods, and applications
- A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- On the nonergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming
Cited In (3)
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Proximal gradient algorithms under local Lipschitz gradient continuity. A convergence and robustness analysis of PANOC
- Robustness of Accelerated First-Order Algorithms for Strongly Convex Optimization Problems