Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
Abstract: In this paper we develop accelerated first-order methods for convex optimization with locally Lipschitz continuous gradient (LLCG), which goes beyond the well-studied class of convex optimization with Lipschitz continuous gradient. In particular, we first consider unconstrained convex optimization with LLCG and propose accelerated proximal gradient (APG) methods for solving it. The proposed APG methods are equipped with a verifiable termination criterion and enjoy an operation complexity of $\mathcal{O}(\varepsilon^{-1/2}\log\varepsilon^{-1})$ and $\mathcal{O}(\log\varepsilon^{-1})$ for finding an $\varepsilon$-residual solution of an unconstrained convex and strongly convex optimization problem, respectively. We then consider constrained convex optimization with LLCG and propose a first-order proximal augmented Lagrangian method for solving it by applying one of our proposed APG methods to approximately solve a sequence of proximal augmented Lagrangian subproblems. The resulting method is equipped with a verifiable termination criterion and enjoys an operation complexity of $\mathcal{O}(\varepsilon^{-1}\log\varepsilon^{-1})$ and $\mathcal{O}(\varepsilon^{-1/2}\log\varepsilon^{-1})$ for finding an $\varepsilon$-KKT solution of a constrained convex and strongly convex optimization problem, respectively. All the methods proposed in this paper are parameter-free or almost parameter-free, except that knowledge of the convexity parameter is required. In addition, preliminary numerical results are presented to demonstrate the performance of our proposed methods. To the best of our knowledge, no prior studies have investigated accelerated first-order methods with complexity guarantees for convex optimization with LLCG. All the complexity results obtained in this paper are new.
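The paper's APG methods are not reproduced here; as a rough illustration of the underlying idea (an accelerated proximal gradient scheme whose step size adapts, via backtracking, to a local Lipschitz estimate of the gradient, with a verifiable residual-based stopping test), the following is a minimal Python sketch. It is an assumption-laden demonstration, not the algorithm from the paper: the FISTA-style update, the backtracking rule, the residual test, and the lasso test problem are standard textbook ingredients chosen only for illustration.

```python
# Illustrative sketch only: FISTA-style accelerated proximal gradient with
# backtracking, so the step size tracks a *local* Lipschitz estimate of
# grad f. This is NOT the paper's method; all names below are hypothetical.
import numpy as np

def apg_backtracking(f, grad_f, prox_h, x0, L0=1.0, eta=2.0,
                     tol=1e-6, max_iter=1000):
    """Minimize f(x) + h(x), with f convex and grad f locally Lipschitz."""
    x, y, t, L = x0.copy(), x0.copy(), 1.0, L0
    for _ in range(max_iter):
        g = grad_f(y)
        # Backtracking: grow the local Lipschitz estimate L until the
        # quadratic upper bound holds at the prox-gradient point.
        while True:
            x_new = prox_h(y - g / L, 1.0 / L)
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d) + 1e-12:
                break
            L *= eta
        # Verifiable stopping test: since L*(y - x_new) - g is a
        # subgradient of h at x_new, this norm bounds the residual
        # dist(0, grad_f(x_new) + subdiff h(x_new)).
        if np.linalg.norm(L * (y - x_new) + grad_f(x_new) - g) <= tol:
            return x_new
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Hypothetical usage: lasso, i.e. least squares plus an l1 term.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
lam = 0.1
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
prox_h = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)
x_star = apg_backtracking(f, grad_f, prox_h, np.zeros(10))
```

Because the backtracking loop only ever uses function and gradient evaluations, no global Lipschitz constant needs to be known in advance, which is the practical appeal of handling the LLCG setting; the paper's contribution is to pair such adaptivity with provable operation-complexity bounds.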
Recommendations
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- A note on the (accelerated) proximal gradient method for composite convex optimization
- A note on the accelerated proximal gradient method for nonconvex optimization
- Accelerated methods for nonconvex optimization
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
Cites work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A Modified Forward-Backward Splitting Method for Maximal Monotone Mappings
- A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications
- A forward-backward splitting method for monotone inclusions without cocoercivity
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Accelerated, parallel, and proximal coordinate descent
- Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
- Complexity of first-order inexact Lagrangian and penalty methods for conic convex programming
- Gradient methods for minimizing composite functions
- Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming
- Iteration-complexity of first-order augmented Lagrangian methods for convex programming
- Iteration-complexity of first-order penalty methods for convex programming
- Monotone Operators and the Proximal Point Algorithm
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- On the nonergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming
- Relatively smooth convex optimization by first-order methods, and applications
Cited in (3)
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Proximal gradient algorithms under local Lipschitz gradient continuity. A convergence and robustness analysis of PANOC
- Robustness of Accelerated First-Order Algorithms for Strongly Convex Optimization Problems