An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
From MaRDI portal
Publication: 2062324
DOI: 10.1007/s12532-021-00205-x
zbMath: 1476.90193
arXiv: 1909.09582
OpenAlex: W3186030596
MaRDI QID: Q2062324
Publication date: 27 December 2021
Published in: Mathematical Programming Computation
Full work available at URL: https://arxiv.org/abs/1909.09582
Keywords: large scale optimization; explicit inner termination rule; inexact augmented Lagrangian method; randomized first-order method; relative smoothness condition
MSC classifications: Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Numerical methods based on nonlinear programming (49M37)
Related Items
- First-Order Methods for Problems with $O(1)$ Functional Constraints Can Have Almost the Same Convergence Rate as for Unconstrained Problems
- Stochastic inexact augmented Lagrangian method for nonconvex expectation constrained optimization
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Efficient evaluation of scaled proximal operators
- Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- A first-order primal-dual algorithm for convex problems with applications to imaging
- An adaptive primal-dual framework for nonsmooth convex minimization
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Efficiency of minimizing compositions of convex functions and smooth maps
- Linear convergence of first order methods for non-strongly convex optimization
- Interior projection-like methods for monotone variational inequalities
- Smoothing and First Order Methods: A Unified Framework
- The Baillon-Haddad Theorem Revisited
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Accelerated, Parallel, and Proximal Coordinate Descent
- Monotone Operators and the Proximal Point Algorithm
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Atomic Decomposition by Basis Pursuit
- Suboptimal model predictive control (feasibility implies stability)
- A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Complexity of first-order inexact Lagrangian and penalty methods for conic convex programming
- Sparsity and Smoothness Via the Fused Lasso
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- Katyusha: the first direct acceleration of stochastic gradient methods
- Computational Complexity of Inexact Gradient Augmented Lagrangian Methods: Application to Constrained MPC
- On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
- Adaptive restart of accelerated gradient methods under local quadratic growth condition
- An Accelerated Linearized Alternating Direction Method of Multipliers
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Iteration-complexity of first-order augmented Lagrangian methods for convex programming