Fast augmented Lagrangian method in the convex regime with convergence guarantees for the iterates
DOI: 10.1007/s10107-022-01879-4 · zbMATH: 1519.37068 · arXiv: 2111.09370 · Wikidata: Q114228405 · Scholia: Q114228405 · MaRDI QID: Q6044978
Radu Ioan Boţ, Dang-Khoa Nguyen, Ernö Robert Csetnek
Publication date: 25 May 2023
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2111.09370
Keywords: convergence rates; augmented Lagrangian method; iterates convergence; Nesterov's fast gradient method; primal-dual numerical algorithm
MSC: Analysis of algorithms and problem complexity (68Q25); Numerical mathematical programming methods (65K05); Convex programming (90C25); Numerical methods involving duality (49M29); Optimality conditions for problems involving ordinary differential equations (49K15); Approximation methods and numerical treatment of dynamical systems (37M99); Action-minimizing orbits and measures for finite-dimensional Hamiltonian and Lagrangian systems; variational principles; degree-theoretic methods (37J51)
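As a rough illustration of the paper's subject (not the authors' specific algorithm or its convergence analysis), the sketch below combines a classical augmented Lagrangian primal step with Nesterov-style extrapolation on the dual multiplier, for a linearly constrained convex problem min_x ½‖x − c‖² s.t. Ax = b. The function name, problem instance, and parameter choices are illustrative assumptions:

```python
import numpy as np

def fast_alm(c, A, b, rho=1.0, iters=200):
    """Illustrative accelerated augmented Lagrangian sketch for
    min_x 0.5*||x - c||^2  s.t.  A x = b.
    Primal step: exact minimizer of the augmented Lagrangian (closed form
    for this quadratic objective). Dual step: multiplier update followed by
    Nesterov/FISTA-style extrapolation. Not the paper's exact scheme."""
    m, n = A.shape
    lam = np.zeros(m)       # dual multiplier
    lam_hat = lam.copy()    # extrapolated multiplier
    t = 1.0                 # momentum parameter
    M = np.eye(n) + rho * A.T @ A  # Hessian of the augmented Lagrangian in x
    for _ in range(iters):
        # primal step: solve (I + rho A^T A) x = c - A^T lam_hat + rho A^T b
        x = np.linalg.solve(M, c - A.T @ lam_hat + rho * A.T @ b)
        # dual ascent step on the constraint residual
        lam_new = lam_hat + rho * (A @ x - b)
        # Nesterov extrapolation on the dual sequence
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        lam_hat = lam_new + ((t - 1.0) / t_new) * (lam_new - lam)
        lam, t = lam_new, t_new
    return x, lam

# example: project c = (1, 1) onto the line x1 + x2 = 1
x, lam = fast_alm(np.array([1.0, 1.0]),
                  np.array([[1.0, 1.0]]),
                  np.array([1.0]))
```

For this instance the constrained minimizer is the Euclidean projection of c onto {Ax = b}, i.e. x ≈ (0.5, 0.5), which the iterates approach as the dual sequence converges.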
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- On the ergodic convergence rates of a first-order primal-dual algorithm
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Conjugate duality in convex optimization
- Introductory lectures on convex optimization. A basic course.
- Fast convergence of dynamical ADMM via time scaling of damped inertial dynamics
- Inertial accelerated primal-dual methods for linear equality constrained convex optimization problems
- Applying FISTA to optimization problems (with or) without minimizers
- Improved convergence rates and trajectory convergence for primal-dual dynamical systems with vanishing damping
- Accelerated alternating direction method of multipliers: an optimal \(O(1 / K)\) nonergodic analysis
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Convergence Rates of Inertial Primal-Dual Dynamical Methods for Separable Convex Optimization Problems
- Applications of a Splitting Algorithm to Decomposition in Convex Programming and Variational Inequalities
- A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
- Convergence Rates of Inertial Forward-Backward Algorithms
- Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming
- Faster Lagrangian-Based Methods in Convex Optimization
- Fast Alternating Direction Optimization Methods
- Optimal Primal-Dual Methods for a Class of Saddle Point Problems
- An Accelerated Linearized Alternating Direction Method of Multipliers
- Convex analysis and monotone operator theory in Hilbert spaces