On the R-superlinear convergence of the KKT residuals generated by the augmented Lagrangian method for convex composite conic programming
DOI: 10.1007/s10107-018-1300-6 · zbMATH: 1423.90171 · arXiv: 1706.08800 · OpenAlex: W2962772015 · Wikidata: Q129764526 · Scholia: Q129764526 · MaRDI QID: Q2330654
Ying Cui, Kim-Chuan Toh, Defeng Sun
Publication date: 22 October 2019
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1706.08800
Keywords: augmented Lagrangian method; quadratic growth condition; convex composite conic programming; implementable criteria; R-superlinear
MSC classification: Numerical mathematical programming methods (65K05); Semidefinite programming (90C22); Convex programming (90C25); Sensitivity, stability, parametric optimization (90C31)
Related Items
- Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method
- Perturbed augmented Lagrangian method framework with applications to proximal and smoothed variants
- Augmented Lagrangian methods for convex matrix optimization problems
- An investigation on semismooth Newton based augmented Lagrangian method for image restoration
- Zero-norm regularized problems: equivalent surrogates, proximal MM method and statistical error bound
- Strong Variational Sufficiency for Nonlinear Semidefinite Programming and Its Implications
- A semismooth Newton based dual proximal point algorithm for maximum eigenvalue problem
- An inexact projected gradient method with rounding and lifting by nonlinear programming for solving rank-one semidefinite relaxation of polynomial optimization
- Superlinear convergence of the sequential quadratic method in constrained optimization
- Local convergence analysis of augmented Lagrangian method for nonlinear semidefinite programming
- A Proximal Point Dual Newton Algorithm for Solving Group Graphical Lasso Problems
- An efficient augmented Lagrangian method with semismooth Newton solver for total generalized variation
- Convergence of the augmented decomposition algorithm
- The Linear and Asymptotically Superlinear Convergence Rates of the Augmented Lagrangian Method with a Practical Relative Error Criterion
- Kurdyka-Łojasiewicz property of zero-norm composite functions
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Efficient Sparse Semismooth Newton Methods for the Clustered Lasso Problem
- Augmented Lagrangian method for second-order cone programs under second-order sufficiency
- An Inexact Augmented Lagrangian Method for Second-Order Cone Programming with Applications
- A semismooth Newton-based augmented Lagrangian algorithm for density matrix least squares problems
- On Degenerate Doubly Nonnegative Projection Problems
- Hölderian Error Bounds and Kurdyka-Łojasiewicz Inequality for the Trust Region Subproblem
Uses Software
Cites Work
- Computing the nearest correlation matrix--a problem from finance
- Sufficient optimality conditions hold for almost all nonlinear semidefinite programs
- A practical relative error criterion for augmented Lagrangians
- SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints
- The augmented Lagrangian method for equality and inequality constraints in Hilbert spaces
- The rate of convergence of the augmented Lagrangian method for nonlinear semidefinite programming
- Metric subregularity and the proximal point method
- Extended convergence results for the method of multipliers for nonstrictly binding inequality constraints
- Use of augmented Lagrangian methods for the optimal control of obstacle problems
- Complementarity and nondegeneracy in semidefinite programming
- First and second order analysis of nonlinear semidefinite programs
- Sensitivity analysis of generalized equations
- Upper Lipschitz behavior of solutions to perturbed \(C^{1,1}\) programs
- Critical multipliers in variational systems via second-order generalized differentiation
- A unified approach to error bounds for structured convex optimization problems
- QSDPNAL: a two-phase augmented Lagrangian method for convex quadratic semidefinite programming
- Strong conical hull intersection property, bounded linear regularity, Jameson's property \((G)\), and error bounds in convex optimization
- Permeability estimation with the augmented Lagrangian method for a nonlinear diffusion equation
- A note on upper Lipschitz stability, error bounds, and critical multipliers for Lipschitz-continuous KKT systems
- Multiplier and gradient methods
- Local boundedness of nonlinear, monotone operators
- Bounds for the Distance to the Nearest Correlation Matrix
- Solving Nuclear Norm Regularized and Semidefinite Matrix Least Squares Problems with Linear Equality Constraints
- Local Convergence of Exact and Inexact Augmented Lagrangian Methods under the Second-Order Sufficient Optimality Condition
- Characterization of the Robust Isolated Calmness for a Class of Conic Programming Problems
- A Newton-CG Augmented Lagrangian Method for Semidefinite Programming
- A Globally Convergent Augmented Lagrangian Algorithm for Optimization with General Constraints and Simple Bounds
- Two-Metric Projection Methods for Constrained Optimization
- Asymptotic Convergence Analysis of the Proximal Point Algorithm
- A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix
- Global Convergence of a Class of Trust-Region Methods for Nonconvex Minimization in Hilbert Space
- Implicit Functions and Solution Mappings
- Some continuity properties of polyhedral multifunctions
- Monotone Operators and the Proximal Point Algorithm
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Variational Analysis
- Quadratic Growth Conditions for Convex Matrix Optimization Problems Associated with Spectral Functions
- A Complete Characterization of the Robust Isolated Calmness of Nuclear Norm Regularized Convex Optimization Problems
- Convergence Properties of an Augmented Lagrangian Algorithm for Optimization with a Combination of General Equality and Linear Constraints
- Linear Rate Convergence of the Alternating Direction Method of Multipliers for Convex Composite Programming
- Convex Analysis
- Local Convergence of the Proximal Point Algorithm and Multiplier Methods Without Monotonicity
- Constraint Nondegeneracy in Variational Analysis
- Some Properties of the Augmented Lagrangian in Cone Constrained Optimization
- On the generic properties of convex optimization problems in conic form
- A semismooth Newton-CG based dual PPA for matrix spectral norm approximation problems
- A note on Fejér-monotone sequences in product spaces and its applications to the dual convergence of augmented Lagrangian methods