A superlinearly convergent splitting feasible sequential quadratic optimization method for two-block large-scale smooth optimization
DOI: 10.1007/s10473-023-0101-z
OpenAlex: W4284961056
MaRDI QID: Q2088141
Pengjie Liu, Chen Zhang, Jin-Bao Jian
Publication date: 21 October 2022
Published in: Acta Mathematica Scientia. Series B. (English Edition)
Full work available at URL: https://doi.org/10.1007/s10473-023-0101-z
Keywords: superlinear convergence; splitting method; large scale optimization; feasible sequential quadratic optimization method; two-block smooth optimization
MSC classifications: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Decomposition methods (49M27)
Related Items (2)
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Fast alternating linearization methods for minimizing the sum of two convex functions
- Monotone splitting sequential quadratic optimization algorithm with applications in electric power systems
- A superlinearly convergent implicit smooth SQP algorithm for mathematical programs with nonlinear complementarity constraints
- An efficient feasible SQP algorithm for inequality constrained optimization
- Global convergence of an SQP method without boundedness assumptions on any of the iterative sequences
- Ergodic convergence to a zero of the sum of monotone operators in Hilbert space
- A globally convergent QP-free algorithm for inequality constrained minimax optimization
- A QCQP-based splitting SQP algorithm for two-block nonconvex constrained optimization problems with application
- On non-ergodic convergence rate of Douglas-Rachford alternating direction method of multipliers
- Regularized Jacobi-type ADMM-methods for a class of separable convex optimization problems in Hilbert spaces
- On the $O(1/n)$ Convergence Rate of the Douglas–Rachford Alternating Direction Method
- A Strictly Contractive Peaceman–Rachford Splitting Method for Convex Programming
- Convergence Analysis of Alternating Direction Method of Multipliers for a Family of Nonconvex Problems
- Convergence Study on the Symmetric Version of ADMM with Larger Step Sizes
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- Global Convergence of Splitting Methods for Nonconvex Composite Optimization
- A New Superlinearly Convergent Strongly Subfeasible Sequential Quadratic Programming Algorithm for Inequality-Constrained Optimization
- A successive quadratic programming algorithm with global and superlinear convergence properties
- A Superlinearly Convergent Feasible Method for the Solution of Inequality Constrained Optimization Problems
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Rate of Convergence Analysis of Decomposition Methods Based on the Proximal Method of Multipliers for Convex Minimization
- Fast Alternating Direction Optimization Methods
- Linear Rate Convergence of the Alternating Direction Method of Multipliers for Convex Composite Programming
- Iteration-Complexity of Block-Decomposition Algorithms and the Alternating Direction Method of Multipliers
- Object-oriented software for quadratic programming
- A Nonlinear Alternating Direction Method
- A quadratically-convergent algorithm for general nonlinear programming problems