Correlative sparsity in primal-dual interior-point methods for LP, SDP, and SOCP
DOI: 10.1007/s00245-007-9030-9
zbMath: 1191.90032
OpenAlex: W1971164758
MaRDI QID: Q1024721
Kazuhiro Kobayashi, Sunyoung Kim, Masakazu Kojima
Publication date: 17 June 2009
Published in: Applied Mathematics and Optimization
Full work available at URL: https://doi.org/10.1007/s00245-007-9030-9
Keywords: chordal graph; linear program; partial separability; semidefinite program; primal-dual interior-point method; second-order cone program; correlative sparsity
Related Items
- Solving polynomial least squares problems via semidefinite programming relaxations
- Doubly nonnegative relaxations are equivalent to completely positive reformulations of quadratic optimization problems with block-clique graph structures
- Implementation of nonsymmetric interior-point methods for linear optimization over sparse matrix cones
- Enclosing ellipsoids and elliptic cylinders of semialgebraic sets and their application to error bounds in polynomial optimization
- An Iterative Scheme for Valid Polynomial Inequality Generation in Binary Polynomial Programming
- Exploiting sparsity in linear and nonlinear matrix inequalities via positive semidefinite matrix completion
- Exploiting Sparsity in SDP Relaxation of Polynomial Optimization Problems
- Latest Developments in the SDPA Family for Solving Large-Scale SDPs
- Sparse semidefinite programs with guaranteed near-linear time complexity via dualized clique tree conversion
- Bregman primal-dual first-order method and application to sparse semidefinite programming
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- A note on sparse SOS and SDP relaxations for polynomial optimization problems over symmetric cones
- Recognizing underlying sparsity in optimization
- Exploiting sparsity in primal-dual interior-point methods for semidefinite programming
- Exploiting sparsity in semidefinite programming via matrix completion. II: Implementation and numerical results
- Sparsity in sums of squares of polynomials
- Product-form Cholesky factorization in interior point methods for second-order cone programming
- An efficient trust region method for unconstrained discrete-time optimal control problems
- An implementation of Karmarkar's algorithm for linear programming
- Exploiting Sparsity in Semidefinite Programming via Matrix Completion I: General Framework
- Global Optimization with Polynomials and the Problem of Moments
- Solving Some Large Scale Semidefinite Programs via the Conjugate Residual Method
- Interior-Point Methods for the Monotone Semidefinite Linear Complementarity Problem in Symmetric Matrices
- Newton-Type Minimization via the Lanczos Method
- A Fast Algorithm for Reordering Sparse Matrices for Parallel Factorization
- Testing a Class of Methods for Solving Minimization Problems with Simple Bounds on the Variables
- Testing Unconstrained Optimization Software
- A modified Schur-complement method for handling dense columns in interior-point methods for linear programming
- Primal-Dual Path-Following Algorithms for Semidefinite Programming
- Primal-Dual Interior-Point Methods for Self-Scaled Cones
- Implementation of interior point methods for mixed semidefinite and second order cone optimization problems
- A convergence analysis of the scaling-invariant primal-dual path-following algorithms for second-order cone programming
- Solving Large Scale Semidefinite Programs via an Iterative Solver on the Augmented Systems
- Symmetric Quasidefinite Matrices
- An Interior-Point Method for Semidefinite Programming
- Generalized Lagrangian Duals and Sums of Squares Relaxations of Sparse Polynomial Optimization Problems
- Sums of Squares and Semidefinite Program Relaxations for Polynomial Optimization Problems with Structured Sparsity