On how to solve large-scale log-determinant optimization problems
From MaRDI portal
DOI: 10.1007/s10589-015-9812-y
zbMATH Open: 1350.90028
OpenAlex: W2132909497
MaRDI QID: Q288409
Publication date: 25 May 2016
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-015-9812-y
Keywords: quadratic programming; augmented Lagrangian method; log-determinant optimization problem; Newton-CG method; proximal augmented Lagrangian method
Cites Work
- Title not available
- Solving semidefinite-quadratic-linear programs using SDPT3
- Matrix Analysis
- A Newton-CG Augmented Lagrangian Method for Semidefinite Programming
- A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix
- First-Order Methods for Sparse Covariance Selection
- Convex Analysis
- Hankel matrix rank minimization with applications to system identification and realization
- Solving Log-Determinant Optimization Problems by a Newton-CG Primal Proximal Point Algorithm
- Alternating direction method for covariance selection models
- Title not available
- Monotone Operators and the Proximal Point Algorithm
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Title not available
- An Inexact Accelerated Proximal Gradient Method for Large Scale Linearly Constrained Convex SDP
- Proximité et dualité dans un espace hilbertien
- Title not available
- An inexact interior point method for \(L_{1}\)-regularized sparse covariance selection
- Complementarity and nondegeneracy in semidefinite programming
- Primal-dual path-following algorithms for determinant maximization problems with linear matrix inequalities
- Semismoothness of solutions to generalized equations and the Moreau-Yosida regularization
- On the monotonicity of the gradient of a convex function
- A proximal point algorithm for log-determinant optimization with group Lasso regularization
- Adaptive First-Order Methods for General Sparse Inverse Covariance Selection
- Calibrating Least Squares Semidefinite Programming with Equality and Inequality Constraints
- Smooth Optimization Approach for Sparse Covariance Selection
- A dual approach to solving nonlinear programming problems by unconstrained optimization
- Fused Multiple Graphical Lasso
- The Strong Second-Order Sufficient Condition and Constraint Nondegeneracy in Nonlinear Semidefinite Programming and Their Implications
- Covariance selection for nonchordal graphs via chordal embedding
- An inexact primal-dual path following algorithm for convex quadratic SDP
Cited In (3)
- A dual spectral projected gradient method for log-determinant semidefinite problems
- An interior point sequential quadratic programming-type method for log-determinant semi-infinite programs
- A primal majorized semismooth Newton-CG augmented Lagrangian method for large-scale linearly constrained convex programming