A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
DOI: 10.1007/s10107-011-0471-1 · zbMath: 1228.90052 · OpenAlex: W2057516887 · MaRDI QID: Q644904
Paul Tseng, Kim-Chuan Toh, Sangwoon Yun
Publication date: 7 November 2011
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-011-0471-1
Keywords: complexity; global convergence; convex optimization; maximum likelihood estimation; covariance selection; \(\ell_1\)-penalization; block coordinate gradient descent; linear rate convergence
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37); Decomposition methods (49M27); Methods of successive quadratic programming type (90C55)
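For orientation, the problem class named in the title and keywords is commonly written as a smooth-plus-separable composite minimization, with covariance selection as its \(\ell_1\)-penalized log-determinant instance; the display below uses standard notation (sample covariance \(S\), penalty weight \(\rho\)) and is illustrative rather than quoted from the paper:

\[
\min_{x} \; f(x) + c\,P(x),
\qquad\text{e.g.}\qquad
\min_{X \succ 0} \; -\log\det X + \langle S, X\rangle + \rho \sum_{i,j} |X_{ij}|,
\]

where \(f\) is smooth, \(P\) is convex and block-separable, and the second problem is maximum likelihood covariance selection with an \(\ell_1\) penalty.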
Cites Work
- Smooth minimization of non-smooth functions
- Sparse inverse covariance estimation with the graphical lasso
- A coordinate gradient descent method for nonsmooth separable minimization
- Convex analysis and nonlinear optimization. Theory and examples
- Efficient estimation of covariance selection models
- Adaptive First-Order Methods for General Sparse Inverse Covariance Selection
- Solving Log-Determinant Optimization Problems by a Newton-CG Primal Proximal Point Algorithm
- First-Order Methods for Sparse Covariance Selection
- Smooth Optimization Approach for Sparse Covariance Selection
- Numerical Optimization
- Covariance selection for nonchordal graphs via chordal embedding
- Convex Analysis
- Convergence of a block coordinate descent method for nondifferentiable minimization
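As a rough illustration of the method family (not the authors' algorithm: the paper treats a general smooth \(f\) via a per-block quadratic model with line search), here is a minimal coordinate gradient descent sketch for an \(\ell_1\)-regularized least-squares instance. The names `soft_threshold` and `coordinate_gradient_descent`, and the quadratic loss itself, are assumptions made for this example:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*|.|: shrink z toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def coordinate_gradient_descent(A, b, rho, n_iter=200):
    """Minimize 0.5*||Ax - b||^2 + rho*||x||_1, one coordinate at a time.

    Illustrative sketch only: the quadratic loss stands in for a general
    smooth f, so each coordinate step has a closed form.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)   # per-coordinate curvature ||A_j||^2
    r = A @ x - b                   # residual, kept up to date incrementally
    for _ in range(n_iter):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            g = A[:, j] @ r         # partial gradient of the smooth part
            x_new = soft_threshold(x[j] - g / col_sq[j], rho / col_sq[j])
            if x_new != x[j]:
                r += A[:, j] * (x_new - x[j])
                x[j] = x_new
    return x
```

Exact coordinate minimization is available here only because the smooth part is quadratic; for a general smooth \(f\) one minimizes a quadratic model of \(f\) plus the nonsmooth term over the chosen block, which is the step the paper analyzes.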