A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
DOI: 10.1007/s10107-011-0471-1
zbMATH Open: 1228.90052
OpenAlex: W2057516887
MaRDI QID: Q644904
FDO: Q644904
Authors: Sangwoon Yun, Paul Tseng, Kim-Chuan Toh
Publication date: 7 November 2011
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-011-0471-1
Recommendations
- A coordinate gradient descent method for nonsmooth separable minimization
- Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization
- On the convergence of block coordinate descent type methods
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
Keywords: maximum likelihood estimation, convex optimization, covariance selection, complexity, global convergence, \(\ell_{1}\)-penalization, block coordinate gradient descent, linear rate convergence
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Methods of successive quadratic programming type (90C55); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37); Decomposition methods (49M27)
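The title and keywords describe a block coordinate gradient descent (BCGD) scheme for \(\ell_{1}\)-regularized convex separable problems such as covariance selection. As a rough illustration of that method class only (not the authors' algorithm, which uses a coordinatewise quadratic model and an Armijo line search), below is a minimal Python/NumPy sketch of randomized block proximal-gradient steps on a lasso-type objective; the names `bcgd_l1` and `soft_threshold`, the random block rule, and the fixed step size are all assumptions made for this sketch.

```python
# Minimal sketch (assumed, not the paper's exact method): block coordinate
# gradient descent for min_x f(x) + lam*||x||_1 with smooth convex f,
# using a fixed step and per-block soft-thresholding.
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t*||.||_1 (component-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bcgd_l1(grad_f, x0, lam, step, n_blocks, iters=200, seed=None):
    """Randomized block proximal-gradient steps on f + lam*||x||_1.

    grad_f : callable returning the gradient of the smooth part f
    step   : fixed step size (assumed <= 1/L, L a Lipschitz constant of grad_f)
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for _ in range(iters):
        j = rng.integers(n_blocks)          # pick one block at random
        g = grad_f(x)                       # gradient of the smooth part
        # gradient step on the chosen block, then soft-threshold that block
        x[blocks[j]] = soft_threshold(x[blocks[j]] - step * g[blocks[j]],
                                      step * lam)
    return x

# Toy usage on a lasso-type problem f(x) = 0.5*||Ax - b||^2
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((40, 20)), rng.standard_normal(40)
    grad_f = lambda x: A.T @ (A @ x - b)
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L via the spectral norm of A
    x = bcgd_l1(grad_f, np.zeros(20), lam=0.5, step=step, n_blocks=4)
    print("nonzero coordinates:", np.count_nonzero(np.round(x, 6)))
```

The soft-thresholding step is the exact proximal map of the \(\ell_{1}\) term, so each block update solves the regularized subproblem in closed form; the paper's method additionally scales the quadratic model and selects step sizes by line search.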
Cites Work
- Numerical Optimization
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Smooth minimization of non-smooth functions
- Sparse inverse covariance estimation with the graphical lasso
- Title not available
- First-Order Methods for Sparse Covariance Selection
- Convex Analysis
- A coordinate gradient descent method for nonsmooth separable minimization
- Solving log-determinant optimization problems by a Newton-CG primal proximal point algorithm
- Convex analysis and nonlinear optimization. Theory and examples
- Title not available
- Adaptive First-Order Methods for General Sparse Inverse Covariance Selection
- Smooth Optimization Approach for Sparse Covariance Selection
- Covariance selection for nonchordal graphs via chordal embedding
- Efficient estimation of covariance selection models
- Local strong convexity and local Lipschitz continuity of the gradient of convex functions
Cited In (12)
- Variable selection with group Lasso approach: application to Cox regression with frailty model
- SymNMF: nonnegative low-rank approximation of a similarity matrix for graph clustering
- A half-quadratic block-coordinate descent method for spectral estimation
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- On the convergence of block coordinate descent type methods
- Dimensionality reduction and variable selection in multivariate varying-coefficient models with a large number of covariates
- Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization
- Title not available
- Improved Estimation of High-dimensional Additive Models Using Subspace Learning
- A coordinate gradient descent method for nonsmooth separable minimization
- Numerical experiments on stochastic block proximal-gradient type method for convex constrained optimization involving coordinatewise separable problems