A block coordinate gradient descent method for regularized convex separable optimization and covariance selection
Keywords: maximum likelihood estimation; convex optimization; covariance selection; complexity; global convergence; \(\ell _{1}\)-penalization; block coordinate gradient descent; linear rate convergence
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Methods of successive quadratic programming type (90C55); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Numerical methods based on nonlinear programming (49M37); Decomposition methods (49M27)
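The article's subject is a block coordinate gradient descent method for problems of the form \(\min_x f(x) + \lambda \|x\|_1\) with smooth convex \(f\), which arise in covariance selection. As an illustrative sketch only (not the paper's exact algorithm), the following Python code applies a coordinate-wise proximal gradient step with soft-thresholding to an \(\ell_1\)-regularized least-squares instance; all function names here are assumptions made for the example.

```python
# Illustrative sketch of coordinate gradient descent for
#   min_x 0.5*||A x - b||^2 + lam*||x||_1,
# cycling over coordinates and applying a soft-thresholding
# (proximal) update to each one. Hypothetical example code,
# not the algorithm from the cited paper.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*|.|: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def coordinate_gradient_descent(A, b, lam, n_iter=200):
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)  # per-coordinate curvature ||A_j||^2
    r = A @ x - b                  # running residual A x - b
    for _ in range(n_iter):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            g = A[:, j] @ r        # partial gradient w.r.t. x_j
            # gradient step on coordinate j, then soft-threshold
            x_new = soft_threshold(x[j] - g / col_sq[j], lam / col_sq[j])
            r += A[:, j] * (x_new - x[j])  # keep residual consistent
            x[j] = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 10))
    x_true = np.zeros(10)
    x_true[:3] = [2.0, -1.5, 1.0]
    b = A @ x_true
    x_hat = coordinate_gradient_descent(A, b, lam=0.1)
    print(np.round(x_hat, 2))      # sparse estimate near x_true
```

In the covariance selection setting treated by the paper, the smooth term is a log-determinant likelihood rather than a least-squares residual, and coordinates are updated in blocks; the soft-thresholding structure of the \(\ell_1\) subproblem is the common ingredient.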
Related items:
- A coordinate gradient descent method for nonsmooth separable minimization
- Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization
- On the convergence of block coordinate descent type methods
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
References:
- scientific article; zbMATH DE number 3850830
- scientific article; zbMATH DE number 1953446
- A coordinate gradient descent method for nonsmooth separable minimization
- Adaptive First-Order Methods for General Sparse Inverse Covariance Selection
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Convex Analysis
- Convex analysis and nonlinear optimization. Theory and examples
- Covariance selection for nonchordal graphs via chordal embedding
- Efficient estimation of covariance selection models
- First-Order Methods for Sparse Covariance Selection
- Local strong convexity and local Lipschitz continuity of the gradient of convex functions
- Numerical Optimization
- Smooth Optimization Approach for Sparse Covariance Selection
- Smooth minimization of non-smooth functions
- Solving log-determinant optimization problems by a Newton-CG primal proximal point algorithm
- Sparse inverse covariance estimation with the graphical lasso
Cited by:
- A coordinate gradient descent method for nonsmooth separable minimization
- Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization
- Numerical experiments on stochastic block proximal-gradient type method for convex constrained optimization involving coordinatewise separable problems
- Variable selection with group Lasso approach: application to Cox regression with frailty model
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- Dimensionality reduction and variable selection in multivariate varying-coefficient models with a large number of covariates
- Improved Estimation of High-dimensional Additive Models Using Subspace Learning
- Smooth Optimization Approach for Sparse Covariance Selection
- SymNMF: nonnegative low-rank approximation of a similarity matrix for graph clustering
- On the convergence of block coordinate descent type methods
- A half-quadratic block-coordinate descent method for spectral estimation
- scientific article; zbMATH DE number 7415090
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
MaRDI item Q644904