Discussion: Latent variable graphical model selection via convex optimization
Publication: 5970647
DOI: 10.1214/12-AOS985
zbMath: 1288.62090
arXiv: 1211.0813
MaRDI QID: Q5970647
Publication date: 7 March 2013
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1211.0813
MSC classifications:
- Asymptotic properties of parametric estimators (62F12)
- Estimation in multivariate analysis (62H12)
- Applications of graph theory (05C90)
- Convex programming (90C25)
Related Items (3)
- Learning Gaussian graphical models with latent confounders
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
Uses Software
Cites Work
- A well-conditioned estimator for large-dimensional covariance matrices
- High dimensional covariance matrix estimation using a factor model
- Covariance regularization by thresholding
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Sparsistency and rates of convergence in large covariance matrix estimation
- Gaussian Markov distributions over finite graphs
- Characterization of the subdifferential of some matrix norms
- On the distribution of the largest eigenvalue in principal components analysis
- Rejoinder: Latent variable graphical model selection via convex optimization
- Sparse permutation invariant covariance estimation
- High-dimensional covariance estimation by minimizing ℓ1-penalized log-determinant divergence
- Regularized estimation of large covariance matrices
- High-dimensional graphs and variable selection with the Lasso
- Exact matrix completion via convex optimization
- Nonparametric estimation of large covariance matrices of longitudinal data
- Robust principal component analysis?
- Solving Log-Determinant Optimization Problems by a Newton-CG Primal Proximal Point Algorithm
- Rank-Sparsity Incoherence for Matrix Decomposition
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- SDPT3 — A Matlab software package for semidefinite programming, Version 1.3
- Equivalent Subgradient Versions of Hamiltonian and Euler–Lagrange Equations in Variational Analysis
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using ℓ1-Constrained Quadratic Programming (Lasso)
- For most large underdetermined systems of linear equations the minimal ℓ1-norm solution is also the sparsest solution
- Compressed sensing