Post-processed posteriors for sparse covariances
From MaRDI portal
Publication: 6133369
Cites work
- A central limit theorem and a strong mixing condition
- A transformation approach for incorporating monotone or unimodal constraints
- Bayesian graphical Lasso models and efficient posterior computation
- Bayesian inference on order-constrained parameters in generalized linear models
- Bayesian monotone regression using Gaussian process projection
- Bayesian regularization for graphical models with unequal shrinkage
- Bayesian structure learning in graphical models
- Covariance regularization by thresholding
- Distribution of eigenvalues for some sets of random matrices
- Eigenvalue ratio test for the number of factors
- Estimating sparse precision matrix: optimal rates of convergence and adaptive estimation
- Estimation of conditional mean operator under the bandable covariance structure
- Generalized thresholding of large covariance matrices
- Large covariance estimation by thresholding principal orthogonal complements. With discussion and authors' reply
- Optimal rates of convergence for sparse covariance matrix estimation
- Regularized estimation of large covariance matrices
- Robust high-dimensional factor models with applications to statistical machine learning
- Sparse Bayesian infinite factor models
- Sparse precision matrix estimation via lasso penalized D-trace loss
- The beta-mixture shrinkage prior for sparse covariances with near-minimax posterior convergence rate
- The graphical lasso: new insights and alternatives