Bayesian bandwidth test and selection for high-dimensional banded precision matrices
From MaRDI portal
Publication:2226705
Abstract: Assuming a banded structure is a common practice in the estimation of high-dimensional precision matrices. In this case, estimating the bandwidth of the precision matrix is a crucial initial step for subsequent analysis. Although some consistent frequentist tests for the bandwidth parameter exist, bandwidth selection consistency for precision matrices has not been established in a Bayesian framework. In this paper, we propose a prior distribution tailored to bandwidth estimation for high-dimensional precision matrices. The banded structure is imposed via the Cholesky factor from the modified Cholesky decomposition. We establish strong model selection consistency for the bandwidth as well as consistency of the Bayes factor. Convergence rates for the Bayes factor under both the null and alternative hypotheses are derived and shown to be of similar order. As a by-product, we also propose an estimation procedure for the Cholesky factors that attains a nearly optimal convergence rate. A two-sample bandwidth test is also considered, and our method turns out to consistently detect equality of the bandwidths of two precision matrices. A simulation study confirms that our method generally outperforms existing frequentist and Bayesian methods.
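The abstract's key structural device, imposing bandedness through the modified Cholesky decomposition \(\Omega = (I - A)^\top D^{-1} (I - A)\), can be illustrated with a short numerical sketch. The construction below is a hypothetical example (function names, coefficient ranges, and the tolerance are illustrative, not taken from the paper): it builds a precision matrix whose Cholesky factor has bandwidth \(k\), then recovers that bandwidth from the support of the lower Cholesky factor.

```python
import numpy as np

def make_banded_precision(p, k, seed=0):
    # Hypothetical construction via the modified Cholesky decomposition:
    # Omega = (I - A)^T D^{-1} (I - A), where A is strictly lower triangular
    # with bandwidth k (A[i, j] = 0 whenever i - j > k).
    rng = np.random.default_rng(seed)
    A = np.zeros((p, p))
    for i in range(p):
        for j in range(max(0, i - k), i):
            A[i, j] = rng.uniform(-0.4, 0.4)
    D_inv = np.diag(rng.uniform(1.0, 2.0, p))  # inverse innovation variances
    I = np.eye(p)
    return (I - A).T @ D_inv @ (I - A)

def estimate_bandwidth(Omega, tol=1e-10):
    # The lower Cholesky factor of a banded SPD matrix inherits its bandwidth,
    # so the largest |i - j| with a non-negligible entry recovers k.
    L = np.linalg.cholesky(Omega)
    idx = np.argwhere(np.abs(L) > tol)
    return int(np.max(idx[:, 0] - idx[:, 1]))

Omega = make_banded_precision(p=20, k=3)
print(estimate_bandwidth(Omega))
```

Note that this only illustrates the deterministic link between the bandwidths of \(\Omega\) and its Cholesky factor; the paper's actual contribution is a prior on \((A, D)\) under which the posterior selects the true bandwidth consistently.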
Recommendations
- Test for bandedness of high-dimensional precision matrices
- Forward adaptive banding for estimating large covariance matrices
- Test for bandedness of high-dimensional covariance matrices and bandwidth estimation
- Hypothesis testing for band size detection of high-dimensional banded precision matrices
- Estimating large precision matrices via modified Cholesky decomposition
Cites work
- A note on the consistency of Bayes factors for testing point null versus nonparametric alternatives
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Bayes factor consistency for nested linear models with a growing number of parameters
- Bayesian linear regression with sparse priors
- Bayesian model selection in high-dimensional settings
- Bayesian structure learning in graphical models
- Cholesky decomposition of a hyper inverse Wishart matrix
- Consistency of Bayes factor for nonnested model selection when the model dimension grows
- Consistency of Bayesian linear model selection with a growing number of parameters
- Consistency of objective Bayes factors as the model dimension grows
- Covariance matrix selection and estimation via penalised normal likelihood
- Empirical Bayes posterior concentration in sparse high-dimensional linear models
- High dimensional covariance matrix estimation using a factor model
- High dimensional posterior convergence rates for decomposable graphical models
- Hypothesis testing for band size detection of high-dimensional banded precision matrices
- Minimax estimation of large covariance matrices under \(\ell_1\)-norm
- Minimax posterior convergence rates and model selection consistency in high-dimensional DAG models based on sparse Cholesky factors
- Mixtures of g Priors for Bayesian Variable Selection
- On consistency and sparsity for principal components analysis in high dimensions
- On the computational complexity of high-dimensional Bayesian variable selection
- On the use of Non-Local Prior Densities in Bayesian Hypothesis Tests
- Optimal Bayesian minimax rates for unconstrained large covariance matrices
- Optimal estimation and rank detection for sparse spiked covariance matrices
- Optimal rates of convergence for covariance matrix estimation
- Posterior contraction in sparse Bayesian factor models for massive covariance matrices
- Posterior convergence rates for estimating large precision matrices using graphical models
- Posterior graph selection and estimation consistency for high-dimensional Bayesian DAG models
- Rate-optimal posterior contraction for sparse PCA
- Regularized estimation of large covariance matrices
- Scalable Bayesian variable selection using nonlocal prior densities in ultrahigh-dimensional settings
- Test for bandedness of high-dimensional precision matrices
- Tractable Bayesian variable selection: beyond normality
- Two-Sample Covariance Matrix Testing and Support Recovery in High-Dimensional and Sparse Settings
Cited in (7)
- Hypothesis testing for band size detection of high-dimensional banded precision matrices
- Test for bandedness of high-dimensional covariance matrices and bandwidth estimation
- Estimating large precision matrices via modified Cholesky decomposition
- Connection between the selection problem for a sparse submatrix of a large-size matrix and the Bayesian problem of hypotheses testing
- Test for bandedness of high-dimensional precision matrices
- Bandwidth selection for high-dimensional covariance matrix estimation
- Scalable Bayesian high-dimensional local dependence learning