High-dimensional Gaussian model selection on a Gaussian design
From MaRDI portal
Publication:985331
Abstract: We consider the problem of estimating the conditional mean of a real Gaussian variable when the vector of covariates follows a joint Gaussian distribution. This issue arises, for instance, when one aims at estimating the graph or the distribution of a Gaussian graphical model. We introduce a general model selection procedure based on the minimization of a penalized least-squares type criterion. It handles a variety of problems, such as ordered and complete variable selection, allows one to incorporate prior knowledge on the model, and applies when the number of covariates is larger than the number of observations. Moreover, it is shown to achieve a non-asymptotic oracle inequality independently of the correlation structure of the covariates. We also exhibit various minimax rates of estimation in the considered framework and hence derive adaptiveness properties of our procedure.
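To make the abstract's idea concrete, here is a minimal sketch of penalized least-squares model selection for the *ordered* variable selection case mentioned above: each candidate model uses the first \(d\) covariates, and the selected dimension minimizes the residual sum of squares plus a dimension-based penalty. The penalty constant `K` and the Mallows-\(C_p\)-style penalty shape (assuming unit noise variance) are illustrative assumptions, not the paper's exact penalty.

```python
import numpy as np

def penalized_ls_selection(X, y, K=2.0):
    """Ordered variable selection by penalized least squares (sketch).

    For each nested model using the first d columns of X, compute the
    residual sum of squares and add a penalty K*d (Cp-style, assuming
    noise variance 1); return the dimension minimizing the criterion.
    """
    n, p = X.shape
    best_d, best_crit = 0, float(np.sum(y ** 2))  # empty model: RSS only
    for d in range(1, p + 1):
        Xd = X[:, :d]
        beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        rss = float(np.sum((y - Xd @ beta) ** 2))
        crit = rss + K * d  # penalized least-squares criterion (assumed form)
        if crit < best_crit:
            best_d, best_crit = d, crit
    return best_d

# Toy example: Gaussian design, signal on the first 3 covariates.
rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.standard_normal((n, p))
y = X[:, :3] @ np.array([2.0, -1.5, 1.0]) + rng.standard_normal(n)
d_hat = penalized_ls_selection(X, y)
```

With a strong signal on the first three covariates, the criterion reliably retains at least those three; whether a few spurious covariates slip in depends on the penalty constant, which is exactly the trade-off the paper's penalty calibration addresses.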
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 4084766
- scientific article; zbMATH DE number 52492
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 6791298
- A New Lower Bound for Multiple Hypothesis Testing
- A new look at the statistical model identification
- Adaptive estimation of a quadratic functional by model selection.
- Aggregation for Gaussian regression
- An optimal selection of regression variables
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Decoding by Linear Programming
- Estimating high-dimensional directed acyclic graphs with the PC-algorithm
- Estimating the dimension of a model
- Estimation of Gaussian graphs by model selection
- Gaussian Markov Random Fields
- Gaussian model selection
- Gaussian model selection with an unknown variance
- High-dimensional graphs and variable selection with the Lasso
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Learning Theory and Kernel Machines
- Local operator theory, random matrices and Banach spaces.
- Minimal penalties for Gaussian model selection
- Minimum contrast estimators on sieves: Exponential bounds and rates of convergence
- Model selection by resampling penalization
- Near-ideal model selection by \(\ell _{1}\) minimization
- Power-law correlations, related models for long-range dependence and their simulation
- Probabilistic Networks and Expert Systems
- Simultaneous analysis of Lasso and Dantzig selector
- Some Comments on \(C_P\)
- Sparsity oracle inequalities for the Lasso
- Statistical predictor identification
- Tests for Gaussian graphical models
- The Adaptive Lasso and Its Oracle Properties
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
Cited in (12 documents)
- Adaptive estimation of linear functionals in functional linear models
- MAP model selection in Gaussian regression
- Penalized contrast estimation in functional linear models with circular data
- Gaussian model selection
- High-dimensional regression with unknown variance
- Adaptive estimation of covariance matrices via Cholesky decomposition
- Goodness-of-fit tests for high-dimensional Gaussian linear models
- scientific article; zbMATH DE number 6443076
- Estimation of Gaussian graphs by model selection
- Adaptive functional linear regression
- Model selection for Gaussian regression with random design
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
This page was built for publication: High-dimensional Gaussian model selection on a Gaussian design