High-dimensional Gaussian model selection on a Gaussian design
DOI: 10.1214/09-AIHP321 · zbMATH Open: 1191.62076 · arXiv: 0808.2152 · OpenAlex: W2963498088 · MaRDI QID: Q985331
Publication date: 21 July 2010
Published in: Annales de l'Institut Henri Poincaré. Probabilités et Statistiques
Full work available at URL: https://arxiv.org/abs/0808.2152
Keywords: model selection; Gaussian graphical models; linear regression; oracle inequalities; minimax rates of estimation
MSC classifications: Nonparametric regression and quantile regression (62G08) · Linear regression; mixed models (62J05) · Inequalities; stochastic orderings (60E15)
Cites Work
- Gaussian Markov Random Fields
- Estimating the dimension of a model
- The Adaptive Lasso and Its Oracle Properties
- Some Comments on \(C_p\)
- A new look at the statistical model identification
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Learning Theory and Kernel Machines
- Tests for Gaussian graphical models
- Gaussian model selection with an unknown variance
- Sparsity oracle inequalities for the Lasso
- Gaussian model selection
- Local operator theory, random matrices and Banach spaces.
- Decoding by Linear Programming
- Adaptive estimation of a quadratic functional by model selection.
- Estimation of Gaussian graphs by model selection
- Model selection by resampling penalization
- Aggregation for Gaussian regression
- Minimum contrast estimators on sieves: Exponential bounds and rates of convergence
- Probabilistic Networks and Expert Systems
- Minimal penalties for Gaussian model selection
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- An optimal selection of regression variables
- Statistical predictor identification
- Near-ideal model selection by \(\ell _{1}\) minimization
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Power-law correlations, related models for long-range dependence and their simulation
- A New Lower Bound for Multiple Hypothesis Testing
Cited In (9)
- Penalized contrast estimation in functional linear models with circular data
- Title not available
- Adaptive estimation of covariance matrices via Cholesky decomposition
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Adaptive estimation of linear functionals in functional linear models
- Model selection for Gaussian regression with random design
- Adaptive functional linear regression
- Gaussian model selection
- High-dimensional regression with unknown variance