High-dimensional additive modeling
DOI: 10.1214/09-AOS692
zbMath: 1360.62186
arXiv: 0806.4115
MaRDI QID: Q1043712
Peter Bühlmann, Sara van de Geer, Lukas Meier
Publication date: 9 December 2009
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0806.4115
Keywords: model selection; nonparametric regression; sparsity; group Lasso; oracle inequality; penalized likelihood
62G08: Nonparametric regression and quantile regression
62J07: Ridge regression; shrinkage estimators (Lasso)
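The record's keywords name the group Lasso as the tool behind sparse additive modeling. The paper's actual sparsity-smoothness penalty is not reproduced here, but the underlying mechanism can be sketched: expand each covariate in a basis, then penalize the ℓ2 norm of each coefficient group, so entire additive components are zeroed out together. A minimal NumPy sketch (not the authors' code; the polynomial basis, simulated data, and tuning parameter `lam` are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, df = 200, 10, 4  # samples, covariates, basis functions per covariate

X = rng.uniform(-1.0, 1.0, size=(n, p))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)  # only x_1 is active

# Basis expansion: covariate j contributes a group of `df` polynomial columns,
# so selecting a group amounts to selecting that additive component.
B = np.hstack([np.column_stack([X[:, j] ** k for k in range(1, df + 1)])
               for j in range(p)])
B = (B - B.mean(axis=0)) / B.std(axis=0)
groups = [slice(j * df, (j + 1) * df) for j in range(p)]

# Proximal gradient descent on 0.5*||y - B@beta||^2 + lam * sum_g ||beta_g||_2.
lam = 8.0
step = 1.0 / np.linalg.norm(B, 2) ** 2          # 1 / Lipschitz constant of grad
beta = np.zeros(B.shape[1])
for _ in range(5000):
    z = beta - step * (B.T @ (B @ beta - y))    # gradient step
    for g in groups:                            # group soft-thresholding
        nrm = np.linalg.norm(z[g])
        z[g] *= max(0.0, 1.0 - step * lam / nrm) if nrm > 0 else 0.0
    beta = z

active = [j for j, g in enumerate(groups) if np.linalg.norm(beta[g]) > 1e-8]
print("selected components:", active)
```

Because the ℓ2 penalty on each block is non-differentiable only at the zero vector, inactive groups are set exactly to zero, which is what makes group-wise variable selection possible.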
Related Items
hgam, Semi-varying coefficient models with a diverging number of components, Sparsity in multiple kernel learning, Generalization of constraints for high dimensional regression problems, Oracle inequalities and optimal inference under group sparsity, Variable selection in nonparametric additive models
Cites Work
- Variable selection in high-dimensional linear models: partially faithful distributions and the PC-simple algorithm
- The Adaptive Lasso and Its Oracle Properties
- Boosting algorithms: regularization, prediction and model fitting
- Component selection and smoothing in multivariate nonparametric regression
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Relaxed Lasso
- A Bennett concentration inequality and its application to suprema of empirical processes
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Nonparametric and semiparametric models.
- Weak convergence and empirical processes. With applications to statistics
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Model selection for regression on a random design
- The Group Lasso for Logistic Regression
- Boosting with the L2 loss
- Aggregation and Sparsity Via ℓ1 Penalized Least Squares
- Model Selection and Estimation in Regression with Grouped Variables