High-dimensional additive modeling
DOI: 10.1214/09-AOS692 · zbMath: 1360.62186 · arXiv: 0806.4115 · OpenAlex: W2093994886 · MaRDI QID: Q1043712
Lukas Meier, Sara van de Geer, Peter Bühlmann
Publication date: 9 December 2009
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0806.4115
Mathematics Subject Classification
Nonparametric regression and quantile regression (62G08) ⋮ Ridge regression; shrinkage estimators (Lasso) (62J07)
Related Items
An introduction to recent advances in high/infinite dimensional statistics ⋮ COBRA: a combined regression strategy ⋮ Statistical inference in sparse high-dimensional additive models ⋮ Interquantile shrinkage and variable selection in quantile regression ⋮ P-splines with an \(\ell_1\) penalty for repeated measures ⋮ Penalized likelihood and Bayesian function selection in regression models ⋮ Fast Bayesian model assessment for nonparametric additive regression ⋮ Monotone splines Lasso ⋮ RCV-based error density estimation in the ultrahigh dimensional additive model ⋮ Extreme eigenvalues of nonlinear correlation matrices with applications to additive models ⋮ SCAD-penalized regression in additive partially linear proportional hazards models with an ultra-high-dimensional linear part ⋮ Model structure selection in single-index-coefficient regression models ⋮ Nonparametric independence screening via favored smoothing bandwidth ⋮ Identification of Partially Linear Structure in Additive Models with an Application to Gene Expression Prediction from Sequences ⋮ Variable selection in partial linear regression with functional covariate ⋮ Nonparametric independence screening for ultra-high-dimensional longitudinal data under additive models ⋮ Fast learning rate of non-sparse multiple kernel learning and optimal regularization strategies ⋮ Regularized estimation for the least absolute relative error models with a diverging number of covariates ⋮ Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure ⋮ Fast learning rate of multiple kernel learning: trade-off between sparsity and smoothness ⋮ Polynomial spline estimation for generalized varying coefficient partially linear models with a diverging number of components ⋮ A unified penalized method for sparse additive quantile models: an RKHS approach ⋮ Simultaneous confidence bands for sequential autoregressive fitting ⋮ Nonparametric variable selection and its application to additive models ⋮ Correlated variables in regression: clustering and sparse estimation ⋮ Estimation and inference in generalized additive coefficient models for nonlinear interactions with high-dimensional covariates ⋮ Functional additive regression ⋮ A uniform framework for the combination of penalties in generalized structured models ⋮ PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting ⋮ Sparsity in multiple kernel learning ⋮ Nonparametric inference for additive models estimated via simplified smooth backfitting ⋮ Non-asymptotic oracle inequalities for the Lasso and Group Lasso in high dimensional logistic model ⋮ On degeneracy and invariances of random fields paths with applications in Gaussian process modelling ⋮ GRID: a variable selection and structure discovery method for high dimensional nonparametric regression ⋮ Nonparametric distributed learning under general designs ⋮ On density and regression estimation with incomplete data ⋮ Consistency of support vector machines using additive kernels for additive models ⋮ Metamodel construction for sensitivity analysis ⋮ Semiparametric regression models with additive nonparametric components and high dimensional parametric components ⋮ Fixed and random effects selection in nonparametric additive mixed models ⋮ A dimension reduction based approach for estimation and variable selection in partially linear single-index models with high-dimensional covariates ⋮ PAC-Bayesian estimation and prediction in sparse additive models ⋮ Dimension reduction 
and variable selection in case control studies via regularized likelihood optimization ⋮ Transductive versions of the Lasso and the Dantzig selector ⋮ Generalization of constraints for high dimensional regression problems ⋮ Oracle inequalities and optimal inference under group sparsity ⋮ Statistical inference in compound functional models ⋮ Variable selection in infinite-dimensional problems ⋮ On the uniform convergence of empirical norms and inner products, with application to causal inference ⋮ Regression with stagewise minimization on risk function ⋮ Improving the prediction performance of the Lasso by subtracting the additive structural noises ⋮ CAM: causal additive models, high-dimensional order search and penalized regression ⋮ High-dimensional Bayesian inference in nonparametric additive models ⋮ RANK: Large-Scale Inference With Graphical Nonlinear Knockoffs ⋮ Composite quantile regression for ultra-high dimensional semiparametric model averaging ⋮ A continuous analogue of the tensor-train decomposition ⋮ A sequential approach to feature selection in high-dimensional additive models ⋮ Additive model selection ⋮ Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space ⋮ On the \(L_p\) norms of kernel regression estimators for incomplete data with applications to classification ⋮ Empirical Bayes oracle uncertainty quantification for regression ⋮ Learning general sparse additive models from point queries in high dimensions ⋮ Semi-varying coefficient models with a diverging number of components ⋮ Robust feature screening for elliptical copula regression model ⋮ Rank reduction for high-dimensional generalized additive models ⋮ Nonparametric Statistics and High/Infinite Dimensional Data ⋮ Variable selection in nonparametric additive models ⋮ Regularizing Double Machine Learning in Partially Linear Endogenous Models ⋮ Kernel Knockoffs Selection for Nonparametric Additive Models ⋮ Additive models with trend filtering ⋮ Asymptotic properties of concave \(L_1\)-norm group penalties ⋮ Bayesian nonlinear model selection for gene regulatory networks ⋮ Reluctant generalized additive modeling ⋮ Flexible and Interpretable Models for Survival Data ⋮ On histogram-based regression and classification with incomplete data ⋮ Learning non-parametric basis independent models from point queries via low-rank methods ⋮ Feature screening for ultrahigh-dimensional additive logistic models ⋮ A semiparametric model for matrix regression ⋮ Information based complexity for high dimensional sparse functions ⋮ hgam ⋮ Penalized kernel quantile regression for varying coefficient models ⋮ Variable selection in functional regression models: a review ⋮ Gradient-based optimization for regression in the functional tensor-train format ⋮ Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data ⋮ Sparse nonparametric model for regression with functional covariate ⋮ Sure independence screening in ultrahigh dimensional generalized additive models ⋮ Interpretable machine learning: fundamental principles and 10 grand challenges ⋮ Multidimensional linear functional estimation in sparse Gaussian models and robust estimation of the mean ⋮ Approximate large-scale Bayesian spatial modeling with application to quantitative magnetic resonance imaging ⋮ Nonparametric variable screening for multivariate additive models ⋮ Rates of contraction with respect to \(L_2\)-distance for Bayesian nonparametric regression ⋮ Doubly penalized estimation in additive 
regression with high-dimensional data ⋮ Partially Linear Structure Selection in Cox Models with Varying Coefficients ⋮ Component Selection in the Additive Regression Model ⋮ Two stage smoothing in additive models with missing covariates ⋮ Structured estimation for the nonparametric Cox model ⋮ Minimax-optimal nonparametric regression in high dimensions ⋮ Nonparametric and high-dimensional functional graphical models ⋮ High dimensional single index models ⋮ Sparse high-dimensional varying coefficient model: nonasymptotic minimax study ⋮ Irrational Exuberance: Correcting Bias in Probability Estimates ⋮ Additive regression splines with total variation and non negative garrote penalties ⋮ Robust sparse functional regression model ⋮ Improvement on LASSO-type estimator in nonparametric regression ⋮ Improved Estimation of High-dimensional Additive Models Using Subspace Learning ⋮ Lag selection in stochastic additive models ⋮ Spectrally Sparse Nonparametric Regression via Elastic Net Regularized Smoothers ⋮ Hierarchical Total Variations and Doubly Penalized ANOVA Modeling for Multivariate Nonparametric Regression ⋮ M-estimation and model identification based on double SCAD penalization ⋮ Learning theory of multiple kernel learning ⋮ Martingale Difference Correlation and Its Use in High-Dimensional Variable Screening ⋮ Grouped variable selection with discrete optimization: computational and statistical perspectives ⋮ Binacox: automatic cut‐point detection in high‐dimensional Cox model with applications in genetics ⋮ Selection of Effects in Cox Frailty Models by Regularization Methods ⋮ Feature selection in ultrahigh-dimensional additive models with heterogeneous frequency component functions ⋮ A reluctant additive model framework for interpretable nonlinear individualized treatment rules ⋮ Multifold Cross-Validation Model Averaging for Generalized Additive Partial Linear Models ⋮ Cross-Fitted Residual Regression for High-Dimensional Heteroscedasticity Pursuit ⋮ Generalized martingale difference divergence: detecting conditional mean independence with applications in variable screening ⋮ Measures of Uncertainty for Shrinkage Model Selection ⋮ Coordinatewise Gaussianization: Theories and Applications ⋮ Sparse additive models in high dimensions with wavelets ⋮ High-dimensional local linear regression under sparsity and convex losses ⋮ Tuning parameters in random forests ⋮ Impact of subsampling and tree depth on random forests ⋮ Refined Generalization Bounds of Gradient Learning over Reproducing Kernel Hilbert Spaces ⋮ Semiparametric model average prediction in panel data analysis ⋮ Functional Horseshoe Priors for Subspace Shrinkage ⋮ Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model ⋮ Spike-and-Slab Priors for Function Selection in Structured Additive Regression Models ⋮ Estimation by polynomial splines with variable selection in additive Cox models ⋮ Asymptotics for penalised splines in generalised additive models ⋮ Error Variance Estimation in Ultrahigh-Dimensional Additive Models ⋮ Bayesian quantile regression for partially linear additive models ⋮ A selective review of group selection in high-dimensional models ⋮ A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers ⋮ Rejoinder ⋮ Feature screening in ultrahigh-dimensional additive Cox model ⋮ Semiparametric Ultra-High Dimensional Model Averaging of Nonlinear Dynamic Time Series ⋮ Sparse Additive Ordinary Differential Equations for Dynamic
Gene Regulatory Network Modeling ⋮ Partially Linear Functional Additive Models for Multivariate Functional Data ⋮ Testing for additivity in non‐parametric regression ⋮ Univariate measurement error selection likelihood for variable selection of additive model
Cites Work
- Variable selection in high-dimensional linear models: partially faithful distributions and the PC-simple algorithm
- The Adaptive Lasso and Its Oracle Properties
- Boosting algorithms: regularization, prediction and model fitting
- Component selection and smoothing in multivariate nonparametric regression
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Relaxed Lasso
- A Bennett concentration inequality and its application to suprema of empirical processes
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Nonparametric and semiparametric models.
- Weak convergence and empirical processes. With applications to statistics
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Model selection for regression on a random design
- The Group Lasso for Logistic Regression
- Boosting With the \(L_2\) Loss
- Aggregation and Sparsity Via ℓ1 Penalized Least Squares
- Model Selection and Estimation in Regression with Grouped Variables