Sparse Additive Models

From MaRDI portal


DOI: 10.1111/j.1467-9868.2009.00718.x
zbMath: 1411.62107
arXiv: 0711.4555
MaRDI QID: Q4632616

Pradeep Ravikumar, Han Liu, John D. Lafferty, Larry Alan Wasserman

Publication date: 30 April 2019

Published in: Journal of the Royal Statistical Society Series B: Statistical Methodology

Full work available at URL: https://arxiv.org/abs/0711.4555
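The paper introduces SpAM, which estimates a sparse additive model by combining componentwise nonparametric smoothing with a soft-thresholding step inside a backfitting loop, zeroing out entire component functions. The following is a minimal illustrative sketch, not the authors' code; the Gaussian-kernel smoother, bandwidth, regularization level, and iteration count are assumptions made for the example:

```python
import numpy as np

def spam_backfitting(X, y, lam, bandwidth=0.3, n_iter=20):
    """Sparse backfitting sketch for a sparse additive model (SpAM).

    Each component f_j is fit by a kernel smoother applied to its
    partial residual, then shrunk toward zero; components whose
    fitted empirical norm falls below lam are zeroed out entirely.
    Returns an (n, p) array of fitted component values at the sample points.
    """
    n, p = X.shape
    f = np.zeros((n, p))
    # Precompute row-normalized Gaussian-kernel smoother matrices S_j.
    S = []
    for j in range(p):
        d = X[:, j][:, None] - X[:, j][None, :]
        K = np.exp(-0.5 * (d / bandwidth) ** 2)
        S.append(K / K.sum(axis=1, keepdims=True))
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove the mean and all other components.
            r = y - y.mean() - f.sum(axis=1) + f[:, j]
            pj = S[j] @ r                      # smoothed partial residual
            norm = np.sqrt(np.mean(pj ** 2))   # empirical L2 norm of the fit
            shrink = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
            f[:, j] = shrink * pj
            f[:, j] -= f[:, j].mean()          # identifiability: center f_j
    return f
```

On synthetic data with a single relevant covariate (e.g. y depending only on X[:, 0]), the irrelevant components are typically shrunk exactly to zero while the relevant one survives, which is the variable-selection behavior the method is designed for.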


62G08: Nonparametric regression and quantile regression

62H30: Classification and discrimination; cluster analysis (statistical aspects)

62G05: Nonparametric estimation


Related Items

Nonparametric independence screening for ultra-high-dimensional longitudinal data under additive models
Bayesian Neural Networks for Selection of Drug Sensitive Genes
Metamodel construction for sensitivity analysis
Error Variance Estimation in Ultrahigh-Dimensional Additive Models
Feature screening in ultrahigh-dimensional additive Cox model
Bayesian Regression Trees for High-Dimensional Prediction and Variable Selection
Sparse additive machine with ramp loss
Univariate measurement error selection likelihood for variable selection of additive model
Additive regression splines with total variation and non negative garrote penalties
Improved Estimation of High-dimensional Additive Models Using Subspace Learning
Hierarchical Total Variations and Doubly Penalized ANOVA Modeling for Multivariate Nonparametric Regression
Bayesian Model Selection in Additive Partial Linear Models Via Locally Adaptive Splines
Sparse group lasso for multiclass functional logistic regression models
Sure independence screening for analyzing supersaturated designs
Asymptotic Theory of \(\ell_1\)-Regularized PDE Identification from a Single Noisy Trajectory
Ultrahigh dimensional feature screening for additive model with multivariate response
Functional Horseshoe Priors for Subspace Shrinkage
Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
Global sensitivity analysis with dependence measures
Bayesian Additive Machine: classification with a semiparametric discriminant function
High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
Partially Linear Functional Additive Models for Multivariate Functional Data
Refined Generalization Bounds of Gradient Learning over Reproducing Kernel Hilbert Spaces
Spike-and-Slab Group Lassos for Grouped Regression and Sparse Generalized Additive Models
Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
Bayesian quantile regression for partially linear additive models
Generalized varying index coefficient models
AdaBoost Semiparametric Model Averaging Prediction for Multiple Categories
A Model-free Variable Screening Method Based on Leverage Score
Frequentist Model Averaging for the Nonparametric Additive Model
Grouped variable selection with discrete optimization: computational and statistical perspectives
Estimation of nonparanormal graphical models based on ranked set sampling (RSS)
Choosing shape parameters for regression in reproducing kernel Hilbert space and variable selection
Nonparametric Functional Graphical Modeling Through Functional Additive Regression Operator
Functional additive models for optimizing individualized treatment rules
Variable Selection Via Thompson Sampling
A sparse additive model for high-dimensional interactions with an exposure variable
Distribution-Free Predictive Inference For Regression
Kernel Knockoffs Selection for Nonparametric Additive Models
A simple measure of conditional dependence
A Projection Based Conditional Dependence Measure with Applications to High-dimensional Undirected Graphical Models
Flexible and Interpretable Models for Survival Data
Multi-Resolution Functional ANOVA for Large-Scale, Many-Input Computer Experiments
Profiled adaptive elastic-net procedure for partially linear models with high-dimensional covariates
Semiparametric regression models with additive nonparametric components and high dimensional parametric components
Stochastic compositional gradient descent: algorithms for minimizing compositions of expected-value functions
Additive model selection
Autoregressive process modeling via the Lasso procedure
Continuously dynamic additive models for functional data
Learning non-parametric basis independent models from point queries via low-rank methods
Statistical inference in sparse high-dimensional additive models
Variable selection in multivariate linear models for functional data via sparse regularization
A selective overview of feature screening for ultrahigh-dimensional data
Bias-corrected inference for multivariate nonparametric regression: model selection and oracle property
Interquantile shrinkage and variable selection in quantile regression
P-splines with an \(\ell_1\) penalty for repeated measures
Penalized likelihood and Bayesian function selection in regression models
Fast Bayesian model assessment for nonparametric additive regression
Monotone splines Lasso
Logitboost autoregressive networks
Structured variable selection via prior-induced hierarchical penalty functions
Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure
Error analysis for coefficient-based regularized regression in additive models
Regression with stagewise minimization on risk function
Improving the prediction performance of the Lasso by subtracting the additive structural noises
Variable selection of high-dimensional non-parametric nonlinear systems by derivative averaging to avoid the curse of dimensionality
Testing if a nonlinear system is additive or not
Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
Feature screening for nonparametric and semiparametric models with ultrahigh-dimensional covariates
Empirical Bayes oracle uncertainty quantification for regression
Learning general sparse additive models from point queries in high dimensions
Penalized kernel quantile regression for varying coefficient models
Interpretable machine learning: fundamental principles and 10 grand challenges
Nonparametric variable screening for multivariate additive models
Generalization bounds for sparse random feature expansions
Nonparametric and high-dimensional functional graphical models
RCV-based error density estimation in the ultrahigh dimensional additive model
GRID: a variable selection and structure discovery method for high dimensional nonparametric regression
Consistent group selection with Bayesian high dimensional modeling
Estimation of undirected graph with finite mixture of nonparanormal distribution
A sequential approach to feature selection in high-dimensional additive models
Variable selection for fixed effects varying coefficient models
Rank reduction for high-dimensional generalized additive models
Additive models with trend filtering
Sure independence screening in ultrahigh dimensional generalized additive models
Rates of contraction with respect to \(L_2\)-distance for Bayesian nonparametric regression
Doubly penalized estimation in additive regression with high-dimensional data
Two stage smoothing in additive models with missing covariates
High dimensional single index models
A unified penalized method for sparse additive quantile models: an RKHS approach
PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting
Copulas in Machine Learning
RANK: Large-Scale Inference With Graphical Nonlinear Knockoffs
ABC–CDE: Toward Approximate Bayesian Computation With Complex High-Dimensional Data and Limited Simulations
Semiparametric model averaging method for survival probability predictions of patients
A reluctant additive model framework for interpretable nonlinear individualized treatment rules
Latent Network Structure Learning From High-Dimensional Multivariate Point Processes
Bayesian pathway selection
Measures of Uncertainty for Shrinkage Model Selection
Sparse additive models in high dimensions with wavelets
High-dimensional local linear regression under sparsity and convex losses