Cited in (81)
- Tuning parameters in random forests
- Semiparametric model average prediction in panel data analysis
- A unified penalized method for sparse additive quantile models: an RKHS approach
- Nonparametric Statistics and High/Infinite Dimensional Data
- Lag selection in stochastic additive models
- Rejoinder
- Scientific article without an available title; zbMATH DE number 6670732
- Buckley-James boosting for survival analysis with high-dimensional biomarker data
- Metamodel construction for sensitivity analysis
- Partially linear structure selection in Cox models with varying coefficients
- Component selection in the additive regression model
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Identification of partially linear structure in additive models with an application to gene expression prediction from sequences
- Standardization and the group lasso penalty
- Regularized estimation for the least absolute relative error models with a diverging number of covariates
- Variable selection for high-dimensional generalized varying-coefficient models
- Semi-varying coefficient models with a diverging number of components
- Estimation and inference in generalized additive coefficient models for nonlinear interactions with high-dimensional covariates
- Learning rates for the risk of kernel-based quantile regression estimators in additive models
- Bayesian nonlinear model selection for gene regulatory networks
- Semiparametric regression models with additive nonparametric components and high dimensional parametric components
- Dimension reduction and variable selection in case control studies via regularized likelihood optimization
- Generalization of constraints for high dimensional regression problems
- Nonparametric independence screening via favored smoothing bandwidth
- Variable selection in a partially linear proportional hazards model with a diverging dimensionality
- Bayesian quantile regression for partially linear additive models
- PAC-Bayesian estimation and prediction in sparse additive models
- Oracle inequalities and optimal inference under group sparsity
- Functional additive regression
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Weakly decomposable regularization penalties and structured sparsity
- Automatic model selection for partially linear models
- Correlated variables in regression: clustering and sparse estimation
- A dimension reduction based approach for estimation and variable selection in partially linear single-index models with high-dimensional covariates
- Variable selection in nonparametric additive models
- Non-asymptotic oracle inequalities for the Lasso and group Lasso in high dimensional logistic model
- Learning non-parametric basis independent models from point queries via low-rank methods
- Minimax-optimal nonparametric regression in high dimensions
- High dimensional single index models
- Penalized likelihood and Bayesian function selection in regression models
- Asymptotics for penalised splines in generalised additive models
- Statistical inference in compound functional models
- Fast learning rate of multiple kernel learning: trade-off between sparsity and smoothness
- On the uniform convergence of empirical norms and inner products, with application to causal inference
- Polynomial spline estimation for generalized varying coefficient partially linear models with a diverging number of components
- An introduction to recent advances in high/infinite dimensional statistics
- Simultaneous confidence bands for sequential autoregressive fitting
- A selective review of group selection in high-dimensional models
- Spike-and-slab priors for function selection in structured additive regression models
- AdaptFitOS
- Consistency of sparse PCA in high dimension, low sample size contexts
- pacbpred
- COBRA
- npfda
- SpicyMKL
- spinyReg
- Estimation of a sparse group of sparse vectors
- LassoBacktracking
- R2BayesX
- hypergsplines
- Consistency of support vector machines using additive kernels for additive models
- Fixed and random effects selection in nonparametric additive mixed models
- Estimation by polynomial splines with variable selection in additive Cox models
- Accurate and robust tests for indirect inference
- Testing for additivity in non-parametric regression
- Minimax-optimal rates for sparse additive models over kernel classes via convex programming
- Additive model selection
- Sparsity in multiple kernel learning
- Sparse high-dimensional varying coefficient model: nonasymptotic minimax study
- On the \(L_p\) norms of kernel regression estimators for incomplete data with applications to classification
- Nonparametric independence screening in sparse ultra-high-dimensional additive models
- Model structure selection in single-index-coefficient regression models
- Fast learning rate of non-sparse multiple kernel learning and optimal regularization strategies
- CAM: causal additive models, high-dimensional order search and penalized regression
- High-dimensional Bayesian inference in nonparametric additive models
- A selective overview of feature screening for ultrahigh-dimensional data
- Transductive versions of the Lasso and the Dantzig selector
- SCAD-penalized regression in additive partially linear proportional hazards models with an ultra-high-dimensional linear part
- COBRA: a combined regression strategy
- On degeneracy and invariances of random fields paths with applications in Gaussian process modelling
- Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure
This page was built for software: hgam