GRID: a variable selection and structure discovery method for high dimensional nonparametric regression
From MaRDI portal
Publication: 2196249
Recommendations
- GRID for variable selection in high dimensional regression
- Selection of variables and dimension reduction in high-dimensional non-parametric regression
- Component Identification and Estimation in Nonlinear High-Dimensional Regression Models by Structural Adaptation
- Flexible variable selection for recovering sparsity in nonadditive nonparametric models
- Variable selection using adaptive nonlinear interaction structures in high dimensions
Cites work
- scientific article; zbMATH DE number 3862231
- scientific article; zbMATH DE number 845714
- A review on empirical likelihood methods for regression
- Asymptotic properties of backfitting estimators
- Bias-corrected inference for multivariate nonparametric regression: model selection and oracle property
- Component selection and smoothing in multivariate nonparametric regression
- DASSO: Connections Between the Dantzig Selector and Lasso
- Design-adaptive Nonparametric Regression
- Empirical likelihood
- Empirical likelihood and general estimating equations
- Empirical likelihood confidence intervals for local linear smoothers
- Empirical likelihood ratio confidence intervals for a single functional
- High-dimensional additive modeling
- High-dimensional graphs and variable selection with the Lasso
- Linear or nonlinear? Automatic structure discovery for partially linear models
- Local linear regression smoothers and their minimax efficiencies
- Local polynomial fitting based on empirical likelihood
- Local polynomial regression: Optimal kernels and asymptotic minimax efficiency
- Marginal empirical likelihood and sure independence feature screening
- Minimax-optimal nonparametric regression in high dimensions
- Multivariate locally weighted least squares regression
- Nearly unbiased variable selection under minimax concave penalty
- Nonparametric Inferences for Additive Models
- Optimal global rates of convergence for nonparametric regression
- Polynomial splines and their tensor products in extended linear modeling. (With discussions)
- Rodeo: Sparse, greedy nonparametric regression
- Selection of variables and dimension reduction in high-dimensional non-parametric regression
- Sparse additive models
- Sparse inverse covariance estimation with the graphical lasso
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- Surface estimation, variable selection, and the nonparametric oracle property
- The Adaptive Lasso and Its Oracle Properties
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable selection using adaptive nonlinear interaction structures in high dimensions
- Variable selection with the strong heredity constraint and its oracle property
Cited in (5)
- Linear and nonlinear signal detection and estimation in high-dimensional nonparametric regression under weak sparsity
- A nonparametric procedure for linear and nonlinear variable screening
- High-dimensional local linear regression under sparsity and convex losses
- High-dimensional local polynomial regression with variable selection and dimension reduction
- GRID for variable selection in high dimensional regression