High-dimensional local polynomial regression with variable selection and dimension reduction
From MaRDI portal
Publication:6089201
Recommendations
- Selection of variables and dimension reduction in high-dimensional non-parametric regression
- Local polynomials for variable selection
- Nonparametric variable selection and its application to additive models
- Simultaneous dimension reduction and variable selection in modeling high dimensional data
- Variable selection and estimation in high-dimensional partially linear models
Cites work
- Scientific article, zbMATH DE number 991833 (no title available)
- A nonlinear multi-dimensional variable selection method for high dimensional data: sparse MAVE
- Asymptotic distributions for two estimators of the single-index model
- An Adaptive Estimation of Dimension Reduction Space
- An oracle property of the Nadaraya-Watson kernel estimator for high-dimensional nonparametric regression
- Comment
- Coordinate-independent sparse sufficient dimension reduction and variable selection
- Dimension reduction for conditional mean in regression
- Dimension reduction for nonelliptically distributed predictors
- Double sparsity kernel learning with automatic variable selection and data extraction
- Estimation and variable selection for partial linear single-index distortion measurement errors models
- GRID: a variable selection and structure discovery method for high dimensional nonparametric regression
- High-dimensional quantile varying-coefficient models with dimension reduction
- Model-Free Variable Selection
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- Penalized minimum average variance estimation
- Principal single-index varying-coefficient models for dimension reduction in quantile regression
- Rodeo: Sparse, greedy nonparametric regression
- Simultaneous estimation for semi-parametric multi-index models
- Single-index composite quantile regression with heteroscedasticity and general error distributions
- Single-index partially functional linear regression model
- Sliced Inverse Regression for Dimension Reduction
- Sparse sufficient dimension reduction
- Variable selection through adaptive MAVE
- Weak convergence and empirical processes. With applications to statistics
Cited in: 2 publications