High-dimensional local polynomial regression with variable selection and dimension reduction
DOI: 10.1007/s11222-023-10308-1 · zbMATH Open: 1523.62010 · OpenAlex: W4387705107 · MaRDI QID: Q6089201
Publication date: 17 November 2023
Published in: Statistics and Computing
Full work available at URL: https://doi.org/10.1007/s11222-023-10308-1
Recommendations
- Selection of variables and dimension reduction in high-dimensional non-parametric regression
- Local polynomials for variable selection
- Nonparametric variable selection and its application to additive models
- Simultaneous dimension reduction and variable selection in modeling high dimensional data
- Variable selection and estimation in high-dimensional partially linear models
MSC classification
- Computational methods for problems pertaining to statistics (62-08)
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
- Estimation in multivariate analysis (62H12)
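For orientation, the record's title refers to local polynomial regression. As a hedged illustration only (not the paper's estimator, which additionally performs variable selection and dimension reduction), a minimal local linear smoother with a Gaussian kernel can be sketched in NumPy as a kernel-weighted least-squares fit around a target point:

```python
import numpy as np

def local_linear_fit(x, y, x0, h):
    """Estimate m(x0) by local linear regression with a Gaussian kernel.

    Minimizes sum_i K_h(x_i - x0) * (y_i - b0 - b1*(x_i - x0))^2;
    the fitted intercept b0 is the estimate of the regression
    function at x0.
    """
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # kernel weights K_h
    X = np.column_stack([np.ones_like(x), x - x0])  # local linear design
    W = np.diag(w)
    # Weighted normal equations: (X'WX) beta = X'Wy
    beta, *_ = np.linalg.lstsq(X.T @ W @ X, X.T @ W @ y, rcond=None)
    return beta[0]  # intercept = fitted value at x0

# Toy usage: recover a smooth trend from noisy observations.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-2.0, 2.0, 200))
y = np.sin(x) + 0.1 * rng.normal(size=200)
est = local_linear_fit(x, y, x0=0.5, h=0.3)
```

The bandwidth `h` and kernel choice here are arbitrary illustrative assumptions; in the high-dimensional setting the paper addresses, such smoothers are combined with sparsity-inducing selection of the relevant predictors and a low-dimensional projection of the design.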
Cites Work
- Weak convergence and empirical processes. With applications to statistics
- Sliced Inverse Regression for Dimension Reduction
- Comment
- An Adaptive Estimation of Dimension Reduction Space
- Dimension reduction for nonelliptically distributed predictors
- Model-Free Variable Selection
- Dimension reduction for conditional mean in regression
- ASYMPTOTIC DISTRIBUTIONS FOR TWO ESTIMATORS OF THE SINGLE-INDEX MODEL
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- Sparse sufficient dimension reduction
- Single-index composite quantile regression with heteroscedasticity and general error distributions
- Coordinate-independent sparse sufficient dimension reduction and variable selection
- Rodeo: Sparse, greedy nonparametric regression
- A nonlinear multi-dimensional variable selection method for high dimensional data: sparse MAVE
- Single-index partially functional linear regression model
- High-dimensional quantile varying-coefficient models with dimension reduction
- Principal single-index varying-coefficient models for dimension reduction in quantile regression
- Double sparsity kernel learning with automatic variable selection and data extraction
- Penalized minimum average variance estimation
- Simultaneous estimation for semi-parametric multi-index models
- Estimation and variable selection for partial linear single-index distortion measurement errors models
- GRID: a variable selection and structure discovery method for high dimensional nonparametric regression
- Variable selection through adaptive MAVE
- An oracle property of the Nadaraya–Watson kernel estimator for high‐dimensional nonparametric regression