Efficient kernel-based variable selection with sparsistency
From MaRDI portal
Publication: 5037806
DOI: 10.5705/ss.202019.0401
OpenAlex: W3126888773
MaRDI QID: Q5037806
Junhui Wang, Xin He, Shao-Gao Lv
Publication date: 4 March 2022
Published in: Statistica Sinica
Full work available at URL: https://arxiv.org/abs/1802.09246
Related Items (3)
- Robust partially linear trend filtering for regression estimation and structure discovery
- Structure learning via unstructured kernel-based M-estimation
- Discovering model structure for partially linear models
Cites Work
- Measuring and testing dependence by correlation of distances
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- On constrained and regularized high-dimensional regression
- Regularization in kernel learning
- Component selection and smoothing in multivariate nonparametric regression
- Controlling the false discovery rate via knockoffs
- Derivative reproducing properties for kernel methods in learning theory
- Variable selection in nonparametric additive models
- Feature elimination in kernel machines in moderately high dimensions
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Quantile-adaptive model-free variable screening for high-dimensional heterogeneous data
- Interaction pursuit in high-dimensional multi-response regression via distance correlation
- Randomized sketches for kernels: fast and optimal nonparametric regression
- Shannon sampling. II: Connections to learning theory
- Contour regression: a general approach to dimension reduction
- Learning theory estimates via integral operators and their approximations
- Nonparametric sparsity and regularization
- Forward Regression for Ultra-High Dimensional Variable Screening
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- Support Vector Machines
- Shrinkage Inverse Regression Estimation for Model-Free Variable Selection
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable Selection and Function Estimation in Additive Nonparametric Regression Using a Data-Based Prior
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Group Regularized Estimation Under Structural Hierarchy
- 10.1162/153244303321897690
- Likelihood-Based Selection and Sharp Parameter Estimation
- Model Selection for High-Dimensional Quadratic Regression via Regularization
- Gradient-Based Kernel Dimension Reduction for Regression
- Variable Selection in Nonparametric Classification Via Measurement Error Model Selection Likelihoods
- Interaction Screening for Ultrahigh-Dimensional Data
- Variable Selection With the Strong Heredity Constraint and Its Oracle Property
- Variable Selection Using Adaptive Nonlinear Interaction Structures in High Dimensions
- Automatic structure recovery for additive models
- High Dimensional Ordinary Least Squares Projection for Screening Variables
This page was built for publication: Efficient kernel-based variable selection with sparsistency