Structure learning via unstructured kernel-based M-estimation
Publication: 6184881
DOI: 10.1214/23-ejs2153
arXiv: 1901.00615
OpenAlex: W4387367870
MaRDI QID: Q6184881
No author found.
Publication date: 5 January 2024
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1901.00615
Cites Work
- Model-Free Feature Screening for Ultrahigh-Dimensional Data
- Sure independence screening in generalized linear models with NP-dimensionality
- On constrained and regularized high-dimensional regression
- Learning sparse gradients for variable selection and dimension reduction
- Learning gradients on manifolds
- Regularization in kernel learning
- Controlling the false discovery rate via knockoffs
- Derivative reproducing properties for kernel methods in learning theory
- Variable selection in nonparametric additive models
- Feature elimination in kernel machines in moderately high dimensions
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Optimal regression rates for SVMs using Gaussian kernels
- Quantile-adaptive model-free variable screening for high-dimensional heterogeneous data
- A Bernstein-type inequality for some mixing processes and dynamical systems with an application to learning
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Interaction pursuit in high-dimensional multi-response regression via distance correlation
- Learning sparse conditional distribution: an efficient kernel-based approach
- Discovering model structure for partially linear models
- Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data
- Nonparametric sparsity and regularization
- Linear or Nonlinear? Automatic Structure Discovery for Partially Linear Models
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Can Tests for Jumps be Viewed as Tests for Clusters?
- Support Vector Machines
- Consistency of Support Vector Machines and Other Regularized Kernel Classifiers
- Sums of Powers in Large Finite Fields
- Kernel Distribution Embeddings: Universal Kernels, Characteristic Kernels and Kernel Metrics on Distributions
- Model-Free Feature Screening for Ultrahigh Dimensional Data through a Modified Blum-Kiefer-Rosenblatt Correlation
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Feature Screening via Distance Correlation Learning
- Likelihood-Based Selection and Sharp Parameter Estimation
- Model Selection for High-Dimensional Quadratic Regression via Regularization
- Interaction Screening for Ultrahigh-Dimensional Data
- Variable selection for classification with derivative-induced regularization
- Efficient kernel-based variable selection with sparsistency
- Nonparametric Interaction Selection
- Robust Variable and Interaction Selection for Logistic Regression and General Index Models
- A Generic Sure Independence Screening Procedure
- Variable Selection Using Adaptive Nonlinear Interaction Structures in High Dimensions
- Model-Free Feature Screening for Ultrahigh Dimensional Discriminant Analysis
- High Dimensional Ordinary Least Squares Projection for Screening Variables
- The Kolmogorov filter for variable screening in high-dimensional binary classification
- Variable Selection for Support Vector Machines in Moderately High Dimensions
- Individualized Multidirectional Variable Selection