Feature elimination in kernel machines in moderately high dimensions
Abstract: We develop an approach for feature elimination in statistical learning with kernel machines, based on recursive elimination of features. We present theoretical properties of this method and show that it is uniformly consistent in finding the correct feature space under certain generalized assumptions. We present four case studies to show that the assumptions are met in most practical situations, and present simulation results to demonstrate the performance of the proposed approach.
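The abstract describes recursive elimination of features in a kernel machine. A minimal sketch of that idea (not the authors' exact algorithm; the kernel SVM, cross-validation criterion, and target dimension below are illustrative assumptions) is to repeatedly drop the feature whose removal degrades predictive performance the least:

```python
# Sketch of recursive feature elimination with a kernel machine.
# Assumptions (not from the paper): RBF-kernel SVM as the learner,
# 3-fold cross-validated accuracy as the elimination criterion,
# and a known target number of features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=8, n_informative=3,
                           n_redundant=0, random_state=0)

def cv_score(X, y, cols):
    # Cross-validated accuracy of a kernel SVM restricted to columns `cols`.
    return cross_val_score(SVC(kernel="rbf", gamma="scale"),
                           X[:, cols], y, cv=3).mean()

active = list(range(X.shape[1]))
while len(active) > 3:  # target feature-space dimension, assumed known here
    # Score each candidate subset obtained by removing one active feature.
    scores = {j: cv_score(X, y, [k for k in active if k != j]) for j in active}
    # Eliminate the feature whose removal hurts the score least.
    active.remove(max(scores, key=scores.get))

print(sorted(active))
```

Each outer iteration refits the kernel machine on every one-feature-smaller subset, so the criterion is model-based rather than a marginal screening statistic.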
Cites work
- 10.1162/153244303322753706
- 10.1162/153244303322753751
- Fast rates for support vector machines using Gaussian kernels
- Feature elimination in kernel machines in moderately high dimensions
- Feature screening via distance correlation learning
- Gene selection for cancer classification using support vector machines
- Optimal aggregation of classifiers in statistical learning
- Principal component analysis
- Support Vector Machines
- Support vector machines with adaptive \(L_q\) penalty
- Support vector regression for right censored data
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- The doubly regularized support vector machine
- Universal kernels
- Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination
Cited in (11)
- Sequential safe feature elimination rule for \(L_1\)-regularized regression with Kullback-Leibler divergence
- Feature elimination in kernel machines in moderately high dimensions
- scientific article; zbMATH DE number 5887324
- Receiver operating characteristic curves and confidence bands for support vector machines
- Kernel variable selection for multicategory support vector machines
- Efficient kernel-based variable selection with sparsistency
- Structure learning via unstructured kernel-based M-estimation
- High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
- Safe feature elimination for non-negativity constrained convex optimization
- Sparse nonparametric regression with regularized tensor product kernel
- Net benefit index: Assessing the influence of a biomarker for individualized treatment rules
This page was built for publication: Feature elimination in kernel machines in moderately high dimensions
MaRDI item Q1731769