An error bound for L₁-norm support vector machine coefficients in ultra-high dimension
From MaRDI portal
Publication: 2958600
Recommendations
- Divide-and-conquer for debiased \(l_1\)-norm support vector machine in ultra-high dimensions
- On the sparseness of 1-norm support vector machines
- On \(L_1\)-Norm Multiclass Support Vector Machines
- Variable selection for support vector machines in moderately high dimensions
- The doubly regularized support vector machine
Cites work
- scientific article; zbMATH DE number 5957245 (no title available)
- scientific article; zbMATH DE number 823069 (no title available)
- A Bahadur representation of the linear support vector machine
- A consistent information criterion for support vector machines in diverging model spaces
- A constrained \(\ell _{1}\) minimization approach to sparse precision matrix estimation
- Best subset selection, persistence in high-dimensional statistical learning and optimization under l₁ constraint
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- Covering a sphere with spheres
- Dimension reduction in regression without matrix inversion
- Extended Bayesian information criteria for model selection with large model spaces
- Gene selection for cancer classification using support vector machines
- High-dimensional generalized linear models and the lasso
- Lasso-type recovery of sparse representations for high-dimensional data
- Nearly unbiased variable selection under minimax concave penalty
- New Bounds for Restricted Isometry Constants
- New volume ratio properties for convex symmetric bodies in \({\mathbb{R}}^ n\)
- On \(L_1\)-Norm Multiclass Support Vector Machines
- Oracle properties of SCAD-penalized support vector machine
- Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension
- Simultaneous analysis of Lasso and Dantzig selector
- Support vector machines with a reject option
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
Cited in (18)
- Are Latent Factor Regression and Sparse Regression Adequate?
- Rank-Based Greedy Model Averaging for High-Dimensional Survival Data
- Learning rates for partially linear support vector machine in high dimensions
- Tropical Support Vector Machines: Evaluations and Extension to Function Spaces
- Non-asymptotic Analysis of \(\ell_1\)-norm Support Vector Machines
- Sparse concordance-assisted learning for optimal treatment decision
- Exploring the trade-off between generalization and empirical errors in a one-norm SVM
- scientific article; zbMATH DE number 7415078 (no title available)
- The backbone method for ultra-high dimensional sparse machine learning
- Distributed inference for linear support vector machine
- The statistical rate for support matrix machines under low rankness and row (column) sparsity
- Support vector machine in big data: smoothing strategy and adaptive distributed inference
- Quadratic surface support vector machine with L1 norm regularization
- Best subset selection for high-dimensional non-smooth models using iterative hard thresholding
- Sparse additive support vector machines in bounded variation space
- Non-convex penalized multitask regression using data depth-based penalties
- Divide-and-conquer for debiased \(l_1\)-norm support vector machine in ultra-high dimensions
- High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
This page was built for publication: An error bound for \(L_1\)-norm support vector machine coefficients in ultra-high dimension