Another look at linear programming for feature selection via methods of regularization
From MaRDI portal
Publication: 746339
DOI: 10.1007/s11222-013-9408-2
zbMath: 1322.90047
OpenAlex: W1967306940
MaRDI QID: Q746339
Publication date: 16 October 2015
Published in: Statistics and Computing
Full work available at URL: https://doi.org/10.1007/s11222-013-9408-2
Keywords: quantile regression; simplex method; support vector machines; parametric linear programming; \(l_1\)-norm penalty; grouped regularization; structured learning
Related Items (2)
- Multi-parametric solution-path algorithm for instance-weighted support vector machines
- On support vector machines under a multiple-cost scenario
Uses Software
Cites Work
- A new polynomial-time algorithm for linear programming
- Component selection and smoothing in multivariate nonparametric regression
- The composite absolute penalties family for grouped and hierarchical variable selection
- Least angle regression. (With discussion)
- Support-vector networks
- Pathwise coordinate optimization
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- An exact primal-dual penalty method approach to warmstarting interior-point methods for linear programming
- Piecewise linear regularized solution paths
- Linear Programming Techniques for Regression Analysis
- A New Unblocking Technique to Warmstart Interior Point Methods Based on Sensitivity Analysis
- Structured multicategory support vector machines with analysis of variance decomposition
- Least Absolute Deviations Curve-Fitting
- On the Implementation of a Primal-Dual Interior Point Method
- Regression Quantiles
- A new approach to variable selection in least squares problems
- Sparsity and Smoothness Via the Fused Lasso
- L 1-Regularization Path Algorithm for Generalized Linear Models
- A Technique for Resolving Degeneracy in Linear Programming
- Parametric Objective Function (Part 1)
- Parametric Objective Function (Part 2)—Generalization
- Model Selection and Estimation in Regression with Grouped Variables
- Simultaneous Regression Shrinkage, Variable Selection, and Supervised Clustering of Predictors with OSCAR
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- An Improved Algorithm for Discrete $l_1 $ Linear Approximation
- The elements of statistical learning. Data mining, inference, and prediction
- Linear programming. Foundations and extensions
- Structural modelling with sparse kernels
This page was built for publication: Another look at linear programming for feature selection via methods of regularization