Feature selection in SVM via polyhedral \(k\)-norm
From MaRDI portal
Publication: Q2300635
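For context, the vector \(k\)-norm in the title is the polyhedral norm given by the sum of the \(k\) largest absolute components of a vector; it interpolates between \(\|\cdot\|_\infty\) (for \(k=1\)) and \(\|\cdot\|_1\) (for \(k=n\)). A minimal illustrative sketch (the function name `k_norm` is ours, not from the publication):

```python
import numpy as np

def k_norm(x, k):
    """Vector k-norm: sum of the k largest absolute components of x.
    Polyhedral: k=1 gives the max-norm, k=len(x) gives the 1-norm."""
    a = np.sort(np.abs(np.asarray(x, dtype=float)))[::-1]  # |x_i| descending
    return a[:k].sum()

x = [3.0, -1.0, 2.0, 0.5]
print(k_norm(x, 1))  # 3.0  (max-norm)
print(k_norm(x, 2))  # 5.0  (3 + 2)
print(k_norm(x, 4))  # 6.5  (1-norm)
```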
Recommendations
- A DC programming approach for feature selection in support vector machines learning
- DCA based algorithms for feature selection in multi-class support vector machine
- Sparse high-dimensional fractional-norm support vector machine via DC programming
- Feature selection in machine learning: an exact penalty approach using a difference of convex function algorithm
- D.C. programming for sparse proximal support vector machines
Cites work
- scientific article; zbMATH DE number 823069
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 6438182
- scientific article; zbMATH DE number 3308846
- doi:10.1162/153244303322753616
- doi:10.1162/153244303322753751
- A DC programming approach for feature selection in support vector machines learning
- A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes
- A unified view of exact continuous penalties for \(\ell_2\)-\(\ell_0\) minimization
- Accelerated block-coordinate relaxation for regularized optimization
- Concave programming for minimizing the zero-norm over polyhedral sets
- DC formulations and algorithms for sparse optimization problems
- Exact Penalty Functions in Constrained Optimization
- Feature Selection via Mathematical Programming
- Feature selection for support vector machines via mixed integer linear programming
- Feature selection for unsupervised learning
- Global optimality conditions for nonconvex optimization
- Integer programming models for feature selection: new extensions and a randomized solution algorithm
- Lagrangian relaxation for SVM feature selection
- Linear best approximation using a class of polyhedral norms
- Minimizing Piecewise-Concave Functions Over Polyhedra
- Minimizing nonsmooth DC functions via successive DC piecewise-affine approximations
- On the Moreau-Yosida regularization of the vector \(k\)-norm related functions
- On the approximability of minimizing nonzero variables or unsatisfied relations in linear systems
- Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices
- Recovering Sparse Signals With a Certain Family of Nonconvex Penalties and DC Programming
- Regularization and Variable Selection Via the Elastic Net
- Sensitivity analysis of all eigenvalues of a symmetric matrix
- Sparse learning via Boolean relaxations
- The DC (Difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems
- The doubly regularized support vector machine
Cited in (21 documents)
- Essentials of numerical nonsmooth optimization
- Sparse optimization via vector \(k\)-norm and DC programming with an application to feature selection for support vector machines
- Sum-of-squares relaxations in robust DC optimization and feature selection
- Proximal operator and optimality conditions for ramp loss SVM
- scientific article; zbMATH DE number 1974047
- A three-operator splitting algorithm with deviations for generalized DC programming
- Dual formulation of the sparsity constrained optimization problem: application to classification
- doi:10.1162/153244303322753751
- Difference of Convex programming in adversarial SVM
- Solving nonnegative sparsity-constrained optimization via DC quadratic-piecewise-linear approximations
- Convex optimization for group feature selection in networked data
- Lagrangian relaxation for SVM feature selection
- Deforming \(\|\cdot\|_1\) into \(\|\cdot\|_{\infty}\) via polyhedral norms: a pedestrian approach
- Feature selection for linear SVMs under uncertain data: robust optimization based on difference of convex functions algorithms
- Feature selection in machine learning: an exact penalty approach using a difference of convex function algorithm
- \(l_{0}\)-norm based structural sparse least square regression for feature selection
- DCA for Gaussian kernel support vector machines with feature selection
- The \(F_{\infty}\)-norm support vector machine
- Polytopal balls arising in optimization
- DCA based algorithms for feature selection in multi-class support vector machine
- A DC programming approach for feature selection in support vector machines learning
This page was built for publication: Feature selection in SVM via polyhedral \(k\)-norm