Feature selection in SVM via polyhedral k-norm
DOI: 10.1007/s11590-019-01482-1 · zbMATH Open: 1433.90133 · OpenAlex: W2908760699 · Wikidata: Q127229552 · MaRDI QID: Q2300635
Authors: Manlio Gaudioso, E. Gorgone, Jean-Baptiste Hiriart-Urruty
Publication date: 27 February 2020
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-019-01482-1
Recommendations
- A DC programming approach for feature selection in support vector machines learning
- DCA based algorithms for feature selection in multi-class support vector machine
- Sparse high-dimensional fractional-norm support vector machine via DC programming
- Feature selection in machine learning: an exact penalty approach using a difference of convex function algorithm
- D.C. programming for sparse proximal support vector machines
MSC classification:
- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
- 90C26 Nonconvex programming, global optimization
- 90C27 Combinatorial optimization
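The object named in the title, the vector \(k\)-norm, is the sum of the \(k\) largest absolute components of a vector; it is polyhedral (piecewise linear), equals the \(\ell_1\)-norm for \(k = n\) and the \(\ell_\infty\)-norm for \(k = 1\), and serves in this line of work as a surrogate for the \(\ell_0\) pseudo-norm in sparse SVM feature selection. A minimal illustrative sketch of the norm itself (not the authors' code):

```python
def k_norm(x, k):
    """Polyhedral vector k-norm: sum of the k largest absolute entries of x."""
    mags = sorted((abs(v) for v in x), reverse=True)
    return sum(mags[:k])

x = [3.0, -1.0, 0.5, -4.0]
print(k_norm(x, 1))  # 4.0  (l-infinity norm)
print(k_norm(x, 2))  # 7.0
print(k_norm(x, 4))  # 8.5  (l1 norm)
```

For feature selection, penalizing the gap \(\|w\|_1 - \|w\|_{[k]}\) drives all but \(k\) components of the SVM weight vector toward zero while keeping the penalty polyhedral.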
Cites Work
- DOI: 10.1162/153244303322753616
- Title not available
- Title not available
- Regularization and Variable Selection Via the Elastic Net
- Title not available
- Global optimality conditions for nonconvex optimization
- Accelerated block-coordinate relaxation for regularized optimization
- Recovering Sparse Signals With a Certain Family of Nonconvex Penalties and DC Programming
- Title not available
- The DC (Difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems
- Sensitivity analysis of all eigenvalues of a symmetric matrix
- Exact Penalty Functions in Constrained Optimization
- DOI: 10.1162/153244303322753751
- A DC programming approach for feature selection in support vector machines learning
- Concave programming for minimizing the zero-norm over polyhedral sets
- Integer programming models for feature selection: new extensions and a randomized solution algorithm
- Feature selection for support vector machines via mixed integer linear programming
- On the approximability of minimizing nonzero variables or unsatisfied relations in linear systems
- Feature Selection via Mathematical Programming
- Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices
- Feature selection for unsupervised learning
- On the Moreau-Yosida regularization of the vector \(k\)-norm related functions
- The doubly regularized support vector machine
- Minimizing Piecewise-Concave Functions Over Polyhedra
- A unified view of exact continuous penalties for \(\ell_2\)-\(\ell_0\) minimization
- Minimizing nonsmooth DC functions via successive DC piecewise-affine approximations
- A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes
- Linear best approximation using a class of polyhedral norms
- DC formulations and algorithms for sparse optimization problems
- Lagrangian relaxation for SVM feature selection
- Sparse learning via Boolean relaxations
Cited In (21)
- Sum-of-squares relaxations in robust DC optimization and feature selection
- Proximal operator and optimality conditions for ramp loss SVM
- A three-operator splitting algorithm with deviations for generalized DC programming
- Title not available
- Dual formulation of the sparsity constrained optimization problem: application to classification
- DOI: 10.1162/153244303322753751
- Difference of Convex programming in adversarial SVM
- Solving nonnegative sparsity-constrained optimization via DC quadratic-piecewise-linear approximations
- Convex optimization for group feature selection in networked data
- Deforming \(\|.\|_1\) into \(\|.\|_{\infty}\) via polyhedral norms: a pedestrian approach
- Lagrangian relaxation for SVM feature selection
- Feature selection for linear SVMs under uncertain data: robust optimization based on difference of convex functions algorithms
- Feature selection in machine learning: an exact penalty approach using a difference of convex function algorithm
- DCA for Gaussian kernel support vector machines with feature selection
- \(l_{0}\)-norm based structural sparse least square regression for feature selection
- The \(F_{\infty}\)-norm support vector machine
- Polytopal balls arising in optimization
- DCA based algorithms for feature selection in multi-class support vector machine
- A DC programming approach for feature selection in support vector machines learning
- Essentials of numerical nonsmooth optimization
- Sparse optimization via vector \(k\)-norm and DC programming with an application to feature selection for support vector machines