Feature selection in SVM via polyhedral \(k\)-norm
From MaRDI portal
Publication:2300635
DOI: 10.1007/s11590-019-01482-1 · zbMath: 1433.90133 · OpenAlex: W2908760699 · Wikidata: Q127229552 · Scholia: Q127229552 · MaRDI QID: Q2300635
Jean-Baptiste Hiriart-Urruty, Enrico Gorgone, Manlio Gaudioso
Publication date: 27 February 2020
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-019-01482-1
Mathematics Subject Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Nonconvex programming, global optimization (90C26)
- Combinatorial optimization (90C27)
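For context on the object named in the title: the polyhedral (vector) \(k\)-norm of \(x \in \mathbb{R}^n\) is the sum of the \(k\) largest absolute components of \(x\); it interpolates between the \(\infty\)-norm (\(k = 1\)) and the \(1\)-norm (\(k = n\)), which makes it useful as a sparsity-inducing surrogate in feature selection. The following minimal Python sketch (illustrative only, not code from the paper) computes it directly from that definition:

```python
def k_norm(x, k):
    """Polyhedral (vector) k-norm: sum of the k largest |x_i|."""
    return sum(sorted((abs(v) for v in x), reverse=True)[:k])

x = [3.0, -1.0, 0.5, -2.0]
print(k_norm(x, 1))  # 3.0 -- equals the infinity-norm max|x_i|
print(k_norm(x, 2))  # 5.0 -- sum of the two largest magnitudes
print(k_norm(x, 4))  # 6.5 -- equals the 1-norm sum|x_i|
```

Since \(\|x\|_{[k]}\) is a maximum of linear functions of \(|x_i|\), its unit ball is a polytope, hence the name "polyhedral norm".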
Related Items (8)
- Proximal operator and optimality conditions for ramp loss SVM
- Deforming $||.||_{1}$ into $||.||_{\infty}$ via Polyhedral Norms: A Pedestrian Approach
- Sparse optimization via vector \(k\)-norm and DC programming with an application to feature selection for support vector machines
- Polytopal balls arising in optimization
- Sum-of-squares relaxations in robust DC optimization and feature selection
- A three-operator splitting algorithm with deviations for generalized DC programming
- Essentials of numerical nonsmooth optimization
- Solving nonnegative sparsity-constrained optimization via DC quadratic-piecewise-linear approximations
Cites Work
- Integer programming models for feature selection: new extensions and a randomized solution algorithm
- Feature selection for support vector machines via mixed integer linear programming
- Concave programming for minimizing the zero-norm over polyhedral sets
- Linear best approximation using a class of polyhedral norms
- Global optimality conditions for nonconvex optimization
- On the approximability of minimizing nonzero variables or unsatisfied relations in linear systems
- Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices
- Sensitivity analysis of all eigenvalues of a symmetric matrix
- Lagrangian relaxation for SVM feature selection
- DC formulations and algorithms for sparse optimization problems
- Minimizing nonsmooth DC functions via successive DC piecewise-affine approximations
- The DC (Difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems
- Sparse learning via Boolean relaxations
- A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes
- A DC programming approach for feature selection in support vector machines learning
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- On the Moreau--Yosida Regularization of the Vector $k$-Norm Related Functions
- Feature Selection via Mathematical Programming
- Recovering Sparse Signals With a Certain Family of Nonconvex Penalties and DC Programming
- Exact Penalty Functions in Constrained Optimization
- 10.1162/153244303322753616
- 10.1162/153244303322753751
- Minimizing Piecewise-Concave Functions Over Polyhedra
- Regularization and Variable Selection Via the Elastic Net
- A Unified View of Exact Continuous Penalties for $\ell_2$-$\ell_0$ Minimization