Feature selection in machine learning: an exact penalty approach using a difference of convex function algorithm
From MaRDI portal
Publication: Q890292
DOI: 10.1007/s10994-014-5455-y
zbMath: 1343.68201
OpenAlex: W1979657638
MaRDI QID: Q890292
Hoai Minh Le, Hoai An Le Thi, Tao Pham Dinh
Publication date: 10 November 2015
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-014-5455-y
MSC classifications:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Convex programming (90C25)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- A unified DC programming framework and efficient DCA based approaches for large scale batch reinforcement learning
- A truncated Newton algorithm for nonconvex sparse recovery
- Phase-only transmit beampattern design for large phased array antennas with multi-point nulling
- DC approximation approaches for sparse optimization
- Difference of convex functions algorithms (DCA) for image restoration via a Markov random field model
- DC programming and DCA for solving Brugnano-Casulli piecewise linear systems
- A model-free variable selection method for reducing the number of redundant variables
- A Continuous Exact $\ell_0$ Penalty (CEL0) for Least Squares Regularized Problem
- Zero-norm regularized problems: equivalent surrogates, proximal MM method and statistical error bound
- MAP inference algorithms without approximation for collective graphical models on path graphs via discrete difference of convex algorithm
- A DC Programming Approach for Sparse Estimation of a Covariance Matrix
- Sum-of-squares relaxations in robust DC optimization and feature selection
- A Unified View of Exact Continuous Penalties for $\ell_2$-$\ell_0$ Minimization
- Open issues and recent advances in DC programming and DCA
- Efficient Nonnegative Matrix Factorization by DC Programming and DCA
- Sparse Covariance Matrix Estimation by DCA-Based Algorithms
- Cost-sensitive feature selection for support vector machines
- Reproducing kernels and choices of associated feature spaces, in the form of \(L^2\)-spaces
- A new approach for solving mixed integer DC programs using a continuous relaxation with no integrality gap and smoothing techniques
- DCA based algorithms for feature selection in multi-class support vector machine
- DC programming and DCA: thirty years of developments
- Group Sparse Optimization for Images Recovery Using Capped Folded Concave Functions
- A unified Douglas-Rachford algorithm for generalized DC programming
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Exact penalty and error bounds in DC programming
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- One-step sparse estimates in nonconcave penalized likelihood models
- A bilinear formulation for vector sparsity optimization
- Relaxed Lasso
- On the approximability of minimizing nonzero variables or unsatisfied relations in linear systems
- Solving a class of linearly constrained indefinite quadratic problems by DC algorithms
- Sparse high-dimensional fractional-norm support vector machine via DC programming
- The DC (Difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems
- Asymptotics for Lasso-type estimators
- A new efficient algorithm based on DC programming and DCA for clustering
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- A DC programming approach for feature selection in support vector machines learning
- Optimization based DC programming and DCA for hierarchical clustering
- An affine scaling methodology for best basis selection
- Learning sparse classifiers with difference of convex functions algorithms
- Optimization with Sparsity-Inducing Penalties
- Lower Bound Theory of Nonzero Entries in Solutions of $\ell_2$-$\ell_p$ Minimization
- Sparse representations in unions of bases
- A D.C. Optimization Algorithm for Solving the Trust-Region Subproblem
- The Concave-Convex Procedure
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Recovering Sparse Signals With a Certain Family of Nonconvex Penalties and DC Programming
- DOI: 10.1162/153244303322753751
- Sparse Approximate Solutions to Linear Systems
- Matching pursuits with time-frequency dictionaries
- Survey and Taxonomy of Feature Selection Algorithms in Intrusion Detection System
- Regularization and Variable Selection Via the Elastic Net
- Smoothly Clipped Absolute Deviation on High Dimensions
- Multicategory ψ-Learning
- Combined SVM-based feature selection and classification