Evaluating and selecting features via information theoretic lower bounds of feature inner correlations for high-dimensional data
From MaRDI portal
Publication: 2029330
DOI: 10.1016/j.ejor.2020.09.028
zbMath: 1487.62166
OpenAlex: W3092028662
MaRDI QID: Q2029330
Publication date: 3 June 2021
Published in: European Journal of Operational Research
Full work available at URL: https://doi.org/10.1016/j.ejor.2020.09.028
Mathematics Subject Classification:
- 62H20 Measures of association (correlation, canonical correlation, etc.)
- 68T05 Learning and adaptive systems in artificial intelligence
- 62R07 Statistical aspects of big data and data science
Related Items (1)
Uses Software
Cites Work
- Feature selection for support vector machines using generalized Benders decomposition
- Integer programming models for feature selection: new extensions and a randomized solution algorithm
- Stochastic local search for the FEATURE SET problem, with applications to microarray data
- Selection of relevant features and examples in machine learning
- Theoretical and empirical analysis of ReliefF and RReliefF
- High dimensional data classification and feature selection using support vector machines
- Cost-based feature selection for support vector machines: an application in credit scoring
- Can high-order dependencies improve mutual information based feature selection?
- Advanced conjoint analysis using feature selection via support vector machines
- Lower Bound Theory of Nonzero Entries in Solutions of $\ell_2$-$\ell_p$ Minimization
- Convex Optimization for Group Feature Selection in Networked Data