Is mutual information adequate for feature selection in regression?
DOI: 10.1016/j.neunet.2013.07.003
zbMATH: 1297.68202
OpenAlex: W2165143551
Wikidata: Q45211768 (Scholia: Q45211768)
MaRDI QID: Q460666
Authors: Benoît Frénay, Gauthier Doquire, Michel Verleysen
Publication date: 14 October 2014
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2013.07.003
Mathematics Subject Classification:
- 68T05 Learning and adaptive systems in artificial intelligence
- 94A17 Measures of information, entropy
- 62B10 Statistical aspects of information-theoretic topics
Related Items
- An adaptive heuristic for feature selection based on complementarity
- Quantum-enhanced feature selection with forward selection and backward elimination
- Dimensionality reduction by feature clustering for regression problems
- Image based techniques for crack detection, classification and quantification in asphalt pavement: a review
- A screening-based gradient-enhanced Kriging modeling method for high-dimensional problems
- A novel quality prediction method based on feature selection considering high dimensional product quality data
Cites Work
- A Mathematical Theory of Communication
- Sample estimate of the entropy of a random vector
- Wrappers for feature subset selection
- Mutual Information and Minimum Mean-Square Error in Gaussian Channels
- DOI: 10.1162/153244303322753616
- DOI: 10.1162/153244303322753689