Is mutual information adequate for feature selection in regression?
DOI: 10.1016/j.neunet.2013.07.003 · zbMATH Open: 1297.68202 · DBLP: journals/nn/FrenayDV13 · OpenAlex: W2165143551 · Wikidata: Q45211768 · Scholia: Q45211768 · MaRDI QID: Q460666 · FDO: Q460666
Authors: Benoît Frénay, Gauthier Doquire, Michel Verleysen
Publication date: 14 October 2014
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2013.07.003
Classification (MSC):
- 62B10 Statistical aspects of information-theoretic topics
- 68T05 Learning and adaptive systems in artificial intelligence
- 94A17 Measures of information, entropy
Cites Work
- Wrappers for feature subset selection
- Title not available
- 10.1162/153244303322753616
- Title not available
- A Mathematical Theory of Communication
- Sample estimate of the entropy of a random vector
- Title not available
- Title not available
- Title not available
- Title not available
- Mutual Information and Minimum Mean-Square Error in Gaussian Channels
- 10.1162/153244303322753689
Cited In (20)
- TCMI: a non-parametric mutual-dependence estimator for multivariate continuous distributions
- An adaptive heuristic for feature selection based on complementarity
- Quantum-enhanced feature selection with forward selection and backward elimination
- Increasing and decreasing returns and losses in mutual information feature subset selection
- Estimating mutual information for feature selection in the presence of label noise
- Relevance measures for subset variable selection in regression problems based on \(k\)-additive mutual information
- Conditional likelihood maximisation: a unifying framework for information theoretic feature selection
- Subset selection algorithm based on mutual information
- Feature selection using mutual information and neural networks
- Comments on supervised feature selection by clustering using conditional mutual information-based distances
- Image based techniques for crack detection, classification and quantification in asphalt pavement: a review
- An efficient method for feature selection in linear regression based on an extended Akaike information criterion
- Jointly informative feature selection made tractable by Gaussian modeling
- Maximum relevance minimum common redundancy feature selection for nonlinear data
- A screening-based gradient-enhanced Kriging modeling method for high-dimensional problems
- Feature selection based on statistical estimation of mutual information
- A novel quality prediction method based on feature selection considering high dimensional product quality data
- Dimensionality reduction by feature clustering for regression problems
- Can high-order dependencies improve mutual information based feature selection?
- Feature selection with dynamic mutual information