TCMI: a non-parametric mutual-dependence estimator for multivariate continuous distributions
DOI: 10.1007/s10618-022-00847-y
OpenAlex: W4287898779
Wikidata: Q114859253 (Scholia: Q114859253)
MaRDI QID: Q2097447
FDO: Q2097447
Authors: Benjamin Regler, Matthias Scheffler, Luca M. Ghiringhelli
Publication date: 14 November 2022
Published in: Data Mining and Knowledge Discovery
Full work available at URL: https://arxiv.org/abs/2001.11212
Recommendations
- Can high-order dependencies improve mutual information based feature selection?
- Distribution of mutual information from complete and incomplete data
- Feature Selection for Classificatory Analysis Based on Information-theoretic Criteria
- Feature selection with dynamic mutual information
- Is mutual information adequate for feature selection in regression?
Keywords: feature selection, machine learning, mutual information, information theory, dependence measure, materials science
MSC classifications: Learning and adaptive systems in artificial intelligence (68T05); Estimation in multivariate analysis (62H12); Measures of association (correlation, canonical correlation, etc.) (62H20); Measures of information, entropy (94A17)
Cites Work
- An introduction to statistical learning. With applications in R
- Greedy function approximation: A gradient boosting machine.
- Feature selection toolbox software package
- Measuring and testing dependence by correlation of distances
- Selection of relevant features and examples in machine learning
- Wrappers for feature subset selection
- Title not available
- DOI: 10.1162/153244303322753616
- Elements of Information Theory
- Multivariate adaptive regression splines
- Partial distance correlation with methods for dissimilarities
- Detecting novel associations in large data sets
- Title not available
- On Information and Sufficiency
- A Mathematical Theory of Communication
- Sample estimate of the entropy of a random vector
- Title not available
- Information theoretic measures for clusterings comparison: variants, properties, normalization and correction for chance
- Multivariate extensions of Spearman's rho and related statistics
- A fast and objective multidimensional kernel density estimation method: fastKDE
- Do we need hundreds of classifiers to solve real world classification problems?
- An Automatic Method of Solving Discrete Programming Problems
- On quantifying dependence: a framework for developing interpretable measures
- Title not available
- More on a new concept of entropy and information
- Cumulative Residual Entropy: A New Measure of Information
- Title not available
- On cumulative entropies
- Title not available
- A Branch and Bound Algorithm for Feature Subset Selection
- A Direct Method of Nonparametric Measurement Selection
- Multivariate information transmission
- Title not available
- Title not available
- Learning Boolean concepts in the presence of many irrelevant features
- Title not available
- Information Theoretical Analysis of Multivariate Correlation
- Unsupervised interaction-preserving discretization of multivariate data
- Mathematical statistics. An introduction to likelihood-based inference
- Branch-and-bound algorithms: a survey of recent advances in searching, branching, and pruning
- Robust smoothing of gridded data in one and higher dimensions with missing values
- Reducing the computational cost of the ECF using a nuFFT: a fast and objective probability density estimation method
- Self-Consistent Method for Density Estimation
- Title not available
Cited In (1)
Uses Software