Publication:4998962
From MaRDI portal
Mariusz Kubkowski, Jan Mielniczuk, Paweł Teisseyre
Publication date: 9 July 2021
Full work available at URL: https://jmlr.csail.mit.edu/papers/v22/19-600.html
mutual information; feature selection; interaction information; Möbius representation; conditional independence tests; information-based selection criteria
68T05: Learning and adaptive systems in artificial intelligence
Related Items
Uses Software
Cites Work
- Likelihood Ratio Tests for Model Selection and Non-Nested Hypotheses
- The Hardness of Conditional Independence Testing and the Generalised Covariance Measure
- A consistent characteristic function-based test for conditional independence
- Controlling the false discovery rate via knockoffs
- Towards scalable and data efficient learning of Markov boundaries
- Causation, prediction, and search
- Can high-order dependencies improve mutual information based feature selection?
- Testing conditional independence via empirical likelihood
- Multivariate information transmission
- Efficient Markov Network Structure Discovery Using Independence Tests
- Multiple mutual informations and multiple interactions in frequency data
- Mathematical Statistics
- Characteristic function based testing for conditional independence: a nonparametric regression approach
- Active sets of predictors for misspecified logistic regression
- Forward-Backward Selection with Early Dropping
- Panning for Gold: ‘Model-X’ Knockoffs for High Dimensional Controlled Variable Selection
- Elements of Information Theory
- On the Amount of Information
- [https://portal.mardi4nfdi.de/wiki/Publication:5731810 On the foundations of combinatorial theory I. Theory of Möbius Functions]
- Approximate and Asymptotic Distributions of Chi-Squared–Type Mixtures With Applications
- A Comparison of Alternative Tests of Significance for the Problem of $m$ Rankings