Efficient feature selection using shrinkage estimators
From MaRDI portal
Publication: Q2320553
Recommendations
- Evaluating and selecting features via information theoretic lower bounds of feature inner correlations for high-dimensional data
- Conditional likelihood maximisation: a unifying framework for information theoretic feature selection
- Feature Selection for Classificatory Analysis Based on Information-theoretic Criteria
- Can high-order dependencies improve mutual information based feature selection?
- Jointly informative feature selection made tractable by Gaussian modeling
Cites work
- scientific article; zbMATH DE number 5957250 (no title available)
- scientific article; zbMATH DE number 992993 (no title available)
- scientific article; zbMATH DE number 6107964 (no title available)
- scientific article; zbMATH DE number 6122810 (no title available)
- DOI: 10.1162/153244303322753616
- DOI: 10.1162/153244303322753670
- Bayesian and quasi-Bayesian estimators for mutual information from discrete data
- Bayesian inference for categorical data analysis
- Bayesian network structure learning with permutation tests
- Can high-order dependencies improve mutual information based feature selection?
- Conditional likelihood maximisation: a unifying framework for information theoretic feature selection
- Dealing with under-reported variables: an information theoretic solution
- Elements of Information Theory
- Entropy inference and the James-Stein estimator, with application to nonlinear gene association networks
- Estimation of Entropy and Mutual Information
- Estimation with quadratic loss
- Local causal and Markov blanket induction for causal discovery and feature selection for classification. Part I: Algorithms and empirical evaluation
- Simple strategies for semi-supervised feature selection
- Some data analyses using mutual information
- Statistical significance of combinatorial regulations
Cited in (8)
- scientific article; zbMATH DE number 2042277 (no title available)
- Optimizing the relevance-redundancy tradeoff for efficient semantic segmentation
- Feature Selection for Ridge Regression with Provable Guarantees
- Correction to: "Efficient feature selection using shrinkage estimators"
- Feature selection based on fuzzy joint mutual information maximization
- Least loss: a simplified filter method for feature selection
- Squared error-based shrinkage estimators of discrete probabilities and their application to variable selection
- Feature selection and threshold method based on fuzzy joint mutual information
This page was built for publication: Efficient feature selection using shrinkage estimators