Entropy inference and the James-Stein estimator, with application to nonlinear gene association networks
zbMATH Open 1235.62006 · arXiv 0811.3579 · MaRDI QID Q2880934
Authors: Jean Hausser, Korbinian Strimmer
Publication date: 17 April 2012
Published in: Journal of Machine Learning Research (JMLR)
Full work available at URL: https://arxiv.org/abs/0811.3579
Mathematics Subject Classification:
- 62B10 Statistical aspects of information-theoretic topics
- 62P10 Applications of statistics to biology and medical sciences; meta analysis
- 92D10 Genetics and epigenetics
- 92C42 Systems biology, networks
Cited In (14)
- A model-free Bayesian classifier
- Efficient feature selection using shrinkage estimators
- Feature selection in omics prediction problems using cat scores and false nondiscovery rate control
- Fingerprinting and reconstruction of functionals of discrete time Markov chains
- Power transformations of relative count data as a shrinkage problem
- Adversarial orthogonal regression: two non-linear regressions for causal inference
- On the permutation entropy Bayesian estimation
- Maximum relevance minimum common redundancy feature selection for nonlinear data
- Empirical Bayes predictive densities for high-dimensional normal models
- Validity limits of the maximum entropy method
- An improved estimator of Shannon entropy with applications to systems with memory
- Squared error-based shrinkage estimators of discrete probabilities and their application to variable selection
- Good-Turing frequency estimation in a finite population
- Dimensionality reduction by feature clustering for regression problems