Binomial and Poisson distributions as maximum entropy distributions
DOI: 10.1109/18.930936
zbMATH Open: 0999.94012
DBLP: journals/tit/Harremoes01
OpenAlex: W2146094904
Wikidata: Q57380832 (Scholia: Q57380832)
MaRDI QID: Q4544641
FDO: Q4544641
Publication date: 4 August 2002
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://semanticscholar.org/paper/592db85a665b7f9cf40eaf580147289754303576
Recommendations
- scientific article; zbMATH DE number 6303566
- Log-concavity and the maximum entropy property of the Poisson distribution
- Probability distributions and the maximum entropy principle
- Maximum entropy versus minimum risk and applications to some classical discrete distributions
- scientific article; zbMATH DE number 4007340
Keywords: entropy; information divergence; generalized Poisson distribution; generalized binomial distribution; Poisson's law
MSC classifications: Characterization and structure theory of statistical distributions (62E10); Measures of information, entropy (94A17)
Cited In (26)
- A novel probabilistic contrast-based complex salient object detection
- Ultra log-concavity of discrete order statistics
- Relative log-concavity and a pair of triangle inequalities
- Strong Log-concavity is Preserved by Convolution
- Convexity and robustness of the Rényi entropy
- Log-concavity and the maximum entropy property of the Poisson distribution
- Confidence-based optimisation for the newsvendor problem under binomial, Poisson and exponential demand
- Title not available
- Title not available
- A proof of the Shepp-Olkin entropy monotonicity conjecture
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Stochastic single-allocation hub location
- Complete monotonicity of some entropies
- Curing ill-Conditionality via Representation-Agnostic Distance-Driven Perturbations
- When the a contrario approach becomes generative
- Bernoulli sums and Rényi entropy inequalities
- On the Maximum Entropy of a Sum of Independent Discrete Random Variables
- Nonuniform bounds in the Poisson approximation with applications to informational distances. II
- Conjugate predictive distributions and generalized entropies
- Convergence of Markov chains in information divergence
- Detecting topology change via correlations and entanglement from gauge/gravity correspondence
- Title not available
- Entropy and the discrete central limit theorem
- Stam inequality on \(\mathbb Z_n\)
- A note on the inflated-parameter binomial distribution
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures