Equivalence of Gauss's Principle and Minimum Discrimination Information Estimation of Probabilities
From MaRDI portal
Publication:5594931
DOI: 10.1214/AOMS/1177696977
zbMATH Open: 0198.23501
OpenAlex: W1964009583
MaRDI QID: Q5594931
FDO: Q5594931
Authors: L. Lorne Campbell
Publication date: 1970
Published in: Annals of Mathematical Statistics
Full work available at URL: https://doi.org/10.1214/aoms/1177696977
Cited In (16)
- Kullback-Leibler average, consensus on probability densities, and distributed state estimation with guaranteed stability
- Assessing systemic risk due to fire sales spillover through maximum entropy network reconstruction
- A generalization of Poincaré's characterization of exponential families
- Maximum likelihood characterization of distributions
- The principle of maximum entropy
- Derivation of quantum statistics from Gauss' principle and the second law
- On variational inference and maximum likelihood estimation with the \(\lambda\)-exponential family
- A characterization of the compound multiparameter Hermite gamma distribution via Gauss's principle
- An information theoretic argument for the validity of the exponential model
- Minimum of h-Divergence and Maximum Likelihood Estimator
- Joint probabilities under expected value constraints, transportation problems, maximum entropy in the mean
- Marginal probability distribution determined by the maximum entropy method.
- Some characterization theorems for generalized measures of uncertainty and information
- A note on the harmonic law: a two-parameter family of distributions for ratios
- Construction of contingency tables by maximum entropy in the mean
- Quasicyclic symmetry and the directed divergence in information theory