Equivalence of Gauss's Principle and Minimum Discrimination Information Estimation of Probabilities
From MaRDI portal
Publication:5594931
Cited in (16)
- On variational inference and maximum likelihood estimation with the \(\lambda\)-exponential family
- Quasicyclic symmetry and the directed divergence in information theory
- An information theoretic argument for the validity of the exponential model
- Some characterization theorems for generalized measures of uncertainty and information
- A characterization of the compound multiparameter Hermite gamma distribution via Gauss's principle
- Joint probabilities under expected value constraints, transportation problems, maximum entropy in the mean
- Maximum likelihood characterization of distributions
- A generalization of Poincaré's characterization of exponential families
- The principle of maximum entropy
- Assessing systemic risk due to fire sales spillover through maximum entropy network reconstruction
- Marginal probability distribution determined by the maximum entropy method
- Construction of contingency tables by maximum entropy in the mean
- Kullback-Leibler average, consensus on probability densities, and distributed state estimation with guaranteed stability
- Minimum of \(h\)-divergence and maximum likelihood estimator
- A note on the harmonic law: a two-parameter family of distributions for ratios
- Derivation of quantum statistics from Gauss' principle and the second law