Maximization of the information divergence from multinomial distributions
From MaRDI portal
Publication:3171430
Recommendations
- Optimality conditions for maximizers of the information divergence from an exponential family
- Maximizing multi-information
- Maximal Information Divergence from Statistical Models Defined by Neural Networks
- Factorized mutual information maximization
- Binomial and Poisson distributions as maximum entropy distributions
Cited in (10)
- Converting information into probability measures with the Kullback-Leibler divergence
- An information-geometric approach to a theory of pragmatic structuring
- Maximal Information Divergence from Statistical Models Defined by Neural Networks
- Optimally approximating exponential families
- Factorized mutual information maximization
- Maximizing the divergence from a hierarchical model of quantum states
- The Pólya information divergence
- Optimality conditions for maximizers of the information divergence from an exponential family
- Scientific article (zbMATH DE number 933211; no title available)
- Maximizing multi-information
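The divergence in the publication's title is the Kullback-Leibler (information) divergence between discrete probability distributions. As a purely illustrative sketch (not taken from the publication itself), the following computes D(p || q) for two distributions on a finite alphabet; the example distributions are hypothetical.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    Terms with p_i = 0 contribute 0 (by convention 0 * log 0 = 0);
    requires q_i > 0 wherever p_i > 0, otherwise the divergence is infinite.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example: divergence of a point mass from the uniform distribution
# on four outcomes; the maximizing distributions studied in this line
# of work are typically such extreme (low-support) distributions.
p = [1.0, 0.0, 0.0, 0.0]
q = [0.25, 0.25, 0.25, 0.25]
print(kl_divergence(p, q))  # log 4 ≈ 1.386
```

Maximizing this quantity over p, with q ranging over a model family such as the multinomial distributions, is the problem the listed publications address.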