Maximization of the information divergence from multinomial distributions
Publication: 3171430
zbMATH Open: 1356.62007 · MaRDI QID: Q3171430
Authors: Jozef Juríček
Publication date: 5 October 2011
Full work available at URL: http://hdl.handle.net/10338.dmlcz/143665
Recommendations
- Optimality conditions for maximizers of the information divergence from an exponential family
- Maximizing multi-information
- Maximal Information Divergence from Statistical Models Defined by Neural Networks
- Factorized mutual information maximization.
- Binomial and Poisson distributions as maximum entropy distributions
Keywords: hierarchical models; multinomial distribution; exponential family; relative entropy; information projection; multi-information; information divergence
Statistical aspects of information-theoretic topics (62B10) Learning and adaptive systems in artificial intelligence (68T05)
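The keywords above concern maximizing the information divergence (relative entropy) from a statistical model; the "multi-information" keyword refers to the special case where the model is the family of product (independent) distributions. A minimal sketch of that special case for two binary variables is given below; the function name `multi_information` and the NumPy-based implementation are illustrative assumptions, not code from the publication.

```python
import numpy as np

def multi_information(p):
    """Information divergence D(p || p1 x p2) of a joint distribution p
    (given as a 2-D array) from its information projection onto the
    independence model, i.e. the product of its marginals.
    For two variables this equals the mutual information."""
    p = np.asarray(p, dtype=float)
    p1 = p.sum(axis=1, keepdims=True)  # marginal of the first variable
    p2 = p.sum(axis=0, keepdims=True)  # marginal of the second variable
    q = p1 * p2                        # projection onto independence
    mask = p > 0                       # 0 * log 0 is taken as 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# A perfectly correlated pair of binary variables maximizes the
# divergence among 2x2 joint distributions, attaining log 2:
p_max = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(multi_information(p_max))  # log 2 ~ 0.6931
```

For the uniform product distribution the divergence is 0, since that distribution already lies in the independence model.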
Cited In (10)
- The Pólya information divergence
- Maximizing the divergence from a hierarchical model of quantum states
- Converting information into probability measures with the Kullback-Leibler divergence
- Factorized mutual information maximization.
- Maximal Information Divergence from Statistical Models Defined by Neural Networks
- Maximizing multi-information
- An information-geometric approach to a theory of pragmatic structuring
- Optimally approximating exponential families
- Optimality conditions for maximizers of the information divergence from an exponential family