Converting information into probability measures with the Kullback-Leibler divergence
From MaRDI portal
Publication:1925991
DOI: 10.1007/s10463-012-0350-4 · zbMath: 1253.62005 · MaRDI QID: Q1925991
Pier Giovanni Bissiri, Stephen G. Walker
Publication date: 27 December 2012
Published in: Annals of the Institute of Statistical Mathematics
Full work available at URL: https://doi.org/10.1007/s10463-012-0350-4
62C10: Bayesian problems; characterization of Bayes procedures
62B10: Statistical aspects of information-theoretic topics
62C05: General considerations in statistical decision theory
Related Items
- Bayesian inference with misspecified models
- On Bayesian learning via loss functions
- On Bayesian learning from Bernoulli observations
- Approximate models and robust decisions
- On general Bayesian inference using loss functions
Cites Work
- On Bayesian learning from Bernoulli observations
- Statistical decision theory. Foundations, concepts, and methods
- Expected information as expected utility
- Remarks on the measurement of subjective probability and information
- $\alpha$-Divergence Is Unique, Belonging to Both $f$-Divergence and Bregman Divergence Classes