Converting information into probability measures with the Kullback-Leibler divergence
DOI: 10.1007/s10463-012-0350-4 · zbMATH Open: 1253.62005 · OpenAlex: W2065760629 · MaRDI QID: Q1925991
Authors: Stephen G. Walker, Pier Giovanni Bissiri
Publication date: 27 December 2012
Published in: Annals of the Institute of Statistical Mathematics
Full work available at URL: https://doi.org/10.1007/s10463-012-0350-4
Classifications:
- Statistical aspects of information-theoretic topics (62B10)
- Bayesian problems; characterization of Bayes procedures (62C10)
- General considerations in statistical decision theory (62C05)
Cites Work
- Title not available
- Title not available
- Title not available
- Expected information as expected utility
- Statistical decision theory. Foundations, concepts, and methods
- Remarks on the measurement of subjective probability and information
- $\alpha$-Divergence Is Unique, Belonging to Both $f$-Divergence and Bregman Divergence Classes
- On Bayesian learning from Bernoulli observations
Cited In (8)
- Bayesian inference with misspecified models
- Approximate models and robust decisions
- Kullback-Leibler information measure for studying convergence rates of densities and distributions
- On Bayesian learning via loss functions
- On Bayesian learning from Bernoulli observations
- The Kullback–Leibler Divergence Rate Between Markov Sources
- On general Bayesian inference using loss functions
- Title not available
Recommendations
- Title not available
- Rényi Divergence and Kullback-Leibler Divergence
- Kullback-Leibler informational measure in the distribution density estimation problem
- A generalization of the Kullback–Leibler divergence and its properties
- Kullback-Leibler information measure for studying convergence rates of densities and distributions
- Kullback-Leibler information and interval estimation
- On information gain, Kullback-Leibler divergence, entropy production and the involution kernel
- Normalized information-based divergences