A Bayesian characterization of relative entropy

From MaRDI portal
Publication:2877683

zbMATH Open: 1321.94023 · arXiv: 1402.3067 · MaRDI QID: Q2877683 · FDO: Q2877683


Authors: John Baez, Tobias Fritz


Publication date: 25 August 2014

Published in: Theory and Applications of Categories

Abstract: We give a new characterization of relative entropy, also known as the Kullback-Leibler divergence. We use a number of interesting categories related to probability theory. In particular, we consider a category FinStat where an object is a finite set equipped with a probability distribution, while a morphism is a measure-preserving function f : X → Y together with a stochastic right inverse s : Y → X. The function f can be thought of as a measurement process, while s provides a hypothesis about the state of the measured system given the result of a measurement. Given this data we can define the entropy of the probability distribution on X relative to the "prior" given by pushing the probability distribution on Y forwards along s. We say that s is "optimal" if these distributions agree. We show that any convex linear, lower semicontinuous functor from FinStat to the additive monoid [0,∞] which vanishes when s is optimal must be a scalar multiple of this relative entropy. Our proof is independent of all earlier characterizations, but inspired by the work of Petz.
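The functional characterized in the abstract can be made concrete with a small numerical sketch. The code below (a hypothetical illustration, not from the paper) computes the Kullback-Leibler divergence D(p‖q) = Σᵢ pᵢ log(pᵢ/qᵢ) and builds a toy FinStat morphism: a measure-preserving map f on finite sets, a stochastic right inverse s, and the "prior" on X obtained by pushing the distribution on Y forward along s. The specific sets, distributions, and the matrix s are invented for illustration.

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i).

    Conventions: terms with p_i = 0 contribute 0; the divergence is
    +infinity if p_i > 0 while q_i = 0 (p is not absolutely continuous
    with respect to q).
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue
        if qi == 0:
            return math.inf
        total += pi * math.log(pi / qi)
    return total

# Toy morphism in FinStat: X = {0, 1, 2}, Y = {0, 1}, where
# f collapses {0, 1} to 0 and sends 2 to 1 (a coarse "measurement").
p = [0.3, 0.3, 0.4]   # distribution on X
q = [0.6, 0.4]        # pushforward f_*(p) on Y, so f is measure-preserving

# A stochastic right inverse s : Y -> X (the "hypothesis"): on seeing
# outcome 0, guess state 0 or 1 with equal probability; on seeing 1, guess 2.
# Row y gives the distribution s(y) on X.
s = [[0.5, 0.5, 0.0],
     [0.0, 0.0, 1.0]]

# The "prior" on X obtained by pushing q forward along s.
prior = [sum(q[y] * s[y][x] for y in range(len(q))) for x in range(len(p))]

# Here prior == p, so this s is "optimal" and the relative entropy vanishes.
print(relative_entropy(p, prior))  # 0.0
```

Replacing s with a worse hypothesis, e.g. s(0) = (0.9, 0.1, 0), yields a prior different from p and a strictly positive relative entropy, matching the abstract's condition that the functor vanishes exactly when s is optimal.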


Full work available at URL: https://arxiv.org/abs/1402.3067









Cited In (14)





