On Bayesian learning from Bernoulli observations
DOI: 10.1016/J.JSPI.2010.05.023 · zbMATH Open: 1205.62023 · arXiv: 0902.2544 · OpenAlex: W2095488896 · MaRDI QID: Q989280
Authors: Pier Giovanni Bissiri, Stephen G. Walker
Publication date: 19 August 2010
Published in: Journal of Statistical Planning and Inference
Abstract: We provide a reason for Bayesian updating, in the Bernoulli case, even when it is assumed that observations are independent and identically distributed with a fixed but unknown parameter θ₀. The motivation relies on the use of loss functions and asymptotics. Such a justification is important due to the recent interest and focus on Bayesian consistency, which indeed assumes that the observations are independent and identically distributed rather than being conditionally independent with a joint distribution depending on the choice of prior.
Full work available at URL: https://arxiv.org/abs/0902.2544
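Since the abstract turns on justifying the Bayesian update through loss functions, a small numerical sketch may help. The Python snippet below is an illustration, not code from the paper: it assumes the self-information loss ℓ(θ, x) = −log f(x | θ) and a Beta prior (both hypothetical choices for this example), and checks that the loss-based update exp(−Σᵢ ℓ(θ, xᵢ)) × prior(θ), normalized on a grid, matches the conjugate Beta posterior for Bernoulli data.

```python
import numpy as np
from scipy import stats

# Hypothetical sketch (not code from the paper): for i.i.d. Bernoulli
# observations with the self-information loss l(theta, x) = -log f(x | theta),
# the loss-based update
#     posterior(theta)  ∝  exp(-sum_i l(theta, x_i)) * prior(theta)
# coincides with standard Bayesian updating; with a Beta(a, b) prior it
# gives the conjugate Beta(a + successes, b + failures) posterior.

rng = np.random.default_rng(0)
theta_true = 0.3
x = rng.binomial(1, theta_true, size=50)    # i.i.d. Bernoulli(theta_true) data
s, n = int(x.sum()), len(x)                 # successes and sample size

a, b = 1.0, 1.0                             # Beta(1, 1) (uniform) prior
grid = np.linspace(1e-4, 1 - 1e-4, 1001)    # grid over the parameter space
dx = grid[1] - grid[0]

# Loss-based update on the grid: exp(-cumulative loss) * prior, normalized.
total_loss = -(s * np.log(grid) + (n - s) * np.log(1 - grid))
unnorm = np.exp(-total_loss) * stats.beta.pdf(grid, a, b)
post_loss = unnorm / (unnorm.sum() * dx)

# Conjugate closed-form posterior for comparison.
post_conj = stats.beta.pdf(grid, a + s, b + n - s)

# The difference is ≈ 0 up to grid discretization: the two updates agree.
print(np.max(np.abs(post_loss - post_conj)))
```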
Cites Work
- Title not available
- On Information and Sufficiency
- Title not available
- Title not available
- Title not available
- Title not available
- Converting information into probability measures with the Kullback-Leibler divergence
- Sufficient conditions for Bayesian consistency
Cited In (7)
- Title not available
- Bayesian Representation of Stochastic Processes under Learning: de Finetti Revisited
- On Bayesian learning via loss functions
- Converting information into probability measures with the Kullback-Leibler divergence
- On general Bayesian inference using loss functions
- Title not available
- Bayesian learning for a class of priors with prescribed marginals