On Bayesian learning from Bernoulli observations
From MaRDI portal
Abstract: We provide a justification for Bayesian updating in the Bernoulli case, even when the observations are assumed to be independent and identically distributed with a fixed but unknown parameter. The motivation relies on loss functions and asymptotics. Such a justification is important given the recent interest in Bayesian consistency, which indeed assumes that the observations are independent and identically distributed, rather than conditionally independent with a joint distribution depending on the choice of prior.
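In the Bernoulli case referred to in the abstract, "Bayesian updating" is the standard conjugate Beta-Bernoulli posterior update. The following is a minimal illustrative sketch of that update (not code from the paper itself; the function name and example data are assumptions for illustration):

```python
# Hedged sketch: the standard conjugate Beta-Bernoulli update.
# With a Beta(a, b) prior on the unknown Bernoulli parameter and
# i.i.d. observations x_1, ..., x_n in {0, 1}, the posterior is
# Beta(a + sum(x), b + n - sum(x)).

def beta_bernoulli_update(a, b, observations):
    """Return posterior Beta parameters after i.i.d. Bernoulli observations."""
    successes = sum(observations)
    failures = len(observations) - successes
    return a + successes, b + failures

# Example: uniform Beta(1, 1) prior, 7 successes in 10 trials.
a_post, b_post = beta_bernoulli_update(1, 1, [1, 1, 1, 0, 1, 0, 1, 1, 0, 1])
posterior_mean = a_post / (a_post + b_post)  # (1 + 7) / (2 + 10) = 2/3
```

As more observations arrive, this posterior concentrates around the true parameter value, which is the asymptotic behavior the consistency literature studies.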
Cites work
- scientific article (zbMATH DE number 3886512; title unavailable)
- scientific article (zbMATH DE number 5770669; title unavailable)
- scientific article (zbMATH DE number 578421; title unavailable)
- scientific article (zbMATH DE number 3252891; title unavailable)
- scientific article (zbMATH DE number 3322635; title unavailable)
- Converting information into probability measures with the Kullback-Leibler divergence
- On Information and Sufficiency
- Sufficient conditions for Bayesian consistency
Cited in (7)
- scientific article (zbMATH DE number 1058071; title unavailable)
- Converting information into probability measures with the Kullback-Leibler divergence
- On Bayesian learning via loss functions
- Bayesian learning for a class of priors with prescribed marginals
- On general Bayesian inference using loss functions
- scientific article (zbMATH DE number 5530066; title unavailable)
- Bayesian Representation of Stochastic Processes under Learning: de Finetti Revisited