Objective Bayesianism and the maximum entropy principle (Q280547)
From MaRDI portal
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | Objective Bayesianism and the maximum entropy principle | scientific article |
Statements
Objective Bayesianism and the maximum entropy principle (English)
10 May 2016
Summary: Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities; they should be calibrated to our evidence of physical probabilities; and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be the probability function that, among all those calibrated to the evidence, has maximum entropy. However, the three norms of objective Bayesianism are usually justified in different ways. In this paper, we show that the three norms can all be subsumed under a single justification in terms of minimising worst-case expected loss. This, in turn, is equivalent to maximising a generalised notion of entropy. We suggest that requiring language invariance, in addition to minimising worst-case expected loss, motivates maximisation of standard entropy as opposed to other instances of generalised entropy. Our argument also provides a qualified justification for updating degrees of belief by Bayesian conditionalisation. However, conditional probabilities play a less central part in the objective Bayesian account than they do under the subjective view of Bayesianism, leading to a reduced role for Bayes' theorem.
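The maximum entropy principle described in the summary — among all probability functions calibrated to the evidence, adopt the one with maximum (Shannon) entropy — can be illustrated numerically. The following is a minimal sketch, not code from the paper: it uses `scipy.optimize.minimize` to maximise entropy over a finite outcome space subject to linear evidence constraints, with function and variable names chosen for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def max_entropy_dist(n, constraints=()):
    """Among probability vectors p of length n satisfying the given
    equality constraints, return the one maximising Shannon entropy
    H(p) = -sum_i p_i log p_i (a finite-case sketch of the maxent
    principle; `constraints` is a sequence of functions that must
    evaluate to 0 at a calibrated p)."""
    def neg_entropy(p):
        # Clip away from 0 so p*log(p) is numerically well defined.
        p = np.clip(p, 1e-12, 1.0)
        return np.sum(p * np.log(p))

    # Normalisation constraint plus any evidence constraints.
    cons = [{"type": "eq", "fun": lambda p: p.sum() - 1.0}]
    cons += [{"type": "eq", "fun": c} for c in constraints]

    p0 = np.full(n, 1.0 / n)  # start from the equivocal distribution
    res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * n,
                   constraints=cons)
    return res.x

# With no evidence, maxent fully equivocates: the uniform distribution.
p = max_entropy_dist(4)

# Calibration to evidence, e.g. a known expectation of 2.0 over the
# outcomes 0..3, yields the least committal distribution matching it.
vals = np.arange(4)
q = max_entropy_dist(4, constraints=[lambda p: p @ vals - 2.0])
```

With no constraints the optimiser stays at the uniform distribution, matching the equivocation norm; adding the expectation constraint tilts the distribution just enough to satisfy the evidence while remaining as close to uniform as entropy allows.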
objective Bayesianism
\(g\)-entropy
generalised entropy
Bayesian conditionalisation
scoring rule
maximum entropy
maxent
minimax