An analysis of first-order logics of probability (Q757340)
From MaRDI portal
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | An analysis of first-order logics of probability | scientific article | |
Statements
An analysis of first-order logics of probability (English)
1990
This paper is divided into three main parts: the first concerns ``probabilities on a domain'' and corresponds to a frequency conception of probability; the second concerns probabilities of propositions, construed as sets of possible worlds, and corresponds to a subjective view of probability; and the third attempts to put these two views of probability together. The approach has much in common with that of \textit{F. Bacchus} [Representing and reasoning with probabilistic knowledge (MIT Press, 1990)].

Probabilities on a domain: These are given in a two-sorted logic, in which the sorts of objects are ordinary objects of the domain and objects of a field representing the values of probabilities. Models of this logic are triples \((D,\pi,\mu)\), where \(D\) is a domain, \(\pi\) a standard interpretation of the predicates and operations of the language, and \(\mu\) a measure on the individuals of the domain \(D\). The value of \(\mu(d_i)\) represents the probability of choosing the individual \(d_i\). Why not count the members of the domain uniformly? The author gives two reasons. First, there is no uniform, countably additive probability measure on a countably infinite domain. More important (since in real life we can get along with finite domains) is the argument that a two-stage probability, in which one chooses an urn and then a ball from that urn, cannot be represented by a measure that gives each ball equal weight.

While this is true, it should be remarked that there are other, perhaps simpler, ways of dealing with these problems than imposing a measure \(\mu\) on the domain of individuals (presumably a different one for each problem). We can achieve the same end with a uniform distribution over individuals, combined with conditionalization, as follows: take the individuals to be ordered pairs consisting of an urn and a ball. Take 2-black to characterize the ordered pairs in which the second object is black. The proportion of individuals belonging to 2-black is of course just the proportion of black balls. But if we pick an urn and then a ball from the urn, then we should conditionalize on that information: the individual (pair) we are talking about comes from a subset of our domain, namely \(\{\langle x,y\rangle :\) \(y\) is chosen from \(x\}\), and the set of 2-black objects in this domain is just what we expect. This approach has the double advantage of eliminating the need for \(\mu\) and of allowing a general, simultaneous treatment of many problems of this form. Furthermore, if we do not need to treat urns-and-balls in the way the author does, it is not clear what function a measure on the individuals of the domain serves. It seems somehow unnatural to talk of the probability of an individual, and one wonders whether one could not accomplish all one's ends by talk of a relativized measure: the measure of one set in another, as the author does at the end of his discussion of probabilities on a domain. To give a semantics for that measure, say in terms of ratios of cardinalities, one might want to suppose that the reference set is finite, but its exact cardinality would be irrelevant.

Leaving these points to one side, the characterization of probabilities on a domain is clear and valuable. In particular, even if we talk of relative cardinalities or measures rather than sums of probabilities of individuals, Lemma 2.3 will still be true: if \(\phi\) is a closed formula, its probability will be 0 or 1.
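The two-stage argument mentioned above is easy to check numerically. The following sketch (in Python, with hypothetical urn contents not taken from the paper) compares the two-stage probability of drawing a black ball, where an urn is chosen uniformly and then a ball uniformly from that urn, with the probability obtained by giving every ball equal weight; the two values differ, which is why the author allows a non-uniform measure \(\mu\) on the domain.

```python
from fractions import Fraction

# Two hypothetical urns (illustrative only): urn A holds one black ball,
# urn B holds one black and three white balls.
urns = {
    "A": {"black": 1, "white": 0},
    "B": {"black": 1, "white": 3},
}

# Two-stage probability of black: choose an urn uniformly at random,
# then a ball uniformly from the chosen urn.
two_stage = sum(
    Fraction(1, len(urns)) * Fraction(balls["black"], sum(balls.values()))
    for balls in urns.values()
)

# Probability of black under a measure that gives every ball equal weight:
# simply the overall proportion of black balls.
total_black = sum(balls["black"] for balls in urns.values())
total_balls = sum(sum(balls.values()) for balls in urns.values())
equal_weight = Fraction(total_black, total_balls)

print(two_stage)    # 5/8
print(equal_weight) # 2/5 -- a different value, so no equal-weight measure
                    # on the balls can represent the two-stage experiment
```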
Probabilities on possible worlds: Probabilities of closed formulas are just what we need for forming expectations and guiding choices; domain probabilities give us 0 or 1, and we often know not which. The author associates a closed formula with a set of possible worlds, and gives an interpretation that makes use of a discrete probability function over the set \(S\) of possible worlds. A type-2 probability structure has the form \((D,S,\pi,\mu)\), where \(D\) is a domain, \(S\) a set of possible worlds, \(\pi\) an interpretation function, and \(\mu\) a measure on \(S\). We then have a model for either subjective or logical probability.

The challenge, clearly, is to tie these two notions of probability together. Type-3 structures have the form \((D,S,\pi,\mu_D,\mu_S)\), in which we have a measure both on the domain and on the set of possible worlds. This section contains only one theorem (Theorem 4.5): if \(M\) is a type-3 structure such that all the predicate and function symbols in \(\phi\) are rigid except for the constant symbol \({\mathfrak a}\), then \[ M\;\vDash\;\bigl([w(\phi({\mathfrak a}))=w_x(\phi(x))]\;\equiv\;\forall r\,[w(\phi({\mathfrak a})\mid w_x(\phi(x))=r)=r]\bigr). \] The remainder of the paper concerns completeness and undecidability, and presents some axiom systems providing sound (but incomplete) axiomatizations of these structures. What is particularly noteworthy about the paper is that it provides semantics for various forms of probabilistic reasoning within an object language, in a clear and explicit way.
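To make the role of the two measures concrete, here is a small illustrative sketch (not the author's formal definitions; the domain, worlds, and measures are invented for the example) of how a type-3-like structure \((D,S,\pi,\mu_D,\mu_S)\) evaluates the two kinds of probability terms: the statistical term \(w_x(P(x))\), computed with \(\mu_D\) inside each world, and the degree-of-belief term \(w(P({\mathfrak a}))\), computed with \(\mu_S\) across worlds.

```python
from fractions import Fraction

# Hypothetical domain and measure mu_D on individuals (illustrative only).
domain = ["d1", "d2", "d3"]
mu_D = {"d1": Fraction(1, 2), "d2": Fraction(1, 4), "d3": Fraction(1, 4)}

# Two hypothetical possible worlds: each fixes the extension of a unary
# predicate P and the (non-rigid) denotation of the constant a.
worlds = [
    {"P": {"d1", "d2"}, "a": "d1"},
    {"P": {"d1"},       "a": "d3"},
]
mu_S = [Fraction(2, 3), Fraction(1, 3)]  # measure on the set of worlds

def w_x_P(world):
    # Statistical term w_x(P(x)): the mu_D-weight of the individuals
    # satisfying P in the given world.
    return sum(mu_D[d] for d in domain if d in world["P"])

def w_P_a():
    # Degree-of-belief term w(P(a)): the mu_S-weight of the worlds in
    # which the individual denoted by a satisfies P.
    return sum(m for world, m in zip(worlds, mu_S) if world["a"] in world["P"])

print([w_x_P(w) for w in worlds])  # [Fraction(3, 4), Fraction(1, 2)]: varies by world
print(w_P_a())                     # 2/3
```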
semantics for probabilistic reasoning within an object language
probability logic
probability models
probabilities on a domain
frequency conception of probability
probabilities of propositions
possible worlds
subjective view of probability
two-sorted logic
type 2-probability structure
type-3 structure
completeness
undecidability
axiomatizations