A history of parametric statistical inference from Bernoulli to Fisher, 1713--1935 (Q860330)

scientific article

    Statements

    title: A history of parametric statistical inference from Bernoulli to Fisher, 1713--1935 (English)
    publication date: 9 January 2007
    Hald directs his readers ``for more proofs, references and information on related topics'' to his previous books [History of probability and statistics and their applications before 1750. Chichester etc.: John Wiley \& Sons Ltd. (1990; Zbl 0731.01001); History of mathematical statistics from 1750 to 1930. New York, NY: John Wiley \& Sons. (1998; Zbl 0979.01012)] and tells us that he borrowed about 50 pages from the second one. It is difficult to say what is essentially new, but at least it is now possible to see at a glance what was contained in a given memoir of Laplace (say). As always, Hald's exposition is on a high level, and I doubt that it will be ``easy'' reading for those who have only attended an ``elementary course in probability and statistics''. He concentrates on three ``revolutions'' in parametric statistical inference: Laplace's early memoirs; Laplace and Gauss, 1809--1828; and Fisher, 1912--1956 (note the closing date 1935 in the title!).

    I take issue with many points. Jakob Bernoulli's classic did not become a ``great inspiration'' for statisticians (p. 14) until the turn of the 19th century. Was the cosine error distribution (p. 2) really one of the ``most important''? Introduced by Lagrange, it was hardly ever applied. The statement (p. 4) that in 1799 the ``problem of the arithmetic mean'' was still unsolved ought to be softened by mentioning the relevant studies by Simpson and Lagrange. The integral of the exponential function of the negative square between infinite limits was first calculated by Euler rather than Laplace (pp. 38, 58). Legendre's memoir was neither clear nor concise (p. 53); he all but stated that the method of least squares (MLSq) provided the least interval of the possible errors, and he mentioned errors instead of residuals. In 1818 Bessel had indeed stated that observational errors were almost normal (pp. 58, 98), but in 1838 he dropped his reservation and provided a patently wrong explanation of the deviation from normality; indeed, he developed a happy-go-lucky trait, see my note [``Bessel: some remarks on his work''. Hist. Sci. (2) 10, 77--83 (2000)]. That Gauss, in 1809, had applied inverse probability (pp. 57, 58) is true, but Whittaker and Robinson (1924) noted that this was already implied by the postulate of the mean. The two differing reasons offered for Gauss's abandonment of his first justification of the MLSq (pp. 56 and 101) are both wrong. Much is reasonably said about Laplace's application of the central limit theorem, but the non-rigour of its proof is passed over in silence.

    The Bibliography does not mention the Collected Works of Edgeworth (1996), or the reprints of Poisson (1837), Todhunter (1865), or of K. Pearson's ``Grammar of Science'' after 1911 (JFM 42.0072.01). Missing are Montmort (1713) (although referred to!), Gauss's collected German contributions on the MLSq, and Cramér (1946), as well as the Dictionary of Scientific Biography, the Encyclopedia of Statistical Sciences, and Yu. V. Prokhorov (ed.), Veroyatnost' i Matematicheskaya Statistika. Entsiklopediya (Probability and Mathematical Statistics. An Encyclopedia). Moscow (1999). The books of Porter (1986) and Maistrov (1974) are included, but the reviewer's ``Theory of Probability. A Historical Essay'' [Berlin (2005); also at www.sheynin.de] is not.
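    (For clarity, the integral mentioned above in connection with Euler and Laplace is, in modern notation, the probability integral
    \[
    \int_{-\infty}^{\infty} e^{-x^{2}}\,dx = \sqrt{\pi},
    \]
    the normalizing constant underlying the normal error distribution discussed throughout the book.)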
    Keywords: inverse probability; principle of maximum likelihood