The epic story of maximum likelihood

From MaRDI portal
Publication: Q449800

DOI: 10.1214/07-STS249
zbMATH Open: 1246.01016
arXiv: 0804.2996
OpenAlex: W2056635176
MaRDI QID: Q449800


Author: Stephen M. Stigler


Publication date: 1 September 2012

Published in: Statistical Science

Abstract: At a superficial level, the idea of maximum likelihood must be prehistoric: early hunters and gatherers may not have used the words "method of maximum likelihood" to describe their choice of where and how to hunt and gather, but it is hard to believe they would have been surprised if their method had been described in those terms. It seems a simple, even unassailable idea: Who would rise to argue in favor of a method of minimum likelihood, or even mediocre likelihood? And yet the mathematical history of the topic shows this "simple idea" is really anything but simple. Joseph Louis Lagrange, Daniel Bernoulli, Leonhard Euler, Pierre Simon Laplace and Carl Friedrich Gauss are only some of those who explored the topic, not always in ways we would sanction today. In this article, that history is reviewed from well before Fisher to the time of Lucien Le Cam's dissertation. In the process Fisher's unpublished 1930 characterization of conditions for the consistency and efficiency of maximum likelihood estimates is presented, and the mathematical basis of his three proofs discussed. In particular, Fisher's derivation of the information inequality is seen to grow out of his work on the analysis of variance, and his later approach via estimating functions is traced to Euler's Relation for homogeneous functions. The reaction to Fisher's work is reviewed, and some lessons drawn.
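For orientation, the two technical results named in the abstract can be stated in modern notation. This is a sketch in standard textbook form, not taken from the paper itself:

```latex
% Information inequality (Cramér–Rao bound): for an unbiased estimator
% \hat\theta of \theta, with log-likelihood \ell(\theta) of the data,
\operatorname{Var}(\hat\theta) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}\!\left[\left(\frac{\partial \ell(\theta)}{\partial \theta}\right)^{\!2}\right].

% Euler's Relation: if f is homogeneous of degree k, i.e.
% f(t x_1, \dots, t x_n) = t^k f(x_1, \dots, x_n) for all t > 0, then
\sum_{i=1}^{n} x_i \,\frac{\partial f}{\partial x_i}
\;=\; k \, f(x_1, \dots, x_n).
```

The bound says no unbiased estimator can beat the reciprocal of the Fisher information; the relation is the identity Fisher's estimating-function approach is traced to in the article.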


Full work available at URL: https://arxiv.org/abs/0804.2996










Cited In (26)





