Probability, statistical optics, and data testing. A problem solving approach. (Q5942574)

scientific article; zbMATH DE number 1639449

Language: English
Label: Probability, statistical optics, and data testing. A problem solving approach.
Description: scientific article; zbMATH DE number 1639449

    Statements

    Probability, statistical optics, and data testing. A problem solving approach. (English)
    30 August 2001
    Scientists in optics are increasingly confronted with problems that are of a random nature and that require a working knowledge of probability and statistics for their solution. This textbook develops these subjects within the context of optics using a problem-solving approach. The overall aim of this edition continues to be that of teaching the fundamental methods of probability and statistics. The methods are developed from first principles, and the student is motivated by solving interesting problems in optics, engineering and physics. The problems range from the simple, such as computing the probability of winning a state lottery jackpot or of intelligent life in the universe, to the more complex, such as modelling the bull and bear behavior of a stock market.

    A new central limit theorem of optics is developed. It predicts that the sum of the position coordinates of the photons in a diffraction point spread function (PSF) follows a Cauchy probability law. Also, the output PSF of a relay system using multiply cascaded lenses obeys the Cauchy law. The usual central limit theorem, of course, predicts a Gaussian law, but certain of its premises are violated in incoherent diffraction imagery.

    Other specifically optical topics that are newly treated are the Mandel formula of photoelectron theory and the concept of coarse graining. The topic of maximum probable estimates of optical objects is updated to further clarify the scope of applications of the maximum entropy approach. The chapter on Monte Carlo calculations is extended to include methods of generating jointly fluctuating random variables. Methods of artificially generating photon-depleted, two-dimensional images are given.

    The treatment of functions of random variables is extended to include functions of combinations of random variables, such as quotients or products. For example, it is found that the quotient of two independent Gaussian random variables obeys a Cauchy law. A further application gives the amount by which a probability law is distorted when its event space is viewed from a relativistically moving frame.

    Fractal processes are included in the chapter on stochastic processes, covering the concepts of Hausdorff dimension and self-similarity. The ideas of connectivity by association and Erdős numbers are also briefly treated. The subject of parameter estimation is broadened in scope to include the Bhattacharyya bound, receiver operating characteristics and the problem of estimating multiple parameters.

    It is shown that systems of differential equations, such as those of Lotka-Volterra type, are amenable to probabilistic solutions that complement the usual analytical ones. The approach permits classical trajectories to be assigned to quantum mechanical particles. The trajectories are not those of the physicist D. Bohm, since they are constructed in an entirely different manner.

    The \textit{Heisenberg uncertainty principle} is independently derived from two differing viewpoints: the conventional viewpoint that the widths of a function and its Fourier transform cannot both be arbitrarily small; and a measurement viewpoint which states that the mean-square error in the estimation of a parameter and the information content of the data cannot both be arbitrarily small. Interestingly, the information content is that of Fisher, and not Shannon.
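    One concrete claim above, that the quotient of two independent Gaussian random variables obeys a Cauchy law, is easy to verify numerically. The following Python sketch (an illustration only, not taken from the book; it uses numpy and scipy) draws the quotient by Monte Carlo and checks it against the standard Cauchy distribution, whose quartiles sit at \(\pm 1\):

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Draw two independent standard Gaussian samples and form their quotient.
x = rng.standard_normal(100_000)
y = rng.standard_normal(100_000)
z = x / y

# The standard Cauchy law has quartiles at -1 and +1, so the sample
# quartiles of z should sit near those values.
q1, q3 = np.quantile(z, [0.25, 0.75])
print(f"sample quartiles: {q1:.3f}, {q3:.3f}  (Cauchy(0,1): -1, +1)")

# A Kolmogorov-Smirnov test against Cauchy(0,1) should typically not reject.
ks = stats.kstest(z, stats.cauchy(loc=0, scale=1).cdf)
print(f"KS statistic = {ks.statistic:.4f}, p-value = {ks.pvalue:.3f}")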
    The uncertainty principle is so fundamental to physics that its origin in Fisher information prompts one to investigate whether physics in general is based upon the concepts of measurement and Fisher information. The surprising answer is ``yes''. An alternative statement of the uncertainty principle is \textit{Hirschman's inequality}, which uses the concept of entropy instead of variances of error. It is shown that the entropy of the data and that of its Fourier space cannot both be arbitrarily small.

    The general use of invariance principles for finding unknown probability laws is extensively discussed and applied. A simple example shows that, based upon invariance to a change of units, the universal physical constants should obey a reciprocal probability law. This hypothesis is tested by a chi-square test, as an example of the use of this test on given data.

    A section is added on the diverse measures of information that are being used to advantage in the physical and biological sciences and in engineering. Examples are the information measures of Shannon, Rényi, Fisher, Kullback-Leibler, Hellinger, Tsallis, and Gini-Simpson.

    The fundamental role played by the Fisher information \(I\), in particular in deriving the Heisenberg uncertainty principle, justifies the further study of \(I\) for its mathematical and physical properties. A calculation of \(I\) for the case of correlated additive noise shows the possibility of perfect processing of data. Also, \(I\) is found to be a measure of the level of disorder of a system. It obeys a property of additivity, and a monotonic decrease with time, \(dI/dt\leq 0\), under a wide range of conditions. The latter is a restatement of the second law of thermodynamics with \(I\) replacing the usual entropy term. These properties imply that \(I\) may be used to develop thermodynamics from a non-entropic point of view that emphasizes measurement and estimation error in place of heat. A novel concept arising from this point of view is a Fisher temperature of estimation error, in place of the usual Kelvin temperature of heat.

    A remaining property of Fisher information is its invariance to unitary transformations of whatever type (e.g. Fourier transformation). This is recognized to be a universal invariance principle obeyed by all physically valid probability laws. The universal nature of this invariance principle allows it to be used as a key element in a new knowledge-based procedure for finding unknown probability laws, called ``extreme physical information'' (EPI). The EPI approach is developed out of a model of measurement that incorporates the observer into the observed phenomenon. The observer collects information about the aim of the measurement of an unknown parameter. The information is uniquely Fisher information, and an analysis of its flow from the observed phenomenon to the observer gives the EPI principle. Mathematically this amounts to a Lagrangian problem whose output is the required probability law. Zeroing the Lagrangian likewise yields a probability law, and both the extremum and zero solutions have physical significance. EPI is applied to various measurement problems, some via guided exercises, and is shown to give rise to many of the known physical effects that define probability laws: the Schrödinger wave equation, the Dirac equation, the Maxwell-Boltzmann distribution. An aspect of EPI that is an outgrowth of its thermodynamic roots is its game aspect.
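    The mechanics of the chi-square test mentioned above can be illustrated with a short Python sketch. The data here are synthetic (drawn from the reciprocal, i.e. log-uniform, law itself, on an assumed range \([1, 10)\)), so the example shows how such a goodness-of-fit test is run rather than reproducing the book's actual analysis of the physical constants:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Stand-in data: 500 values drawn from the reciprocal (log-uniform)
# law on [1, 10), playing the role of the constants' mantissas.
a, b = 1.0, 10.0
sample = stats.loguniform(a, b).rvs(size=500, random_state=rng)

# Bin the sample and compute the counts the reciprocal law predicts:
# P(e_i <= X < e_{i+1}) = ln(e_{i+1}/e_i) / ln(b/a).
edges = np.linspace(a, b, 11)
observed, _ = np.histogram(sample, bins=edges)
probs = np.log(edges[1:] / edges[:-1]) / np.log(b / a)
expected = probs * sample.size

# Pearson chi-square goodness-of-fit test; a large p-value means the
# reciprocal-law hypothesis is not rejected by these data.
chi2, p = stats.chisquare(observed, expected)
print(f"chi-square = {chi2:.2f}, p-value = {p:.3f}")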
    The EPI mathematical procedure is equivalent to the play of a zero-sum game, with information as the prize. The players are the observer and ``nature'', the latter represented by the observed phenomenon. Both players choose optimal strategies, and the payoff of the game is the unknown probability law. Since \(dI/dt\leq 0\), the observer always loses the game; however, he gains perfect knowledge of the phenomenon's probability law. As an example, the game aspect is discussed as giving rise to the Higgs mass phenomenon. Here the information prize is the acquisition of mass by one of two reactant particles at the expense of the field energy of the other. This is thought to be the way mass originates in the universe.

    The book is exceptionally rich in content. The author generously shares his reflections and assumptions with the reader and poses as-yet-unsolved problems, which makes this book an especially interesting one.
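    For reference, the information quantities behind this game-theoretic reading can be stated compactly. The sketch below uses the standard definitions from the Fisher-information/EPI literature; the symbol \(J\) for the ``bound'' information and the efficiency constant \(\kappa\) are conventions of that literature, assumed here rather than taken from the review itself:

% Fisher information of a probability law p(x):
\[
  I[p] \;=\; \int \frac{\bigl(p'(x)\bigr)^{2}}{p(x)}\,dx .
\]
% Cramer-Rao inequality: the mean-square estimation error e^2 and the
% information I cannot both be arbitrarily small (the "measurement"
% route to the uncertainty principle described above):
\[
  e^{2}\, I \;\ge\; 1 .
\]
% EPI payoff: the physical information K is extremized, and a second,
% "zero" condition also yields valid laws (kappa is the information
% transfer efficiency; an assumption of this sketch):
\[
  K \;=\; I - J \;\rightarrow\; \text{extremum},
  \qquad I - \kappa J \;=\; 0,\quad 0 < \kappa \le 1 .
\]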
    axiomatic approach
    Fourier methods in probability
    Bernoulli trials
    central limit theorem of optics
    Monte Carlo calculations
    stochastic processes
    statistical optics
    parametric and nonparametric estimations

    Identifiers

    zbMATH DE number 1639449