Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
DOI: 10.1214/009053604000000553
zbMath: 1048.62008
arXiv: math/0410076
OpenAlex: W1590693676
Wikidata: Q59631784
Scholia: Q59631784
MaRDI QID: Q1879959
Publication date: 15 September 2004
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/math/0410076
Keywords: minimax, maximin, relative entropy, mutual information, convexity, duality, saddle-point, exponential family, Kullback-Leibler divergence, specific entropy, Bregman divergence, scoring rule, additive model, uncertainty function, Brier score, redundancy-capacity theorem, Pythagorean property, logarithmic score, zero-one loss, Gamma-minimax, Bayes act, equalizer rule, generalized exponential family, mean-value constraints
MSC: Minimax procedures in statistical decision theory (62C20); Applications of game theory (91A80); Other game-theoretic models (91A40); Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10)
Related Items (84)
Cites Work
- Can the maximum entropy principle be explained as a consistency requirement?
- The constraint rule of the maximum entropy principle
- Diversity and dissimilarity coefficients: A unified approach
- Statistical decision theory and Bayesian analysis. 2nd ed
- Coding of a source with unknown but ordered probabilities
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
- Papers on probability, statistics and statistical physics. Ed. by R. D. Rosenkrantz.
- I-divergence geometry of probability distributions and minimization problems
- Jeffreys' prior is asymptotically least favorable under entropy risk
- Basic concepts, identities and inequalities -- the toolkit of information theory
- Maximum entropy fundamentals
- Coherent dispersion criteria for optimal experimental design
- An algorithm for calculating \(\Gamma\)-minimax decision rules under generalized moment conditions.
- Shannon optimal priors on independent identically distributed statistical experiments converge weakly to Jeffreys' prior
- On a Measure of the Information Provided by an Experiment
- Information Theory and Statistical Mechanics
- General entropy criteria for inverse problems, with applications to data compression, pattern classification, and cluster analysis
- Information-theoretic asymptotics of Bayes methods
- A source matching approach to finding minimax codes
- Random coding strategies for minimum entropy
- A general minimax result for relative entropy
- Asymptotic minimax regret for data compression, gambling, and prediction
- Maximum entropy versus minimum risk and applications to some classical discrete distributions
- A strong version of the redundancy-capacity theorem of universal coding
- Maximum Entropy Reconstruction Using Derivative Information, Part 1: Fisher Information and Convex Duality
- Uncertainty, Information, and Sequential Experiments
- Convex Analysis
- Γ-Minimax: A Paradigm for Conservative Robust Bayesians
- Relative loss bounds for on-line density estimation with the exponential family of distributions