Information-theoretic asymptotics of Bayes methods
Publication: 3492635
DOI: 10.1109/18.54897
zbMath: 0709.62008
OpenAlex: W2160570986
MaRDI QID: Q3492635
Andrew R. Barron, Bertrand S. Clarke
Publication date: 1990
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://semanticscholar.org/paper/52af2a10ae1eb70222cd5d0a58d02594eaf9e311
Keywords: approximation; Fisher information matrix; density estimation; Bayesian distributions; prior density; composite hypotheses testing; universal data compression; relative entropy distance; stock-market portfolio selection
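Summary of the main result (a brief sketch, stated under the paper's smoothness and positivity assumptions on the model and prior): for an i.i.d. sample of size $n$ from a $d$-dimensional parametric family with Fisher information matrix $I(\theta)$ and prior density $w$ continuous and positive at $\theta$, the relative entropy between the joint distribution $P_\theta^n$ and the Bayes mixture $M_n$ expands as

$$D\bigl(P_\theta^n \,\|\, M_n\bigr) = \frac{d}{2}\log\frac{n}{2\pi e} + \frac{1}{2}\log\det I(\theta) + \log\frac{1}{w(\theta)} + o(1) \qquad (n \to \infty).$$

This expansion connects the keywords above: it quantifies the expected redundancy of Bayes codes in universal data compression and underlies the paper's applications to composite hypotheses testing and stock-market portfolio selection.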
Related Items (79)
Discussion of ‘Prior-based Bayesian Information Criterion (PBIC)’ ⋮ Optimal Short-Term Population Coding: When Fisher Information Fails ⋮ The adaptive normal-hypergeometric-inverted-beta priors for sparse signals ⋮ A Note on the Minimax Solution for the Two-Stage Group Testing Problem ⋮ Partial information reference priors: Derivation and interpretations ⋮ Jeffreys' prior is asymptotically least favorable under entropy risk ⋮ Information optimality and Bayesian modelling ⋮ Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory ⋮ Counting probability distributions: Differential geometry and model selection ⋮ On universal prediction and Bayesian confirmation ⋮ Universal coding for classical-quantum channel ⋮ Information tradeoff ⋮ Metabolic cost of neuronal information in an empirical stimulus-response model ⋮ Moment matching priors ⋮ On-line maximum likelihood prediction with respect to general loss functions ⋮ Normality of Posterior Distribution Under Misspecification and Nonsmoothness, and Bayes Factor for Davies' Problem ⋮ Statistical Problem Classes and Their Links to Information Theory ⋮ Schwarz, Wallace, and Rissanen: Intertwining Themes in Theories of Model Selection ⋮ Competitive On-line Statistics ⋮ Accuracy of latent-variable estimation in Bayesian semi-supervised learning ⋮ Asymptotic accuracy of Bayesian estimation for a single latent variable ⋮ Mutual information, metric entropy and cumulative relative entropy risk ⋮ A philosophical treatise of universal induction ⋮ Coincidences and estimation of entropies of random variables with large cardinalities ⋮ A Bayesian approach to (online) transfer learning: theory and algorithms ⋮ Applications of Laplace’s method in Bayesian analysis and related topics ⋮ Reference optimality criterion for planning accelerated life testing ⋮ Reproducible model selection using bagged posteriors ⋮ On divergence measures leading to Jeffreys and other reference priors ⋮ Locally adaptive Bayesian isotonic regression using half shrinkage priors ⋮ A general divergence criterion for prior selection ⋮ A Bernstein-von Mises theorem for discrete probability distributions ⋮ Reference priors for exponential families with increasing dimension ⋮ When does ambiguity fade away? ⋮ Objective priors: an introduction for frequentists ⋮ Discussion on ‘Objective priors: an introduction for frequentists’ by M. Ghosh ⋮ Ensuring privacy with constrained additive noise by minimizing Fisher information ⋮ Mutual Information, Fisher Information, and Efficient Coding ⋮ Shannon optimal priors on independent identically distributed statistical experiments converge weakly to Jeffrey's prior ⋮ A note on the confidence properties of reference priors for the calibration model ⋮ Comparative noninformativities of quantum priors based on monotone metrics ⋮ Bernstein-von Mises theorems for Gaussian regression with increasing number of regressors ⋮ Catching up Faster by Switching Sooner: A Predictive Approach to Adaptive Estimation with an Application to the AIC–BIC Dilemma ⋮ Fluctuation-Dissipation Theorem and Models of Learning ⋮ Difficulty of Singularity in Population Coding ⋮ Stochastic complexity for mixture of exponential families in generalized variational Bayes ⋮ Asymptotic property of universal lossless coding for independent piecewise identically distributed sources ⋮ Asymptotical improvement of maximum likelihood estimators on Kullback-Leibler loss ⋮ Entropy bounds on Bayesian learning ⋮ Consistency of discrete Bayesian learning ⋮ Information-Theoretic Bounds and Approximations in Neural Population Coding ⋮ Decision theoretic generalizations of the PAC model for neural net and other learning applications ⋮ Quantifying Neurotransmission Reliability Through Metrics-Based Information Analysis ⋮ Asymptotic Theory of Information-Theoretic Experimental Design ⋮ Jeffreys versus Shtarkov distributions associated with some natural exponential families ⋮ Discrepancy risk model selection test theory for comparing possibly misspecified or nonnested models ⋮ Nonsubjective priors via predictive relative entropy regret ⋮ Universal approximation of multi-copy states and universal quantum lossless data compression ⋮ Predictability, Complexity, and Learning ⋮ An empirical study of minimum description length model selection with infinite parametric complexity ⋮ Flexible covariance estimation in graphical Gaussian models ⋮ Bayesian analysis of static and dynamic factor models: an ex-post approach towards the rotation problem ⋮ Model-Based Decoding, Information Estimation, and Change-Point Detection Techniques for Multineuron Spike Trains ⋮ A fluctuation theory of communications ⋮ Minimum message length inference of the Poisson and geometric models using heavy-tailed prior distributions ⋮ Effects of additional data on Bayesian clustering ⋮ The Newsvendor under Demand Ambiguity: Combining Data with Moment and Tail Information ⋮ Limit Theorems for φ-Divergences Based on k-Spacings ⋮ A minimum description length approach to hidden Markov models with Poisson and Gaussian emissions. Application to order identification ⋮ Post-processing for Bayesian analysis of reduced rank regression models with orthonormality restrictions ⋮ Special feature: Information theory and statistics ⋮ Price probabilities: a class of Bayesian and non-Bayesian prediction rules ⋮ Market selection in large economies: A matter of luck ⋮ Information-theoretic determination of minimax rates of convergence ⋮ Combining different procedures for adaptive regression ⋮ Reference priors for prediction ⋮ Statistical Inference, Occam's Razor, and Statistical Mechanics on the Space of Probability Distributions ⋮ Mixing strategies for density estimation. ⋮ Quantum and Fisher information from the Husimi and related distributions