Information-theoretic asymptotics of Bayes methods

From MaRDI portal

Publication:3492635

DOI: 10.1109/18.54897
zbMath: 0709.62008
OpenAlex: W2160570986
MaRDI QID: Q3492635

Andrew R. Barron, Bertrand S. Clarke

Publication date: 1990

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://semanticscholar.org/paper/52af2a10ae1eb70222cd5d0a58d02594eaf9e311

Related Items (82)

Discussion of ‘Prior-based Bayesian Information Criterion (PBIC)’
Optimal Short-Term Population Coding: When Fisher Information Fails
The adaptive normal-hypergeometric-inverted-beta priors for sparse signals
A Note on the Minimax Solution for the Two-Stage Group Testing Problem
Partial information reference priors: Derivation and interpretations
Jeffreys' prior is asymptotically least favorable under entropy risk
Information optimality and Bayesian modelling
Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
Counting probability distributions: Differential geometry and model selection
On universal prediction and Bayesian confirmation
Universal coding for classical-quantum channel
Information tradeoff
Metabolic cost of neuronal information in an empirical stimulus-response model
Moment matching priors
On-line maximum likelihood prediction with respect to general loss functions
Normality of Posterior Distribution Under Misspecification and Nonsmoothness, and Bayes Factor for Davies' Problem
Statistical Problem Classes and Their Links to Information Theory
Schwarz, Wallace, and Rissanen: Intertwining Themes in Theories of Model Selection
Competitive On-line Statistics
Accuracy of latent-variable estimation in Bayesian semi-supervised learning
Asymptotic accuracy of Bayesian estimation for a single latent variable
Mutual information, metric entropy and cumulative relative entropy risk
A philosophical treatise of universal induction
Coincidences and estimation of entropies of random variables with large cardinalities
A Bayesian approach to (online) transfer learning: theory and algorithms
Applications of Laplace’s method in Bayesian analysis and related topics
Reference optimality criterion for planning accelerated life testing
Reproducible model selection using bagged posteriors
On divergence measures leading to Jeffreys and other reference priors
Locally adaptive Bayesian isotonic regression using half shrinkage priors
A general divergence criterion for prior selection
A Bernstein-von Mises theorem for discrete probability distributions
Reference priors for exponential families with increasing dimension
When does ambiguity fade away?
Objective priors: an introduction for frequentists
Discussion on “Objective priors: an introduction for frequentists” by M. Ghosh
Ensuring privacy with constrained additive noise by minimizing Fisher information
Mutual Information, Fisher Information, and Efficient Coding
Shannon optimal priors on independent identically distributed statistical experiments converge weakly to Jeffrey's prior
A note on the confidence properties of reference priors for the calibration model
Comparative noninformativities of quantum priors based on monotone metrics
Bernstein-von Mises theorems for Gaussian regression with increasing number of regressors
Catching up Faster by Switching Sooner: A Predictive Approach to Adaptive Estimation with an Application to the AIC–BIC Dilemma
Fluctuation-Dissipation Theorem and Models of Learning
Difficulty of Singularity in Population Coding
Stochastic complexity for mixture of exponential families in generalized variational Bayes
Asymptotic property of universal lossless coding for independent piecewise identically distributed sources
Asymptotical improvement of maximum likelihood estimators on Kullback-Leibler loss
Entropy bounds on Bayesian learning
Consistency of discrete Bayesian learning
Information-Theoretic Bounds and Approximations in Neural Population Coding
Unnamed Item
Decision theoretic generalizations of the PAC model for neural net and other learning applications
Quantifying Neurotransmission Reliability Through Metrics-Based Information Analysis
Asymptotic Theory of Information-Theoretic Experimental Design
Jeffreys versus Shtarkov distributions associated with some natural exponential families
Discrepancy risk model selection test theory for comparing possibly misspecified or nonnested models
Nonsubjective priors via predictive relative entropy regret
Universal approximation of multi-copy states and universal quantum lossless data compression
Predictability, Complexity, and Learning
An empirical study of minimum description length model selection with infinite parametric complexity
Flexible covariance estimation in graphical Gaussian models
Bayesian analysis of static and dynamic factor models: an ex-post approach towards the rotation problem
Model-Based Decoding, Information Estimation, and Change-Point Detection Techniques for Multineuron Spike Trains
A fluctuation theory of communications
Minimum message length inference of the Poisson and geometric models using heavy-tailed prior distributions
Effects of additional data on Bayesian clustering
The Newsvendor under Demand Ambiguity: Combining Data with Moment and Tail Information
Limit Theorems for φ-Divergences Based on k-Spacings
A minimum description length approach to hidden Markov models with Poisson and Gaussian emissions. Application to order identification
Post-processing for Bayesian analysis of reduced rank regression models with orthonormality restrictions
Special feature: Information theory and statistics
Price probabilities: a class of Bayesian and non-Bayesian prediction rules
Market selection in large economies: A matter of luck
Unnamed Item
Information-theoretic determination of minimax rates of convergence
Combining different procedures for adaptive regression
Reference priors for prediction
Statistical Inference, Occam's Razor, and Statistical Mechanics on the Space of Probability Distributions
Mixing strategies for density estimation
Quantum and Fisher information from the Husimi and related distributions
Unnamed Item
This page was built for publication: Information-theoretic asymptotics of Bayes methods