Information optimality and Bayesian modelling
From MaRDI portal
Cites work
- scientific article; zbMATH DE number 3872359 (no title available)
- scientific article; zbMATH DE number 4095371 (no title available)
- scientific article; zbMATH DE number 3667770 (no title available)
- scientific article; zbMATH DE number 107482 (no title available)
- scientific article; zbMATH DE number 1247156 (no title available)
- scientific article; zbMATH DE number 578421 (no title available)
- scientific article; zbMATH DE number 1396169 (no title available)
- A minimally informative likelihood for decision analysis: Illustration and robustness
- An algorithm for computing the capacity of arbitrary discrete memoryless channels
- An information criterion for likelihood selection
- Asymptotic Inference for Mixture Models by Using Data-Dependent Priors
- Asymptotic minimax regret for data compression, gambling, and prediction
- Bayes' Method for Bookies
- Capturing the Intangible Concept of Information
- Computation of channel capacity and rate-distortion functions
- Density estimation by stochastic complexity
- Estimating a Product of Means: Bayesian Analysis with Reference Priors
- Fisher information and stochastic complexity
- I-divergence geometry of probability distributions and minimization problems
- Information Distinguishability with Application to Analysis of Failure Data
- Information-theoretic asymptotics of Bayes methods
- Jeffreys' prior is asymptotically least favorable under entropy risk
- Minimax redundancy for the class of memoryless sources
- Minimum complexity density estimation
- Model Selection Using the Minimum Description Length Principle
- Model Selection and the Principle of Minimum Description Length
- On a Measure of the Information Provided by an Experiment
- Partial information reference priors: Derivation and interpretations
- Reference priors with partial information
- The minimum description length principle in coding and modeling
- Universal coding, information, prediction, and estimation
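Two of the works cited above (Arimoto's "An algorithm for computing the capacity of arbitrary discrete memoryless channels" and Blahut's "Computation of channel capacity and rate-distortion functions") introduce the classical alternating-maximization scheme for channel capacity. A minimal plain-Python sketch of that Blahut-Arimoto iteration, for illustration only (the function name and fixed iteration count are assumptions, not from the cited papers):

```python
import math

def blahut_arimoto(W, iters=200):
    """Approximate the capacity (in bits) of a discrete memoryless channel.

    W is a row-stochastic matrix with W[x][y] = P(Y = y | X = x).
    Returns (capacity, optimal input distribution). Illustrative sketch,
    not a production implementation.
    """
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx  # start from the uniform input distribution
    for _ in range(iters):
        # Output marginal q(y) = sum_x p(x) W(y|x)
        qy = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # Update: p(x) proportional to exp( sum_y W(y|x) log q(x|y) ),
        # where q(x|y) = p(x) W(y|x) / q(y) is the backward channel
        new_p = []
        for x in range(nx):
            s = sum(W[x][y] * math.log(p[x] * W[x][y] / qy[y])
                    for y in range(ny) if W[x][y] > 0)
            new_p.append(math.exp(s))
        z = sum(new_p)
        p = [v / z for v in new_p]
    # Mutual information I(X;Y) at the final input distribution, in bits
    qy = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    cap = sum(p[x] * W[x][y] * math.log2(W[x][y] / qy[y])
              for x in range(nx) for y in range(ny) if W[x][y] > 0)
    return cap, p
```

For a binary symmetric channel with crossover probability 0.1, the iteration converges to the uniform input law and a capacity of 1 − H(0.1) ≈ 0.531 bits.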
Cited in (16)
- Interpreting uninterpretable predictors: kernel methods, Shtarkov solutions, and random forests
- Information and the dispersion of posterior expectations
- Role of information in classical and Bayesian modelling
- Using the Bayesian Shtarkov solution for predictions
- Asymptotically minimax Bayesian predictive densities for multinomial models
- A frequentist framework of inductive reasoning
- A remark on the maximum entropy principle in uncertainty theory
- A further note on Bayesian information topologies
- Information-theoretic asymptotics of Bayes methods
- Bayesian or Laplacian inference, entropy and information theory and information geometry in data and signal processing
- Logical and geometric inquiry
- Statistical problem classes and their links to information theory
- Incorporating prior information when true priors are unknown: an information-theoretic approach for increasing efficiency in estimation
- Eliciting vague but proper maximal entropy priors in Bayesian experiments
- Generalized information criteria for Bayes decisions
- Kullback-Leibler information approach to the optimum measurement point for Bayesian estimation
This page was built for publication: Information optimality and Bayesian modelling
MaRDI item Q280206