Bayesian testing of a point null hypothesis based on the latent information prior
From MaRDI portal
Publication:280652
DOI: 10.3390/e15104416 · zbMath: 1403.62036 · OpenAlex: W2092888431 · MaRDI QID: Q280652
Publication date: 10 May 2016
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e15104416
Keywords: prediction · Kullback-Leibler divergence · reference prior · conditional mutual information · discrete prior · Jeffreys-Lindley paradox
Cites Work
- Bayesian predictive densities based on latent information priors
- On the sample information about parameter and prediction
- I-divergence geometry of probability distributions and minimization problems
- Partial information reference priors: Derivation and interpretations
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- A statistical paradox
- Testing a Point Null Hypothesis: The Irreconcilability of P Values and Evidence
- Goodness of prediction fit
- A general minimax result for relative entropy
- Algorithm 680: evaluation of the complex error function
- An algorithm for computing the capacity of arbitrary discrete memoryless channels
- Computation of channel capacity and rate-distortion functions
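The latent information prior in this line of work is characterized as a prior maximizing a conditional mutual information, and the cited works of Arimoto ("An algorithm for computing the capacity of arbitrary discrete memoryless channels") and Blahut ("Computation of channel capacity and rate-distortion functions") give the standard alternating-maximization iteration for such capacity problems. The following is a minimal sketch of that Blahut-Arimoto capacity iteration for a finite channel matrix, not the paper's own implementation; the function name and the binary-symmetric-channel test case are illustrative assumptions.

```python
import numpy as np

def blahut_arimoto(W, tol=1e-10, max_iter=10_000):
    """Blahut-Arimoto sketch: capacity of a discrete memoryless channel.

    W[x, y] = probability of output y given input x (rows sum to 1).
    Returns (capacity in nats, capacity-achieving input distribution).
    """
    nx, _ = W.shape
    p = np.full(nx, 1.0 / nx)                    # start from the uniform input distribution
    for _ in range(max_iter):
        q = p[:, None] * W                       # joint p(x) W(y|x)
        q /= q.sum(axis=0, keepdims=True)        # backward channel q(x|y)
        # update step: p(x) proportional to exp( sum_y W(y|x) log q(x|y) )
        with np.errstate(divide="ignore", invalid="ignore"):
            log_q = np.where(W > 0, np.log(q), 0.0)
        r = np.exp((W * log_q).sum(axis=1))
        p_new = r / r.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # mutual information I(X; Y) at the returned input distribution
    q = p[:, None] * W
    py = q.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(q > 0, q / (p[:, None] * py), 1.0)
    capacity = float((q * np.log(ratio)).sum())
    return capacity, p

if __name__ == "__main__":
    # Binary symmetric channel with crossover 0.1: capacity = log 2 - H(0.1) ~ 0.368 nats
    eps = 0.1
    W = np.array([[1 - eps, eps], [eps, 1 - eps]])
    C, p_star = blahut_arimoto(W)
    print(C, p_star)                             # ~0.368 nats, input distribution ~[0.5, 0.5]
```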
This page was built for publication: Bayesian testing of a point null hypothesis based on the latent information prior