On testing hypotheses with divergence statistics
Publication: 4337136
DOI: 10.1080/03610929608831710
zbMath: 0875.62031
MaRDI QID: Q4337136
Publication date: 19 May 1997
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610929608831710
Keywords: entropy; asymptotic distribution; maximum likelihood estimators; testing statistical hypotheses; divergence statistics
MSC classification:
62E20: Asymptotic distribution theory in statistics
62F03: Parametric hypothesis testing
62B10: Statistical aspects of information-theoretic topics
Cites Work
- A Mathematical Theory of Communication
- Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
- Trigonometric entropies, Jensen difference divergence measures, and error bounds
- Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory
- On the convexity of some divergence measures based on entropy functions
- Characterization of the entropies of positive order and of the Shannon entropy [Charakterisierung der Entropien positiver Ordnung und der shannonschen Entropie]
- Information radius
- Information-theoretical considerations on estimation problems