Approximation of Information Divergences for Statistical Learning with Applications
Publication: 4558044
DOI: 10.1515/MS-2017-0177
zbMATH Open: 1505.62463
OpenAlex: W2899232023
Wikidata: Q129033991
Scholia: Q129033991
MaRDI QID: Q4558044
FDO: Q4558044
Authors: M. Stehlík, Ján Somorčík, Luboš Střelec, Jaromír Antoch
Publication date: 27 November 2018
Published in: Mathematica Slovaca
Full work available at URL: http://hdl.handle.net/10533/232510
MSC classifications:
- Parametric hypothesis testing (62F03)
- Approximations to statistical distributions (nonasymptotic) (62E17)
Cites Work
- Title not available
- Title not available
- On the Lambert \(W\) function
- Kullback-Leibler life time testing
- Title not available
- Title not available
- Title not available
- Distributions of exact tests in the exponential family
- Title not available
- Exact likelihood ratio testing for homogeneity of the exponential distribution
- Testing a homogeneity of stochastic processes
- Maximum likelihood estimation for the Erlang integer parameter
- Title not available
- The density of the parameter estimators when the observations are distributed exponentially