Discrete approximations to the Csiszár, Rényi, and Fisher measures of information
From MaRDI portal
Publication:3026034
DOI: 10.2307/3315194 · zbMath: 0624.62008 · OpenAlex: W2067976103 · MaRDI QID: Q3026034
No author found.
Publication date: 1986
Published in: Canadian Journal of Statistics
Full work available at URL: https://doi.org/10.2307/3315194
Keywords: discretization; Fisher information; Kullback-Leibler information; Rényi information; discrete approximations; phi-divergence; loss of information; continuous data; numerical results; Csiszár information; Fisher measures of information; grouping of the data
Measures of information, entropy (94A17) Statistical aspects of information-theoretic topics (62B10)
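The record concerns the loss of information incurred when continuous data are grouped into cells before a divergence is computed. A minimal sketch of this effect, under assumed example distributions not taken from the paper: the Kullback-Leibler divergence between N(0,1) and N(1,1) is exactly 0.5, and any discretization of the sample space can only lower it (data-processing inequality), with the loss shrinking as the partition is refined.

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    # Normal CDF via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def discretized_kl(mu1, mu2, cells, lo=-10.0, hi=10.0):
    # KL divergence between N(mu1,1) and N(mu2,1) after grouping the
    # real line into `cells` equal-width cells on [lo, hi]; the two
    # outermost cells absorb the tails.
    edges = [lo + (hi - lo) * i / cells for i in range(cells + 1)]
    edges[0], edges[-1] = -math.inf, math.inf
    kl = 0.0
    for a, b in zip(edges[:-1], edges[1:]):
        p = norm_cdf(b, mu1) - norm_cdf(a, mu1)  # cell mass under P
        q = norm_cdf(b, mu2) - norm_cdf(a, mu2)  # cell mass under Q
        if p > 0 and q > 0:
            kl += p * math.log(p / q)
    return kl

exact = 0.5  # KL(N(0,1) || N(1,1)) = (mu1 - mu2)^2 / 2
coarse = discretized_kl(0.0, 1.0, cells=4)    # heavy grouping, large loss
fine = discretized_kl(0.0, 1.0, cells=400)    # fine grouping, small loss
```

Refining a nested partition can only increase the discretized divergence toward its continuous value, which is the monotonicity that makes statements about asymptotic sufficiency of quantizations (cf. the related items below) possible.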
Related Items:
Discretization of \((h,\varphi)\)-divergences
Fisher's information matrix and φ-divergence for finite and optimal partitions of the sample space
A path integral approach to the Hodgkin-Huxley model
Causal information quantification of prominent dynamical features of biological neurons
On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures
Contrasting chaos with noise via local versus global information quantifiers
On properties of the \((\Phi , a)\)-power divergence family with applications in goodness of fit tests
On asymptotic sufficiency and optimality of quantizations
On efficiency of estimation and testing with data quantized to fixed number of cells
On Efficient Estimation in Continuous Models Based on Finitely Quantized Observations