New parametric measures of information
From MaRDI portal
Publication: 3036451
DOI: 10.1016/S0019-9958(81)90263-1
zbMATH Open: 0524.62005
MaRDI QID: Q3036451
Author name not available
Publication date: 1981
Published in: Information and Control
Keywords: unbiased estimator; inequalities; determinant; Kullback-Leibler; trace; Fisher information; Renyi; multidimensional parameters; Csiszar measures; largest eigenvalue of covariance matrix; new parametric measures of information
MSC: Statistical aspects of information-theoretic topics (62B10); Measures of information, entropy (94A17)
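The keywords above name classical parametric information quantities such as the Kullback-Leibler divergence and Fisher information. As a purely illustrative sketch (not taken from the paper itself), both have closed forms for a univariate normal model: the KL divergence between two Gaussians, and the Fisher information of N(mu, sigma^2) with respect to the mean, which equals 1/sigma^2.

```python
import math

def kl_gaussian(mu0, s0, mu1, s1):
    """KL(N(mu0, s0^2) || N(mu1, s1^2)) in nats, via the standard closed form."""
    return math.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

def fisher_info_mean(sigma):
    """Fisher information of N(mu, sigma^2) with respect to mu: 1 / sigma^2."""
    return 1.0 / sigma**2

# The divergence of a distribution from itself is zero, and the Fisher
# information grows as the variance shrinks (sharper likelihood in mu).
print(kl_gaussian(0.0, 1.0, 0.0, 1.0))  # 0.0
print(fisher_info_mean(2.0))            # 0.25
```

These are textbook formulas shown only to ground the keyword list; the paper's own generalized measures are not reproduced here.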
Cited In (38)
- Statistical aspects of divergence measures
- ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH
- Generalizations of Entropy and Information Measures
- On Two Forms of Fisher's Measure of Information
- Riemannian metrics associated with M-divergences
- Title not available
- On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures
- Generalized Jensen difference divergence measures and Fisher measure of information
- Information and random censoring
- On properties of the \((\Phi , a)\)-power divergence family with applications in goodness of fit tests
- On the loss of information due to fuzziness in experimental observations
- A new family of divergence measures for tests of fit
- Generalized arithmetic and geometric mean divergence measure and their statistical aspects
- On the Fisher Information
- Some information theoretic ideas useful in statistical inference
- Tukey's linear sensitivity and order statistics
- Information in experiments and sufficiency
- Connections of generalized divergence measures with Fisher information matrix
- A symmetric information divergence measure of the Csiszár's \(f\)-divergence class and its bounds
- Limiting properties of some measures of information
- New real parametric measures of information based on the Fisher matrix
- Discretization of \((h,\varphi)\)-divergences
- Relative efficiency of a censored experiment in terms of Fisher information matrix
- \((h,\Psi)\)-entropy differential metric
- Divergence statistics: sampling properties and multinomial goodness of fit and divergence tests
- Discrete approximations to the Csiszár, Renyi, and Fisher measures of information
- An information theoretic argument for the validity of the exponential model
- Studies of information quantities and information geometry of higher order cumulant spaces
- Connections between some criteria to compare fuzzy information systems
- Fuzziness in the experimental outcomes: Comparing experiments and removing the loss of information
- Generalized Information Criteria for the Best Logit Model
- New entropic bounds on time scales via Hermite interpolating polynomial
- Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses
- Order preserving property of measures of information
- Lin–Wong divergence and relations on type I censored data
- Some properties of Lin–Wong divergence on the past lifetime data
- Statistical management of fuzzy elements in random experiments. II: The Fisher information associated with a fuzzy information system