Pages that link to "Item:Q3036451"
From MaRDI portal
The following pages link to New parametric measures of information (Q3036451):
Displaying 38 items.
- On the loss of information due to fuzziness in experimental observations (Q583751) (← links)
- Studies of information quantities and information geometry of higher order cumulant spaces (Q716255) (← links)
- Connections between some criteria to compare fuzzy information systems (Q918519) (← links)
- Statistical aspects of divergence measures (Q1096987) (← links)
- Information in experiments and sufficiency (Q1168010) (← links)
- Fuzziness in the experimental outcomes: Comparing experiments and removing the loss of information (Q1193805) (← links)
- \((h,\Psi)\)-entropy differential metric (Q1265614) (← links)
- Connections of generalized divergence measures with Fisher information matrix (Q1310926) (← links)
- Discretization of \((h,\varphi)\)-divergences (Q1328507) (← links)
- An information theoretic argument for the validity of the exponential model (Q1337192) (← links)
- Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses (Q1358818) (← links)
- Information and random censoring (Q1373385) (← links)
- Statistical management of fuzzy elements in random experiments. II: The Fisher information associated with a fuzzy information system (Q1803198) (← links)
- Tukey's linear sensitivity and order statistics (Q1895428) (← links)
- New entropic bounds on time scales via Hermite interpolating polynomial (Q2073003) (← links)
- On properties of the \((\Phi , a)\)-power divergence family with applications in goodness of fit tests (Q2276428) (← links)
- Limiting properties of some measures of information (Q2277693) (← links)
- A symmetric information divergence measure of the Csiszár's \(f\)-divergence class and its bounds (Q2485436) (← links)
- Some information theoretic ideas useful in statistical inference (Q2644303) (← links)
- Generalizations of Entropy and Information Measures (Q2790447) (← links)
- A NEW FAMILY OF DIVERGENCE MEASURES FOR TESTS OF FIT (Q2810420) (← links)
- Discrete approximations to the Csiszár, Renyi, and Fisher measures of information (Q3026034) (← links)
- Divergence statistics: sampling properties and multinomial goodness of fit and divergence tests (Q3212105) (← links)
- New real parametric measures of information based on the Fisher matrix (Q3357276) (← links)
- Generalized Information Criteria for the Best Logit Model (Q3459680) (← links)
- Order preserving property of measures of information (Q3472964) (← links)
- ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH (Q3728800) (← links)
- Riemannian metrics associated with M-divergences (Q3988014) (← links)
- Generalized Jensen difference divergence measures and Fisher measure of information (Q4246460) (← links)
- On the Fisher Information (Q4272561) (← links)
- (Q4846590) (← links)
- Relative efficiency of a censored experiment in terms of Fisher information matrix (Q4883438) (← links)
- Lin–Wong divergence and relations on type I censored data (Q5076938) (← links)
- Some properties of Lin–Wong divergence on the past lifetime data (Q5160271) (← links)
- On Two Forms of Fisher's Measure of Information (Q5314578) (← links)
- Generalized arithmetic and geometric mean divergence measure and their statistical aspects (Q5441824) (← links)
- Generalized arithmetic and geometric mean divergence measure and their statistical aspects (Q5756400) (← links)
- On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures (Q6162800) (← links)