Cumulative and relative cumulative residual information generating measures and associated properties
DOI: 10.1080/03610926.2021.2005100 · OpenAlex: W3217807237 · MaRDI QID: Q6170101
Authors: Omid Kharazmi, Narayanaswamy Balakrishnan
Publication date: 12 July 2023
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2021.2005100
Keywords: Kullback-Leibler divergence; survival function; information generating function; Jensen-Shannon entropy; Jensen-Gini mean difference; Jensen-information generating function
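The keywords reference the cumulative residual information generating (CRIG) measure studied in the paper. A form commonly used in this literature takes \(IG_{\bar F}(s)=\int_0^\infty \bar F(x)^s\,dx\) for a survival function \(\bar F\), so that the negative derivative at \(s=1\) recovers the cumulative residual entropy; this exact definition is an assumption here, made for illustration only. A minimal numerical sketch under that assumption, using the exponential distribution (where \(IG_{\bar F}(s)=1/(\lambda s)\) in closed form):

```python
import math

# Sketch of a cumulative residual information generating (CRIG) measure,
# ASSUMED here to be IG(s) = integral_0^inf survival(x)**s dx.
# See the paper for the authors' exact formulation.

def crig(survival, s, upper=50.0, n=20000):
    """Composite trapezoidal approximation of the integral of
    survival(x)**s over [0, upper]."""
    h = upper / n
    total = 0.5 * (survival(0.0) ** s + survival(upper) ** s)
    for i in range(1, n):
        total += survival(i * h) ** s
    return total * h

lam = 2.0
sbar = lambda x: math.exp(-lam * x)  # survival function of Exp(lam)

# Closed form for Exp(lam): integral of e^{-lam*s*x} dx = 1/(lam*s).
val = crig(sbar, 1.5)  # should be close to 1/(2 * 1.5) = 1/3

# Under this definition, -d/ds IG(s) at s = 1 equals the cumulative
# residual entropy, which for Exp(lam) is 1/lam.
eps = 1e-4
cre_est = -(crig(sbar, 1.0 + eps) - crig(sbar, 1.0 - eps)) / (2 * eps)
```

The exponential case makes both checks easy: the integral has a closed form, and the derivative at `s = 1` can be compared against the known cumulative residual entropy `1/lam`.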
Cites Work
- On Information and Sufficiency
- A Mathematical Theory of Communication
- Stochastic orders
- Title not available
- On the dynamic cumulative residual entropy
- Cumulative Residual Entropy: A New Measure of Information
- Testing goodness-of-fit for exponential distribution based on cumulative residual entropy
- Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory
- A Jensen-Gini measure of divergence with application in parameter estimation
- The relative information generating function
- Non-parametric inference for Gini covariance and its variants
- Connections of Gini, Fisher, and Shannon by Bayes risk under proportional hazards
- Fractional cumulative residual entropy
- Cumulative Residual and Relative Cumulative Residual Fisher Information and Their Properties
- Jensen-information generating function and its connections to some well-known information measures
Cited In (2)