Non-parametric estimation of Kullback-Leibler discrimination information based on censored data
DOI: 10.1016/j.spl.2019.06.007 · zbMath: 1428.62431 · OpenAlex: W2959696096 · Wikidata: Q127519601 · Scholia: Q127519601 · MaRDI QID: Q2273700
Abdul Sathar E. I., Viswakala K. V.
Publication date: 25 September 2019
Published in: Statistics & Probability Letters
Full work available at URL: https://doi.org/10.1016/j.spl.2019.06.007
Keywords: kernel density estimation; nonparametric estimation; mean squared error (MSE); Kullback-Leibler discrimination information; mean integrated squared error (MISE)
MSC classifications: Density estimation (62G07); Censored data models (62N01); Statistical aspects of information-theoretic topics (62B10)
Cites Work
- Kullback-Leibler divergence: a quantile approach
- Reliability estimation in Lindley distribution with progressively type II right censored sample
- On cumulative residual Kullback-Leibler information
- Lindley distribution and its application
- On Kullback-Leibler loss and density estimation
- On measures of information and their characterizations
- Kernel density and hazard rate estimation for censored dependent data
- Asymptotic properties of Kaplan-Meier estimator for censored dependent data
- A measure of discrimination between two residual life-time distributions and its applications
- Testing Goodness-of-Fit for Exponential Distribution Based on Cumulative Residual Entropy
- A central limit theorem and a strong mixing condition
- A characterisation of the proportional hazards model through a measure of discrimination between two residual life distributions
- Kernel estimation of residual entropy
- Estimation of inaccuracy measure for censored dependent data
- Some properties and applications of cumulative Kullback–Leibler information
- Quantile-based cumulative Kullback–Leibler divergence
- On Information and Sufficiency