Information Distinguishability with Application to Analysis of Failure Data
DOI: 10.2307/2291079 · zbMATH Open: 0828.62006 · OpenAlex: W4251504300 · MaRDI QID: Q4844201
Ehsan S. Soofi, Mohamed Habibullah, Nader Ebrahimi
Publication date: 2 January 1996
Full work available at URL: https://doi.org/10.2307/2291079
Recommendations
- Scientific article (zbMATH DE number 3923844)
- Scientific article (zbMATH DE number 6118037)
- Distributed majoration of information with detection and identification of fault manifestations
- Distinguishing distributions using Chernoff information
- Information theory and the failure time of a system
- Interpretable fault detection using projections of mutual information matrix
- Reliability modelling with information measures
- Scientific article (zbMATH DE number 7712511)
Keywords: cross-entropy; reliability; maximum entropy; relative entropy; lifetime distributions; diagnostics; entropy estimates; entropy difference; failure data; equivalence result; information discrepancy; discrimination information statistics; information discrimination distinguishability; Kullback-Leibler functions; robustness of parametric procedures; sensitivity of nonparametric statistics
MSC: Statistical aspects of information-theoretic topics (62B10); Nonparametric estimation (62G05); Reliability and life testing (62N05); Nonparametric inference (62G99)
Cited In (31)
- Similarity of interspike interval distributions and information gain in a stationary neuronal firing
- Statistical Problem Classes and Their Links to Information Theory
- An efficient correction to the density-based empirical likelihood ratio goodness-of-fit test for the inverse Gaussian distribution
- An entropic structure in capability indices
- A minimally informative likelihood for decision analysis: Illustration and robustness
- Information measures in perspective
- Extropy: Characterizations and dynamic versions
- Information optimality and Bayesian modelling
- Information measures for generalized gamma family
- Maximum entropy distributions with quantile information
- Joint monitoring using information theoretic control charts
- A test of goodness-of-fit based on Gini's index of spacings
- Information indices: Unification and applications.
- A maximum entropy characterization of symmetric Kotz type and Burr multivariate distributions
- Calculation of maximum entropy densities with application to income distribution
- Censored Kullback-Leibler Information and Goodness-of-Fit Test with Type II Censored Data
- Information theoretic framework for process control
- Computation of maximum entropy Dirichlet for modeling lifetime data
- Testing goodness-of-fit for Laplace distribution based on maximum entropy
- Cumulative ratio information based on general cumulative entropy
- Bayesian hypothesis testing in latent variable models
- Multivariate maximum entropy identification, transformation, and dependence
- Information measures of Dirichlet distribution with applications
- Blog Recommendation and Management Implications in an Emergency Context: An Information Entropy Perspective
- An Entropy Frailty Model for Dependent Variables
- Silver jubilee issue
- Capital mobility, quasi-rents, and the competitive self-organization of distributions of profitability
- A Convergent Iterative Procedure for Constructing Bivariate Distributions
- An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test
- Bayesian nonparametric model selection and model testing
- An Estimator of Shannon Entropy of Beta-Generated Distributions and a Goodness-of-Fit Test