An efficient correction to the density-based empirical likelihood ratio goodness-of-fit test for the inverse Gaussian distribution
DOI: 10.1080/02664763.2016.1156657 · OpenAlex: W2313312265 · MaRDI QID: Q5138232
Authors: Hadi Alizadeh Noughabi, Albert Vexler
Publication date: 3 December 2020
Published in: Journal of Applied Statistics
Full work available at URL: https://doi.org/10.1080/02664763.2016.1156657
Recommendations
- An empirical likelihood ratio based goodness-of-fit test for inverse Gaussian distributions
- Goodness-of-fit tests for the inverse Gaussian distribution based on the empirical Laplace transform
- Goodness-of-fit tests for the inverse Gaussian and related distributions
- An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test
- Exact EDF Goodness-of-Fit Tests for Inverse Gaussian Distributions
Keywords: goodness-of-fit tests; Kullback-Leibler information; inverse Gaussian distribution; empirical likelihood ratio; minimum discrimination information loss estimator
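The keywords above point to the core idea behind tests of this family: estimate the sample entropy nonparametrically, compare it with the log-likelihood under a fitted inverse Gaussian model, and reject when the implied Kullback-Leibler discrepancy is large. The following sketch illustrates that general recipe only; it is not the authors' corrected density-based empirical likelihood ratio statistic. The sampler, the Vasicek entropy estimator, and the simple KL-type statistic are standard textbook ingredients, and all function names and the Monte Carlo setup are illustrative assumptions.

```python
import math
import random


def rinvgauss(mu, lam, n, rng):
    """Draw n samples from IG(mu, lam) via the Michael-Schucany-Haas method."""
    out = []
    for _ in range(n):
        v = rng.gauss(0.0, 1.0) ** 2
        x = (mu + mu * mu * v / (2 * lam)
             - (mu / (2 * lam)) * math.sqrt(4 * mu * lam * v + (mu * v) ** 2))
        # Accept x with probability mu / (mu + x), else take mu^2 / x.
        out.append(x if rng.random() <= mu / (mu + x) else mu * mu / x)
    return out


def ig_logpdf(x, mu, lam):
    """Log density of the inverse Gaussian IG(mu, lam) at x > 0."""
    return (0.5 * math.log(lam / (2 * math.pi * x ** 3))
            - lam * (x - mu) ** 2 / (2 * mu ** 2 * x))


def vasicek_entropy(sample, m):
    """Vasicek (spacings-based) entropy estimator with window m."""
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(n):
        hi = x[min(i + m, n - 1)]  # boundary indices are clamped
        lo = x[max(i - m, 0)]
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n


def kl_statistic(sample, m):
    """KL-type goodness-of-fit statistic: -H_mn minus mean IG log-likelihood
    at the maximum likelihood estimates; large values suggest poor fit."""
    n = len(sample)
    mu_hat = sum(sample) / n
    lam_hat = n / sum(1.0 / xi - 1.0 / mu_hat for xi in sample)
    mean_loglik = sum(ig_logpdf(xi, mu_hat, lam_hat) for xi in sample) / n
    return -vasicek_entropy(sample, m) - mean_loglik


rng = random.Random(42)
ig_data = rinvgauss(1.0, 2.0, 200, rng)            # data from the null model
exp_data = [rng.expovariate(1.0) for _ in range(200)]  # data from an alternative
t_ig = kl_statistic(ig_data, m=5)
t_exp = kl_statistic(exp_data, m=5)
```

In practice the critical value for such a statistic is obtained by Monte Carlo simulation under the fitted null, which is also how the papers listed on this page calibrate their tests; the window m and the bias of the Vasicek estimator are exactly the kind of details that the "efficient correction" in the title addresses.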
Cites Work
- Empirical likelihood
- An empirical likelihood ratio based goodness-of-fit test for inverse Gaussian distributions
- Empirical likelihood ratios applied to goodness-of-fit tests based on sample entropy
- The inverse Gaussian distribution. Statistical theory and applications
- Information indices: Unification and applications.
- Principal Information Theoretic Approaches
- A goodness-of-fit test for the inverse Gaussian distribution using its independence characterization
- Information Distinguishability with Application to Analysis of Failure Data
- A Simple Density-Based Empirical Likelihood Ratio Test for Independence
- Two measures of sample entropy
- Correcting moments for goodness of fit tests based on two entropy estimates
- A note on optimality of hypothesis testing
- An entropy characterization of the inverse Gaussian distribution and related goodness-of-fit test
- A modified Kolmogorov-Smirnov test for the inverse Gaussian density with unknown parameters
- A Note on Electrical Networks and the Inverse Gaussian Distribution
- A two-sample empirical likelihood ratio test based on samples entropy
- Nonparametric-likelihood inference based on cost-effectively-sampled-data
- An extensive power evaluation of a novel two-sample density-based empirical likelihood ratio test for paired data with an application to a treatment study of attention-deficit/hyperactivity disorder and severe mood dysregulation
- Goodness-of-fit tests based on correcting moments of entropy estimators
- Computing critical values of exact tests by incorporating Monte Carlo simulations combined with statistical tables
- Two-sample density-based empirical likelihood tests for incomplete data in application to a pneumonia study
- Likelihood testing populations modeled by autoregressive process subject to the limit of detection in applications to longitudinal biomedical data
Cited In (5)
- Monte Carlo comparison of goodness-of-fit tests for the inverse Gaussian distribution based on empirical distribution function
- An empirical likelihood ratio based goodness-of-fit test for inverse Gaussian distributions
- Nonparametric probability density functions of entropy estimators applied to testing the Rayleigh distribution
- Moments of nonparametric probability density functions of entropy estimators applied to testing the inverse Gaussian distribution
- On the power of Gini index-based goodness-of-fit test for the inverse Gaussian distribution