Minimum \(\phi\)-divergence estimation in misspecified multinomial models
From MaRDI portal
DOI: 10.1016/j.csda.2011.06.035 · zbMath: 1464.62099 · OpenAlex: W1999796427 · MaRDI QID: Q1942912
V. Alba-Fernández, R. Pino-Mejías, M. Dolores Jiménez-Gamero, Juan Luis Moreno-Rebollo
Publication date: 14 March 2013
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2011.06.035
Keywords: consistency; asymptotic normality; model selection; goodness-of-fit; bootstrap distribution estimator; minimum phi-divergence estimator
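The record's title and keywords concern minimum \(\phi\)-divergence estimation in multinomial models: the parameter is estimated by minimizing \(D_\phi(\hat p, p(\theta)) = \sum_j p_j(\theta)\,\phi(\hat p_j / p_j(\theta))\) between the observed cell proportions and the model's cell probabilities. A minimal stdlib-only sketch under assumed choices (a toy Binomial(2, θ) cell-probability family, the Kullback-Leibler member \(\phi(t) = t\log t - t + 1\) of the divergence family, and a crude grid search in place of a proper optimizer) — not the paper's own implementation:

```python
import math

def phi_divergence(p_hat, p_model, phi):
    """D_phi(p_hat, p_model) = sum_j p_model_j * phi(p_hat_j / p_model_j)."""
    return sum(q * phi(ph / q) for ph, q in zip(p_hat, p_model) if q > 0)

# Kullback-Leibler member of the phi family; the minimum phi-divergence
# estimator under this choice coincides with the maximum likelihood estimator.
def kl(t):
    return t * math.log(t) - t + 1 if t > 0 else 1.0

def binomial_cells(theta, m=2):
    # Assumed toy model: cell probabilities of a Binomial(m, theta) with m+1 cells.
    return [math.comb(m, k) * theta**k * (1 - theta)**(m - k) for k in range(m + 1)]

def min_phi_estimate(p_hat, phi):
    # Grid search over theta in (0, 1); a real implementation would use a
    # proper scalar optimizer instead.
    grid = [i / 1000 for i in range(1, 1000)]
    return min(grid, key=lambda th: phi_divergence(p_hat, binomial_cells(th), phi))

# Observed multinomial counts (50, 30, 20) over the three cells.
counts = [50, 30, 20]
n = sum(counts)
p_hat = [c / n for c in counts]
theta_hat = min_phi_estimate(p_hat, kl)  # MLE here: 70 successes / 200 trials = 0.35
```

Other members of the family (e.g. the power-divergence of Cressie and Read) trade efficiency for robustness, which is the trade-off several of the cited works below discuss.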
Related Items
- Statistical inference for multinomial populations based on a double index family of test statistics
- Choice between and within the classes of Poisson-Tweedie and Poisson-exponential-Tweedie count models
- Burbea-Rao divergence based statistics for testing uniform association
- Two classes of divergence statistics for testing uniform association
- Minimum \(K_\phi\)-divergence estimators for multinomial models and applications
- Fourier methods for model selection
- A criterion for local model selection
- Model selection based on penalized \(\phi \)-divergences for multinomial data
- Equivalence tests for multinomial data based on \(\phi\)-divergences
Cites Work
- Divergence statistics for testing uniform association in cross-classifications
- Minimum chi-square estimation and tests for model selection
- Some variants of minimum disparity estimation
- Parametric estimation and tests through divergences and the duality technique
- Estimate-based goodness-of-fit test for large sparse multinomial distributions
- Bounds for the bias of estimators under contamination
- Minimum disparity computation via the iteratively reweighted least integrated squares algorithms
- An extension of likelihood-ratio-test for testing linear hypotheses in the baseline-category logit model
- On disparity based goodness-of-fit tests for multinomial models
- Efficiency versus robustness: The case for minimum Hellinger distance and related methods
- An application of multiple comparison techniques to model selection
- A maximum entropy type test of fit
- Pathologies of some minimum distance estimators
- The ``automatic'' robustness of minimum distance functionals
- On robustness and efficiency of minimum divergence estimators
- Comments on testing economic theories and the use of model selection criteria
- Asymptotic divergence of estimates of discrete distributions
- Simulation study of the tests of uniform association based on the power-divergence
- Approximation Theorems of Mathematical Statistics
- Minimization of φ-divergences on sets of signed measures
- Bayesian Model Selection: Measuring the χ² Discrepancy with the Uniform Distribution
- Minimum Hellinger Distance Estimation for the Analysis of Count Data
- A new infinitesimal approach to robust estimation
- The generalized Kullback-Leibler divergence and robust inference
- Model selection tests for nonlinear dynamic models
- Testing Statistical Hypotheses
- Using the Penalized Likelihood Method for Model Selection with Nuisance Parameters Present only under the Alternative: An Application to Switching Regression Models
- Robust Estimation of a Location Parameter
- Linear Statistical Inference and its Applications
- Maximum Likelihood Estimation of Misspecified Models