Minimum \(\phi\)-divergence estimators with constraints in multinomial populations
Publication: 1600754
DOI: 10.1016/S0378-3758(01)00113-6
zbMath: 0988.62014
MaRDI QID: Q1600754
Leandro Pardo, Julio Angel Pardo, Konstantinos G. Zografos
Publication date: 16 June 2002
Published in: Journal of Statistical Planning and Inference
Keywords: multinomial distribution; power divergence; noncentrality parameters; minimum \(\phi\)-divergence estimators with constraints
MSC classification:
62E20: Asymptotic distribution theory in statistics
62F10: Point estimation
62F30: Parametric inference under constraints
62B10: Statistical aspects of information-theoretic topics
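The keyword "minimum \(\phi\)-divergence estimators" refers to estimating a multinomial model's parameters by minimizing a \(\phi\)-divergence (here, the Cressie-Read power-divergence family) between the observed cell proportions and the model's cell probabilities. The following is a minimal illustrative sketch, not code from the paper: the Hardy-Weinberg cell model and the grid-search minimizer are assumptions chosen for self-containment.

```python
# Illustrative sketch of minimum power-divergence estimation for a
# multinomial model. The Hardy-Weinberg model and grid search below
# are assumed examples, not taken from the paper.

def power_divergence(p_hat, p_model, lam=2/3):
    """Cressie-Read power divergence D_lam(p_hat || p_model).

    lam=1 gives (half) Pearson's chi-square; lam -> 0 recovers the
    likelihood divergence, whose minimizer is the MLE.
    """
    total = 0.0
    for o, e in zip(p_hat, p_model):
        if o > 0:
            total += o * ((o / e) ** lam - 1.0)
    return total / (lam * (lam + 1.0))

def hw_probs(theta):
    """Hardy-Weinberg cell probabilities for allele frequency theta."""
    return [(1 - theta) ** 2, 2 * theta * (1 - theta), theta ** 2]

def min_phi_estimate(counts, lam=2/3, grid=2000):
    """Grid-search minimum phi-divergence estimate of theta in (0, 1)."""
    n = sum(counts)
    p_hat = [c / n for c in counts]
    best_theta, best_val = None, float("inf")
    for k in range(1, grid):  # interior points only, so hw_probs > 0
        theta = k / grid
        val = power_divergence(p_hat, hw_probs(theta), lam)
        if val < best_val:
            best_theta, best_val = theta, val
    return best_theta
```

For observed counts (25, 50, 25) the empirical proportions coincide exactly with the Hardy-Weinberg cells at \(\theta = 0.5\), so the divergence vanishes there and every member of the power-divergence family returns the same estimate; for data not fitting the model exactly, different choices of `lam` generally yield different estimators.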
Related Items
- On tests of symmetry, marginal homogeneity and quasi-symmetry in two-way contingency tables based on minimum φ-divergence estimator with constraints
- Phi-Divergence Statistics for Testing Linear Hypotheses in Logistic Regression Models
- An approach to multiway contingency tables based on \(\phi \)-divergence test statistics
- Minimum phi-divergence estimators for loglinear models with linear constraints and multinomial sampling
- On tests of homogeneity based on minimum \(\varphi \)-divergence estimator with constraints
- On tests of independence based on minimum \(\varphi \)-divergence estimator with constraints: An application to modeling DNA
- An extension of likelihood-ratio-test for testing linear hypotheses in the baseline-category logit model
- Phi-divergences and polytomous logistic regression models: An overview
- Analysis of divergence in loglinear models when expected frequencies are subject to linear constraints
- Conditional tests of marginal homogeneity based on \(\phi\)-divergence test statistics
- Informative barycentres in statistics
Cites Work
- Maximum likelihood methods for linear and log-linear models in categorical data
- Goodness-of-fit statistics for discrete multivariate data
- Minimum Hellinger distance estimates for parametric models
- Efficiency versus robustness: The case for minimum Hellinger distance and related methods
- Asymptotic divergence of estimates of discrete distributions
- Divergence statistics: sampling properties and multinomial goodness of fit and divergence tests
- The Lagrangian Multiplier Test
- Maximum-Likelihood Estimation of Parameters Subject to Restraints
- Maximum Likelihood Methods for Log-Linear Models When Expected Frequencies are Subjected to Linear Constraints
- A New Proof of the Pearson-Fisher Theorem