Robustness of dual divergence estimators for models satisfying linear constraints
From MaRDI portal
Publication:351393
DOI: 10.1016/J.CRMA.2013.02.005
zbMATH Open: None
OpenAlex: W2090458642
MaRDI QID: Q351393
Authors: Aida Toma
Publication date: 11 July 2013
Published in: Comptes Rendus. Mathématique. Académie des Sciences, Paris
Full work available at URL: https://doi.org/10.1016/j.crma.2013.02.005
Recommendations
- Dual divergence estimators and tests: robustness results
- Robust tests based on dual divergence estimators and saddlepoint approximations
- Robust estimation in generalized linear models: the density power divergence approach
- Optimal robust M-estimators using divergences
- Duality in robust linear regression using Huber's \(M\)-estimator
- Divergences and duality for estimation and test under moment condition models
- On robustness and efficiency of minimum divergence estimators
- Robust estimation in structural equation models using Bregman divergences
Cites Work
- Empirical likelihood and general estimating equations
- Empirical likelihood
- Title not available
- Title not available
- Decomposable pseudodistances and applications in statistical estimation
- Dual divergence estimators and tests: robustness results
- Robust tests based on dual divergence estimators and saddlepoint approximations
- Weighted empirical likelihood estimates and their robustness properties
- Dual representation of \(\phi\)-divergences and applications.
- An estimation method for the Neyman chi-square divergence with application to test of hypotheses
- Minimization of φ-divergences on sets of signed measures
- New estimates and tests of independence in some copula models
- Divergences and duality for estimation and test under moment condition models
- Parametric estimation and tests through divergences and the duality technique
Cited In (4)
- Robust generalized empirical likelihood for heavy tailed autoregressions with conditionally heteroscedastic errors
- New classes of Lorenz curves by maximizing Tsallis entropy under mean and Gini equality and inequality constraints
- Robust empirical likelihood
- Empirical phi-divergence test statistics for testing simple and composite null hypotheses
This page was built for publication: Robustness of dual divergence estimators for models satisfying linear constraints