Nonparametric \(\phi\)-divergence estimation and test for model selection
zbMATH Open: 1455.62078 · MaRDI QID: Q2217079
Authors: Freedath Djibril Moussa, Jean de Dieu Nkurunziza, Papa Ngom
Publication date: 18 December 2020
Published in: Afrika Statistika
Full work available at URL: https://projecteuclid.org/euclid.as/1605236416
Keywords: asymptotic normality; model selection; strong consistency; hypothesis testing; nonparametric estimation; \(\phi\)-divergence
MSC: Density estimation (62G07); Nonparametric hypothesis testing (62G10); Asymptotic properties of nonparametric inference (62G20)
Cites Work
- Weak convergence and empirical processes. With applications to statistics
- Title not available
- On Information and Sufficiency
- Uniform in bandwidth consistency of kernel-type function estimators
- U-processes: Rates of convergence
- Introduction to nonparametric estimation
- Minimum Hellinger distance estimates for parametric models
- Robust and efficient estimation by minimising a density power divergence
- Title not available
- Title not available
- Weighted uniform consistency of kernel density estimators with general bandwidth sequences
- On nonparametric kernel density estimates
- Cross-validation in nonparametric estimation of probabilities and probability densities
- Divergence measures estimation and its asymptotic normality theory using wavelets empirical processes. II
- Uniform-in-bandwidth consistency for kernel-type estimators of Shannon's entropy
- A new kernel density estimate
Cited In (4)