A note on \(L_1\)-consistent estimation
From MaRDI portal
Publication: 3817451
DOI: 10.2307/3314734
zbMath: 0666.62040
OpenAlex: W2112122700
MaRDI QID: Q3817451
Publication date: 1988
Published in: Canadian Journal of Statistics
Full work available at URL: https://doi.org/10.2307/3314734
Keywords: entropy; rates of convergence; density estimation; \(L_1\)-distance; Kolmogorov's chain argument; minimum-distance estimators; regular classes of measures; Vapnik-Červonenkis exponents
Nonparametric estimation (62G05) Point estimation (62F10) Order statistics; empirical distribution functions (62G30)
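As an aside on the \(L_1\)-distance named in the keywords: for densities \(f\) and \(g\), it is \(\int |f - g|\), and \(L_1\)-consistency of an estimate means this distance tends to zero as the sample size grows. A minimal numerical sketch (not taken from the paper; the grid approximation and the uniform example are illustrative assumptions):

```python
import numpy as np

def l1_distance(f_vals, g_vals, dx):
    """Riemann-sum approximation of the L1 distance, the integral of |f - g|,
    for two densities evaluated on a common grid with bin width dx."""
    return float(np.sum(np.abs(f_vals - g_vals)) * dx)

# Illustration: a histogram density estimate of the uniform density on [0, 1].
rng = np.random.default_rng(0)
sample = rng.uniform(0.0, 1.0, size=10_000)
counts, edges = np.histogram(sample, bins=20, range=(0.0, 1.0), density=True)
dx = edges[1] - edges[0]
true_density = np.ones_like(counts)  # the true uniform density equals 1 on [0, 1]

err = l1_distance(counts, true_density, dx)
# L1-consistency means err -> 0 as the sample size grows;
# note the L1 distance between any two densities is at most 2.
```

The histogram here stands in for any density estimate; the paper's setting concerns minimum-distance estimators over regular classes of measures rather than this specific estimator.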
Related Items (4)
Almost sure classification of densities ⋮ The minimax learning rates of normal and Ising undirected graphical models ⋮ Plug-in \(L_2\)-upper error bounds in deconvolution, for a mixing density estimate in \(\mathbb{R}^d\) and for its derivatives, via the \(L_1\)-error for the mixture ⋮ Minimum distance histograms with universal performance guarantees
Cites Work
- Probability inequalities for empirical processes and a law of the iterated logarithm
- Rates of convergence of minimum distance estimators and Kolmogorov's entropy
- Asymptotic methods in statistical decision theory
- Bounds for the uniform deviation of empirical measures
- Central limit theorems for empirical measures
- Asymptotic Minimax Character of the Sample Distribution Function and of the Classical Multinomial Estimator
- The Minimum Distance Method
- Probability Inequalities for Sums of Bounded Random Variables
- Distinguishability of Sets of Distributions
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
This page was built for publication: A note on \(L_1\)-consistent estimation