Rates of convergence of minimum distance estimators and Kolmogorov's entropy
From MaRDI portal
Publication: 1064711
DOI: 10.1214/aos/1176349553
zbMath: 0576.62057
OpenAlex: W2063178190
MaRDI QID: Q1064711
Publication date: 1985
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1176349553
Keywords: rate of convergence; density estimation; Kolmogorov entropy; empirical measure; robust minimum distance estimators
Related Items
Performance of discrete associated kernel estimators through the total variation distance
Almost sure classification of densities
Estimation and selection procedures in regression: an L1 approach
A universally acceptable smoothing factor for kernel density estimates
Parameter selection in modified histogram estimates
Minimax optimal conditional density estimation under total variation smoothness
Universal smoothing factor selection in density estimation: theory and practice. (With discussion)
Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression
Nonasymptotic universal smoothing factors, kernel complexity and Yatracos classes
On a dense minimizer of empirical risk in inverse problems
On robustness and local differential privacy
A theory of transfer learning with applications to active learning
Bounds on the minimax rate for estimating a prior over a VC class from independent learning tasks
Estimation of Copulas via Maximum Mean Discrepancy
Nonparametric estimation of infinite order regression and its application to the risk-return tradeoff
Selecting iterated modified histograms
Robust Estimators in High-Dimensions Without the Computational Intractability
\(L_1\)-optimal estimates for a regression type function in \(R^d\)
A note on L1 consistent estimation
A general lower bound of minimax risk for absolute-error loss
A note on minimum distance estimation of copula densities
Dependence and the dimensionality reduction principle
Density estimation by the penalized combinatorial method
Plug-in L2-upper error bounds in deconvolution, for a mixing density estimate in Rd and for its derivatives, via the L1-error for the mixture
Tests and estimation strategies associated to some loss functions
LOCALIZED MODEL SELECTION FOR REGRESSION
Minimum distance histograms with universal performance guarantees
Strongly consistent model selection for densities
Minimum distance regression-type estimates with rates under weak dependence
A note on penalized minimum distance estimation in nonparametric regression
Finite sample properties of parametric MMD estimation: robustness to misspecification and dependence
Nonparametric density estimates with improved performance on given sets of densities
Information-theoretic determination of minimax rates of convergence
Optimal \(L_1\) bandwidth selection for variable kernel density estimates
The Consistency and Robustness of Modified Cramér–Von Mises and Kolmogorov–Cramér Estimators
Aggregating estimates by convex optimization
Limitations of the Wasserstein MDE for univariate data
Learning Poisson binomial distributions