On Kullback-Leibler loss and density estimation
DOI: 10.1214/aos/1176350606 · zbMath: 0678.62045 · OpenAlex: W2081827563 · MaRDI QID: Q1124241
Publication date: 1987
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1176350606
Keywords: asymptotic properties; discrimination information; smoothing parameter; tail properties; likelihood cross-validation; window width; non-parametric kernel density estimate; optimal estimation of Kullback-Leibler loss
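The keywords point to likelihood cross-validation of the window width (bandwidth) for a non-parametric kernel density estimate, where the quality of the estimate is measured by Kullback-Leibler (discrimination information) loss. As a minimal illustrative sketch, not code from the paper: the leave-one-out log-likelihood below is maximized over the bandwidth, which estimates the expected Kullback-Leibler loss up to an additive constant not depending on the bandwidth. The Gaussian kernel, the simulated sample, the bandwidth grid, and the function names are all arbitrary choices for illustration.

```python
import numpy as np

def kde(x_eval, data, h):
    """Gaussian kernel density estimate at the points x_eval with bandwidth h."""
    u = (x_eval[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def loo_log_likelihood(data, h):
    """Leave-one-out log-likelihood sum_i log f_{-i}(X_i; h).

    Maximizing this over h is likelihood cross-validation; its negative
    tracks the Kullback-Leibler loss of the kernel estimate up to a
    constant that does not depend on h.
    """
    total = 0.0
    for i in range(len(data)):
        rest = np.delete(data, i)
        f_i = kde(np.array([data[i]]), rest, h)[0]
        total += np.log(max(f_i, 1e-300))  # guard against log(0) in the tails
    return total

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(200)          # toy sample from N(0, 1)
    grid = np.linspace(0.05, 1.0, 40)     # candidate bandwidths
    scores = [loo_log_likelihood(x, h) for h in grid]
    h_cv = grid[int(np.argmax(scores))]
    print("likelihood cross-validation bandwidth:", h_cv)
```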
Related Items
Bayesian bandwidth estimation for a nonparametric functional regression model with unknown error density
Semiparametric efficient estimation of dynamic panel data models
Growth and convergence: a profile of distribution dynamics and mobility
Detecting conditional independence for modeling non-Gaussian time series
Selection of importance weights for Monte Carlo estimation of normalizing constants
The Mean Relative Entropy: An Invariant Measure of Estimation Error
Divergence measures estimation and its asymptotic normality theory in the discrete case
On two recent papers of Y. Kanazawa
A comparative study of several smoothing methods in density estimation
Asymptotically efficient estimation of a survival function in the missing censoring indicator model
Frequentist nonparametric goodness-of-fit tests via marginal likelihood ratios
Parameter identifiability with Kullback-Leibler information divergence criterion
Improving cross-validated bandwidth selection using subsampling-extrapolation techniques
Cross-validated density estimates based on Kullback–Leibler information
Akaike's information criterion and Kullback-Leibler loss for histogram density estimation
Performance study of marginal posterior density estimation via Kullback-Leibler divergence
Information theoretic learning with adaptive kernels
Extrapolation-based Bandwidth Selectors: A Review and Comparative Study with Discussion on Bivariate Applications
Predictive Inference Based on Markov Chain Monte Carlo Output
Non-Gaussian Bayesian filtering by density parametrization using power moments
Nonparametric estimation of distributions with categorical and continuous data
Bayesian analysis of hypothesis testing problems for general population: A Kullback-Leibler alternative
Spline local basis methods for nonparametric density estimation
The generalized cross entropy method, with applications to probability density estimation
Copula density estimation by total variation penalized likelihood with linear equality constraints
Bayesian adaptive bandwidth kernel density estimation of irregular multivariate distributions
Conditional density estimation using the local Gaussian correlation
The locally Gaussian density estimator for multivariate data
Decomposition of Kullback-Leibler risk and unbiasedness for parameter-free estimators
On the choice of a discrepancy functional for model selection
A fast algorithm for computing least-squares cross-validations for nonparametric conditional kernel density functions
Sufficient dimension reduction with simultaneous estimation of effective dimensions for time-to-event data
On mutual information estimation for mixed-pair random variables
A nonparametric approach to k-sample inference based on entropy
Fast adaptive estimation of log-additive exponential models in Kullback-Leibler divergence
Bayesian nonparametrics for directional statistics
Smooth density estimation with moment constraints using mixture distributions
Semi-parametric dynamic time series modelling with applications to detecting neural dynamics
Transformation-based nonparametric estimation of multivariate densities
Bayesian bandwidth estimation for a semi-functional partial linear regression model with unknown error density
A survey of cross-validation procedures for model selection
Density estimation from aggregate data
Non-parametric estimation of Kullback-Leibler discrimination information based on censored data
Direct importance estimation for covariate shift adaptation
Cross-validation Revisited
The slashed-Rayleigh fading channel distribution
On the estimation of entropy
Kernel smoothed probability mass functions for ordered datatypes
Generalized SURE for optimal shrinkage of singular values in low-rank matrix denoising
A semi-Bayesian method for nonparametric density estimation.
Robust kernels for kernel density estimation
Rates of convergence for the Gaussian mixture sieve.
Likelihood cross-validation bandwidth selection for nonparametric kernel density estimators
Maximum smoothed likelihood density estimation
Information measures of kernel estimation
On the recovery of joint distributions from limited information
Comments on a data based bandwidth selector
Hellinger distance and Kullback-Leibler loss for the kernel density estimator
An assessment of finite sample performance of adaptive methods in density estimation