Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
Publication: 5157210
DOI: 10.1162/neco_a_01092
zbMath: 1471.62301
OpenAlex: W2808007932
Wikidata: Q89103007
Scholia: Q89103007
MaRDI QID: Q5157210
Authors: Yung-Kyun Noh, Frank Chongwoo Park, Masashi Sugiyama, Song Liu, Daniel D. Lee, Marthinus Christoffel du Plessis
Publication date: 12 October 2021
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_01092
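For context, a minimal sketch of the classical k-nearest-neighbor Kullback-Leibler divergence estimator of Wang, Kulkarni, and Verdú (cited under "Cites Work" below), which this paper's bias-reduction and metric-learning techniques refine. This is not the paper's own estimator; the function and variable names are illustrative, and SciPy's cKDTree is assumed for the neighbor searches.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Estimate D(p || q) from samples x ~ p with shape (n, d)
    and y ~ q with shape (m, d), using k-NN distances."""
    n, d = x.shape
    m, _ = y.shape
    # rho: distance from each x_i to its k-th nearest neighbor within x,
    # querying k+1 neighbors so the point itself is excluded.
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu: distance from each x_i to its k-th nearest neighbor within y.
    nu = cKDTree(y).query(x, k=k)[0]
    nu = nu[:, -1] if nu.ndim > 1 else nu
    # Wang-Kulkarni-Verdu estimator:
    #   (d/n) * sum_i log(nu_i / rho_i) + log(m / (n - 1)).
    # Consistent, but biased for finite samples; the paper above reduces
    # this bias, in part via a locally learned Mahalanobis metric.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

A quick sanity check under these assumptions: for two unit-variance Gaussians in one dimension with means 0 and 1, `knn_kl_divergence(np.random.randn(5000, 1), np.random.randn(5000, 1) + 1.0, k=5)` should land near the true value of 0.5.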
Related Items
- Some results about kernel estimators for function derivatives based on stationary and ergodic continuous time processes with applications
- Nonparametric recursive estimation for multivariate derivative functions by stochastic approximation method
- Direct Density Derivative Estimation
- Uniform convergence rate of the kernel regression estimator adaptive to intrinsic dimension in presence of censored data
- Asymptotics for function derivatives estimators based on stationary and ergodic discrete time processes
- Uniform almost sure convergence and asymptotic distribution of the wavelet-based estimators of partial derivatives of multivariate density function under weak dependence
Cites Work
- Detecting Novel Associations in Large Data Sets
- Direct importance estimation for covariate shift adaptation
- Correction: A class of Rényi information estimators for multidimensional densities
- A class of Rényi information estimators for multidimensional densities
- Approximate information discriminant analysis: A computationally simple heteroscedastic feature extraction technique
- Change-point detection in time-series data by relative density-ratio estimation
- Universal Estimation of Information Measures for Analog Sources
- A new class of random vector entropy estimators and its applications in testing statistical hypotheses
- Sequential change‐point detection based on direct density‐ratio estimation
- Divergence Estimation for Multidimensional Densities Via $k$-Nearest-Neighbor Distances
- $f$-Divergence Estimation and Two-Sample Homogeneity Test Under Semiparametric Density-Ratio Models
- Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
- A Nonparametric Estimate of a Multivariate Density Function