Divergence Estimation of Continuous Distributions Based on Data-Dependent Partitions
From MaRDI portal
Publication:3547769
Recommendations
- Information divergence estimation based on data-dependent partitions
- Universal Divergence Estimation for Finite-Alphabet Sources
- Universal estimation of information measures for analog sources
- Nonparametric estimation of Kullback-Leibler divergence
- Direct Approximation of Divergences Between Probability Distributions
Cited in (18 documents)
- The fractional Kullback–Leibler divergence
- On the empirical estimation of integral probability metrics
- Nonparametric \(e\)-mixture estimation
- The detection of local irreversibility in time series based on segmentation
- Universal Divergence Estimation for Finite-Alphabet Sources
- Universal estimation of information measures for analog sources
- Approximate Bayesian computation via classification
- On the convergence of Shannon differential entropy, and its connections with density and entropy estimation
- Information divergence estimation based on data-dependent partitions
- Non-parametric two-sample tests: recent developments and prospects
- Least-squares two-sample test
- Machine learning with squared-loss mutual information
- Parametric Bayesian estimation of differential entropy and relative entropy
- Robust measurement of (heavy-tailed) risks: theory and implementation
- scientific article (zbMATH DE number 7625192; no title available)
- A Practical Approach to Proper Inference with Linked Data
- Density-difference estimation
- Probabilistic model validation for uncertain nonlinear systems