Empirically Estimable Classification Bounds Based on a Nonparametric Divergence Measure
Publication: 4618147
DOI: 10.1109/TSP.2015.2477805
zbMATH Open: 1412.94133
DBLP: journals/tsp/BerishaWHS16
OpenAlex: W2218423201
Wikidata: Q30364494
Scholia: Q30364494
MaRDI QID: Q4618147
FDO: Q4618147
Authors: Visar Berisha, Alan Wisler, Alfred O. Hero III, Andreas Spanias
Publication date: 7 February 2019
Published in: IEEE Transactions on Signal Processing
Full work available at URL: https://doi.org/10.1109/tsp.2015.2477805
Recommendations
- Divergence-based estimation and testing of statistical models of classification
- Upper-bound estimates for classifiers based on a dissimilarity function
- Divergence-based estimation and testing with misclassified data
- scientific article; zbMATH DE number 4203496
- On error-rate estimation in nonparametric classification
- On Some Nonparametric Density Estimators for a Statistical Classification Problem
- Nonparametric estimation of Kullback-Leibler divergence
- scientific article; zbMATH DE number 3917475
- Prediction Error Estimation Under Bregman Divergence for Non‐Parametric Regression and Classification