Any Discrimination Rule Can Have an Arbitrarily Bad Probability of Error for Finite Sample Size
From MaRDI portal
Publication:3943843
DOI: 10.1109/TPAMI.1982.4767222 · zbMath: 0484.62072 · OpenAlex: W2074619954 · MaRDI QID: Q3943843
Publication date: 1982
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence
Full work available at URL: https://doi.org/10.1109/tpami.1982.4767222
Mathematics Subject Classification:
Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Bayesian problems; characterization of Bayes procedures (62C10)
Related Items (16)
Learning and Convergence of the Normalized Radial Basis Functions Networks
Online gradient descent algorithms for functional data learning
Fast learning rates in statistical inference through aggregation
Nonlinear black-box models in system identification: Mathematical foundations
Consistency of support vector machines using additive kernels for additive models
Lower bounds on the rate of convergence of nonparametric regression estimates
Rates of convergence for partitioning and nearest neighbor regression estimates with unbounded data
Convergence properties of functional estimates for discrete distributions
Bandwidth choice for nonparametric classification
Supervised Learning by Support Vector Machines
Optimal global rates of convergence for nonparametric regression with unbounded data
Measuring the Capacity of Sets of Functions in the Analysis of ERM
Asymptotic expansions of the \(k\) nearest neighbor risk
Asymptotic normality of support vector machine variants and other regularized kernel methods
Lower bounds for the rate of convergence in nonparametric pattern recognition
Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory