Universal Bayes consistency in metric spaces
From MaRDI portal
Publication: 2054482
Abstract: We extend a recently proposed 1-nearest-neighbor based multiclass learning algorithm and prove that our modification is universally strongly Bayes-consistent in all metric spaces admitting any such learner, making it an "optimistically universal" Bayes-consistent learner. This is the first learning algorithm known to enjoy this property; by comparison, the \(k\)-NN classifier and its variants are not generally universally Bayes-consistent, except under additional structural assumptions, such as an inner product, a norm, finite dimension, or a Besicovitch-type property. The metric spaces in which universal Bayes consistency is possible are the "essentially separable" ones -- a notion that we define, which is more general than standard separability. The existence of metric spaces that are not essentially separable is widely believed to be independent of the ZFC axioms of set theory. We prove that essential separability exactly characterizes the existence of a universal Bayes-consistent learner for the given metric space. In particular, this yields the first impossibility result for universal Bayes consistency. Taken together, our results completely characterize strong and weak universal Bayes consistency in metric spaces.
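The 1-nearest-neighbor rule that the paper's algorithm builds on can be sketched as follows. This is a minimal illustration of plain 1-NN over an arbitrary metric, not the paper's modified learner (which adds further machinery to achieve universal Bayes consistency); all names here are illustrative:

```python
# Minimal 1-nearest-neighbor multiclass classifier over an arbitrary metric.
# Illustrative sketch only: the paper modifies this base rule; plain 1-NN is
# not universally Bayes-consistent without additional structural assumptions.

def predict_1nn(train, metric, x):
    """Return the label of the training point closest to x under `metric`.

    train  -- list of (point, label) pairs
    metric -- callable d(a, b): nonnegative, symmetric, triangle inequality
    x      -- query point
    """
    _, label = min(train, key=lambda pair: metric(pair[0], x))
    return label

# Example on the real line with the absolute-value metric.
train = [(0.0, "a"), (1.0, "a"), (5.0, "b")]
d = lambda a, b: abs(a - b)
print(predict_1nn(train, d, 4.0))  # nearest training point is 5.0 -> "b"
```

Because the rule only ever evaluates the metric, the same code runs unchanged over any metric space; the question the paper answers is for which such spaces (the "essentially separable" ones) a suitably modified learner of this kind can be universally Bayes-consistent.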
Recommendations
- A universal prior distribution for Bayesian consistency of nonparametric procedures
- Universality of Bayesian predictions
- Strong consistency of nonparametric Bayes density estimation on compact metric spaces with applications to specific manifolds
- \(L^\infty\) metric criteria for convergence in Bayesian recursive inference systems
- On the uniform consistency of Bayes estimates for multinomial probabilities
- On Consistency of Bayes Procedures
- Universality of Bayesian mixture predictors
- On the consistency of Bayes estimates
Cites work
- scientific article; zbMATH DE number 3719441
- scientific article; zbMATH DE number 1314876
- scientific article; zbMATH DE number 797412
- scientific article; zbMATH DE number 893887
- scientific article; zbMATH DE number 7415073
- scientific article; zbMATH DE number 6781365
- A distribution-free theory of nonparametric regression
- Active nearest-neighbor learning in metric spaces
- Consistent Nonparametric Regression for Functional Data Under the Stone–Besicovitch Conditions
- Consistent nonparametric regression. Discussion
- Discriminatory Analysis. Nonparametric Discrimination: Consistency Properties
- Efficient Classification for Metric Data
- Exponential bounds of mean error for the nearest neighbor estimates of regression functions
- Forcings with ideals and simple forcing notions
- Functional Classification in Hilbert Spaces
- Near-Optimal Sample Compression for Nearest Neighbors
- Nearest neighbor classification in infinite dimension
- Nearest neighbor pattern classification
- Nearly optimal classification for semimetrics
- On the kernel rule for function classification
- PAC-Bayesian compression bounds on the prediction error of learning algorithms for classification
- Rates of Convergence of the Functional $k$-Nearest Neighbor Estimate
- Rates of convergence of nearest neighbor estimation under arbitrary sampling
- Set Theory
- Understanding machine learning. From theory to algorithms
- Uniform Central Limit Theorems
- Vitali covering theorem in Hilbert space
Cited in (12)
- scientific article; zbMATH DE number 7415094
- Non-uniform packings
- Universal consistency of the \(k\)-NN rule in metric spaces and Nagata dimension
- A nearest neighbor characterization of Lebesgue points in metric measure spaces
- Nearest neighbor classification in infinite dimension
- Finsler currents
- From undecidability of non-triviality and finiteness to undecidability of learnability
- Fast convergence on perfect classification for functional data
- \(L^\infty\) metric criteria for convergence in Bayesian recursive inference systems
- Universal regression with adversarial responses
- Convergence of a stochastic collocation finite volume method for the compressible Navier-Stokes system
- Distributed adaptive nearest neighbor classifier: algorithm and theory
This page was built for publication: Universal Bayes consistency in metric spaces