Correction of AI systems by linear discriminants: probabilistic foundations
Keywords: big data; error correction; measure concentration; blessing of dimensionality; linear discriminant; non-iterative learning
MSC classification:
- Probability distributions: general theory (60E05)
- Learning and adaptive systems in artificial intelligence (68T05)
- Characterization and structure theory for multivariate probability distributions; copulas (62H05)
- Computational aspects of data analysis and big data (68T09)
- Geometric probability and stochastic geometry (60D05)
- Inequalities; stochastic orderings (60E15)
- General topics in artificial intelligence (68T01)
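The keywords above ("linear discriminant", "blessing of dimensionality") and the listed stochastic separation theorems refer to a simple phenomenon: in high dimension, a single point drawn from a suitable distribution is, with probability close to one, separable from a large finite sample by a one-shot linear functional. The following sketch is not taken from the paper; it is a minimal illustration under one common setting (points uniform in the unit ball), with the dimension and sample size chosen arbitrarily for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 200, 1000  # illustrative dimension and sample size (assumptions)

# Sample n points uniformly from the unit ball in R^d:
# uniform directions on the sphere, radii distributed as U^(1/d).
g = rng.standard_normal((n, d))
directions = g / np.linalg.norm(g, axis=1, keepdims=True)
radii = rng.random(n) ** (1.0 / d)
points = directions * radii[:, None]

# Treat the first point as a single "AI error" to isolate.
x, rest = points[0], points[1:]

# One-shot linear discriminant: the functional z -> <x, z>,
# thresholded just below |x|^2, separates x from all remaining
# points with probability close to 1 when d is large.
threshold = 0.9 * np.dot(x, x)
separated = bool(np.all(rest @ x < threshold))
print(separated)  # prints True for this seed and these parameters
```

With these parameters the inner products `rest @ x` concentrate near zero (scale roughly 1/sqrt(d)), while the threshold stays near 0.9, so the single discriminant succeeds without any iterative training; this is the non-iterative, one-trial correction mechanism the keywords describe.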
- Fast construction of correcting ensembles for legacy artificial intelligence systems: algorithms and a case study
- One-trial correction of legacy AI systems and stochastic separation theorems
- Stochastic separation theorems
- General stochastic separation theorems with optimal bounds
- Blessing of dimensionality: mathematical foundations of the statistical physics of data
- scientific article (zbMATH DE number 3126094; title not available)
- scientific article (zbMATH DE number 1182755; title not available)
- scientific article (zbMATH DE number 1391397; title not available)
- Approximation of the Sphere by Polytopes having Few Vertices
- Approximation with random bases: pro et contra
- Blessing of dimensionality: mathematical foundations of the statistical physics of data
- Concentration of measure and isoperimetric inequalities in product spaces
- Concentration property on probability spaces.
- Convexity. An analytic viewpoint
- Deep learning
- Fast kernel classifiers with online and active learning
- From Brunn-Minkowski to Brascamp-Lieb and to logarithmic Sobolev inequalities
- Geometry of isotropic convex bodies
- Interpolating thin-shell and sharp large-deviation estimates for isotropic log-concave measures
- Is the \(k\)-NN classifier in high dimensions affected by the curse of dimensionality?
- Isoperimetric and analytic inequalities for log-concave probability measures
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- On the Geometry of Log-Concave Probability Measures with Bounded Log-Sobolev Constant
- On the mathematical foundations of learning
- On the shape of the convex hull of random points
- Probabilistic lower bounds for approximation by shallow perceptron networks
- Quasiorthogonal dimension of Euclidean spaces
- Small ball probability estimates for log-concave measures
- Stochastic separation theorems
- The geometry of logconcave functions and sampling algorithms
- Training a Support Vector Machine in the Primal
- Universal approximation bounds for superpositions of a sigmoidal function
- Scrutinizing XAI using linear ground-truth data with suppressor variables
- Fast construction of correcting ensembles for legacy artificial intelligence systems: algorithms and a case study
- One-trial correction of legacy AI systems and stochastic separation theorems
- Approximation of classifiers by deep perceptron networks
- Machine learning approach to the Floquet-Lindbladian problem
- General stochastic separation theorems with optimal bounds
- Coping with AI errors with provable guarantees
- Blessing of dimensionality at the edge and geometry of few-shot learning
- scientific article (zbMATH DE number 5367516; title not available)
- Stochastic separation theorems
MaRDI item: Q2200569