Blessing of dimensionality: mathematical foundations of the statistical physics of data
DOI: 10.1098/RSTA.2017.0237
zbMATH Open: 1470.82004
arXiv: 1801.03421
OpenAlex: W3099661174
Wikidata: Q52647993 (Scholia: Q52647993)
MaRDI QID: Q5154201
FDO: Q5154201
Authors: Alexander N. Gorban, Ivan Y. Tyukin
Publication date: 4 October 2021
Published in: Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
Full work available at URL: https://arxiv.org/abs/1801.03421
Keywords: statistics; artificial intelligence; linear separability; extreme points; statistical physics; Fisher's discriminant; applied mathematics; ensemble equivalence; measure concentration
Cites Work
- Learning deep architectures for AI
- Pattern classification.
- Extensions of Lipschitz mappings into a Hilbert space
- An elementary proof of a theorem of Johnson and Lindenstrauss
- Title not available
- Probability Inequalities for Sums of Bounded Random Variables
- On the mathematical foundations of learning
- The concentration of measure phenomenon
- Title not available
- Title not available
- Title not available
- Concentration of measure and isoperimetric inequalities in product spaces
- Title not available
- Title not available
- Generic Hamiltonian dynamical systems are neither integrable nor ergodic
- A simplified neuron model as a principal component analyzer
- Title not available
- Clustering. A data recovery approach.
- Training a Support Vector Machine in the Primal
- Measure-preserving homeomorphisms and metrical transitivity
- Is the \(k\)-NN classifier in high dimensions affected by the curse of dimensionality?
- Data Mining
- Concentration property on probability spaces.
- Statistical mechanics of learning
- David Hilbert and the axiomatization of physics (1894--1905)
- Title not available
- Approximation with random bases: pro et contra
- Principal manifolds for data visualization and dimension reduction. Reviews and original papers presented partially at the workshop 'Principal manifolds for data cartography and dimension reduction', Leicester, UK, August 24--26, 2006.
- Oded Schramm 1961--2008
- Probabilistic lower bounds for approximation by shallow perceptron networks
- Title not available
- Quasiorthogonal dimension of Euclidean spaces
- Stochastic separation theorems
- Title not available
- One-trial correction of legacy AI systems and stochastic separation theorems
- Piece-wise quadratic approximations of arbitrary error functions for fast and robust machine learning
- High-dimensional brain: a tool for encoding and rapid learning of memories by single neurons
- Data complexity measured by principal graphs
Cited In (18)
- Fast construction of correcting ensembles for legacy artificial intelligence systems: algorithms and a case study
- Hilbert’s sixth problem: the endless road to rigour
- Revisiting the 'survival of the fittest' principle in global stochastic optimisation: incorporating anisotropic mutations
- Global optimisation in Hilbert spaces using the survival of the fittest algorithm
- General stochastic separation theorems with optimal bounds
- Coping with AI errors with provable guarantees
- Generalised Watson distribution on the hypersphere with applications to clustering
- The independent component analysis with the linear regression – predicting the energy costs of the public sector buildings in Croatia
- Ignorance is a bliss: Mathematical structure of many-box models
- On a posteriori error estimation using distances between numerical solutions and angles between truncation errors
- Blessing of dimensionality at the edge and geometry of few-shot learning
- Replica analysis of Bayesian data clustering
- Correction of AI systems by linear discriminants: probabilistic foundations
- Approximation of classifiers by deep perceptron networks
- High-dimensional brain: a tool for encoding and rapid learning of memories by single neurons
- Modelling biological evolution: developing novel approaches
- On a posteriori estimation of the approximation error norm for an ensemble of independent solutions
- Tensor train based isogeometric analysis for PDE approximation on parameter dependent geometries