Blessing of dimensionality: mathematical foundations of the statistical physics of data
From MaRDI portal
Publication: 5154201
DOI: 10.1098/rsta.2017.0237
zbMath: 1470.82004
arXiv: 1801.03421
OpenAlex: W3099661174
Wikidata: Q52647993
Scholia: Q52647993
MaRDI QID: Q5154201
Alexander N. Gorban, I. Yu. Tyukin
Publication date: 4 October 2021
Published in: Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
Full work available at URL: https://arxiv.org/abs/1801.03421
Keywords: extreme points, artificial intelligence, statistical physics, statistics, linear separability, Fisher's discriminant, applied mathematics, measure concentration, ensemble equivalence
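The keywords above point at the paper's central theme: in high dimension, a random point is linearly separable from a large random sample by a simple Fisher-type discriminant with overwhelming probability. A minimal numerical sketch of this effect (assuming NumPy; the dimension `d`, sample size `n`, and threshold `alpha` are illustrative choices, not values taken from the paper):

```python
# Illustrative sketch of stochastic separation (not code from the paper):
# a random query point x is separated from almost all of a random sample X
# by the Fisher-type test  <x, y> <= alpha * <x, x>.
import numpy as np

rng = np.random.default_rng(0)
d, n = 200, 1000                      # dimension and sample size (illustrative)

X = rng.uniform(-1, 1, size=(n, d))   # i.i.d. points in the cube [-1, 1]^d
x = rng.uniform(-1, 1, size=d)        # one more random query point

alpha = 0.8                           # separation threshold in (0, 1)
separated = X @ x <= alpha * (x @ x)  # Fisher-type linear test against x
frac = separated.mean()               # fraction of the sample separated from x
print(f"{frac:.3f}")
```

For these parameters the printed fraction is essentially 1.0: the inner product `<x, y>` concentrates near 0 (scale about sqrt(d)), while `<x, x>` grows like d, so the threshold is many standard deviations away from the bulk of the sample.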
Related Items
- On a posteriori estimation of the approximation error norm for an ensemble of independent solutions
- On a posteriori error estimation using distances between numerical solutions and angles between truncation errors
- Blessing of dimensionality at the edge and geometry of few-shot learning
- General stochastic separation theorems with optimal bounds
- Revisiting 'survival of the fittest' principle in global stochastic optimisation: incorporating anisotropic mutations
- Generalised Watson distribution on the hypersphere with applications to clustering
- Correction of AI systems by linear discriminants: probabilistic foundations
- Fast construction of correcting ensembles for legacy artificial intelligence systems: algorithms and a case study
- Hilbert's sixth problem: the endless road to rigour
- Global optimisation in Hilbert spaces using the survival of the fittest algorithm
- Modelling biological evolution: developing novel approaches
- High-dimensional brain: a tool for encoding and rapid learning of memories by single neurons
- Tensor train based isogeometric analysis for PDE approximation on parameter dependent geometries
- Replica analysis of Bayesian data clustering
- The independent component analysis with the linear regression -- predicting the energy costs of the public sector buildings in Croatia
Cites Work
- Unnamed Item (12 uncatalogued entries)
- Oded Schramm 1961--2008
- A simplified neuron model as a principal component analyzer
- David Hilbert and the axiomatization of physics (1894--1905)
- Concentration of measure and isoperimetric inequalities in product spaces
- Probabilistic lower bounds for approximation by shallow perceptron networks
- One-trial correction of legacy AI systems and stochastic separation theorems
- Piece-wise quadratic approximations of arbitrary error functions for fast and robust machine learning
- Approximation with random bases: pro et contra
- High-dimensional brain: a tool for encoding and rapid learning of memories by single neurons
- Stochastic separation theorems
- Quasiorthogonal dimension of Euclidean spaces
- Principal manifolds for data visualization and dimension reduction. Reviews and original papers presented partially at the workshop `Principal manifolds for data cartography and dimension reduction', Leicester, UK, August 24--26, 2006.
- Is the \(k\)-NN classifier in high dimensions affected by the curse of dimensionality?
- Data complexity measured by principal graphs
- Measure-preserving homeomorphisms and metrical transitivity
- Statistical Mechanics of Learning
- On the mathematical foundations of learning
- Clustering
- Extensions of Lipschitz mappings into a Hilbert space
- Learning Deep Architectures for AI
- Generic Hamiltonian dynamical systems are neither integrable nor ergodic
- An elementary proof of a theorem of Johnson and Lindenstrauss
- Data Mining
- Training a Support Vector Machine in the Primal
- Probability Inequalities for Sums of Bounded Random Variables
- Concentration property on probability spaces.