eSPA+: Scalable Entropy-Optimal Machine Learning Classification for Small Data Problems
DOI: 10.1162/NECO_A_01490
zbMATH Open: 1492.68117
OpenAlex: W4220974576
Wikidata: Q114926517 (Scholia: Q114926517)
MaRDI QID: Q5083584 (FDO: Q5083584)
Authors: Edoardo Vecchi, Lukáš Pospíšil, Steffen Albrecht, Terence J. O'Kane, Illia Horenko
Publication date: 20 June 2022
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_01490
Recommendations
- On a scalable entropic breaching of the overfitting barrier for small data problems in machine learning
- Sparse classification: a scalable discrete optimization perspective
- Ensemble extreme learning machine and sparse representation classification
- Learning sparse classifiers: continuous and mixed integer optimization perspectives
- Efficient extreme learning machine via very sparse random projection
- Extreme entropy machines: robust information theoretic classification
- Prediction and estimation consistency of sparse multi-class penalized optimal scoring
- Classification of large datasets under small a priori information
- Scientific article (zbMATH DE number 5968873)
Classification (MSC)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Cited In (4)
- On a scalable entropic breaching of the overfitting barrier for small data problems in machine learning
- Linearly scalable learning of smooth low-dimensional patterns with permutation-aided entropic dimension reduction
- Mini-workshop: Mathematics of entropic AI in the natural sciences. Abstracts from the mini-workshop held April 7--12, 2024
- Transport and scale interactions in geophysical flows. Abstracts from the workshop held July 16--21, 2023