Machine learning from a continuous viewpoint. I

From MaRDI portal

DOI: 10.1007/S11425-020-1773-8
zbMATH Open: 1472.68136
arXiv: 1912.12777
OpenAlex: W3101985406
MaRDI QID: Q829085
FDO: Q829085


Authors: Weinan E, Chao Ma, Lei Wu


Publication date: 5 May 2021

Published in: Science China. Mathematics

Abstract: We present a continuous formulation of machine learning, as a problem in the calculus of variations and differential-integral equations, in the spirit of classical numerical analysis. We demonstrate that conventional machine learning models and algorithms, such as the random feature model, the two-layer neural network model and the residual neural network model, can all be recovered (in a scaled form) as particular discretizations of different continuous formulations. We also present examples of new models, such as the flow-based random feature model, and new algorithms, such as the smoothed particle method and spectral method, that arise naturally from this continuous formulation. We discuss how the issues of generalization error and implicit regularization can be studied under this framework.
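The abstract's central claim — that conventional models such as the random feature model arise as particular discretizations of a continuous formulation — can be illustrated with a minimal sketch. Assume (as an illustration not taken from the paper) a continuous model of the form f(x) = E_{w~π}[ a(w) σ(w·x) ] with ReLU features and a standard Gaussian distribution π; sampling m features turns the expectation into a Monte Carlo average, which is exactly a (scaled) random feature model:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_feature_model(x, weights, coeffs):
    """Monte Carlo discretization of f(x) = E_w[a(w) * relu(w . x)]:
    the average (1/m) * sum_j a_j * relu(w_j . x) over m sampled features."""
    activations = np.maximum(weights @ x, 0.0)  # ReLU features, shape (m,)
    return activations @ coeffs / len(coeffs)   # scaled sum = Monte Carlo mean

m, d = 2000, 3                          # number of features, input dimension
weights = rng.standard_normal((m, d))   # w_j ~ N(0, I), fixed at random
coeffs = rng.standard_normal(m)         # a_j: the only trainable parameters

x = rng.standard_normal(d)
print(random_feature_model(x, weights, coeffs))
```

Only the outer coefficients a_j are trainable here, which is what distinguishes the random feature model from the two-layer neural network (where the inner weights w_j are trained as well); in the continuous viewpoint both discretize the same integral representation.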


Full work available at URL: https://arxiv.org/abs/1912.12777




Cited In (29)

