On stochastic roundoff errors in gradient descent with low-precision computation
DOI: 10.1007/s10957-023-02345-7 · arXiv: 2202.12276 · MaRDI QID: Q6150643
Authors: Lu Xia, Stefano Massei, M. E. Hochstenbach, B. Koren
Publication date: 9 February 2024
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/2202.12276
Keywords: logistic regression; convergence analysis; neural networks; gradient descent method; stochastic roundoff error analysis; low-precision computation
MSC classification: General nonlinear regression (62J02); Roundoff error (65G50); General topics in artificial intelligence (68T01)
Cites Work
- Multinomial logistic regression algorithm
- Gradient Convergence in Gradient Methods with Errors
- Accuracy and Stability of Numerical Algorithms
- An introduction to the theory of functional equations and inequalities. Cauchy's equation and Jensen's inequality. Edited by Attila Gilányi
- Applied logistic regression
- Stochastic Rounding and Its Probabilistic Backward Error Analysis
- Gradient descent optimizes over-parameterized deep ReLU networks
- Properties of the sign gradient descent algorithms
- Probability and conditional expectation. Fundamentals for the empirical sciences
- Simulating Low Precision Floating-Point Arithmetic
- Stochastic rounding and reduced-precision fixed-point arithmetic for solving neural ordinary differential equations
- Effects of round-to-nearest and stochastic rounding in the numerical solution of the heat equation in low precision