Deterministic neural networks optimization from a continuous and energy point of view
From MaRDI portal
Publication: 6111335
DOI: 10.1007/s10915-023-02215-4 · OpenAlex: W4377988319 · MaRDI QID: Q6111335
Rodolphe Turpault, Bilel Bensaid, Gaël Poëtte
Publication date: 6 July 2023
Published in: Journal of Scientific Computing
Full work available at URL: https://doi.org/10.1007/s10915-023-02215-4
Mathematics Subject Classification:
- Artificial neural networks and deep learning (68T07)
- Numerical optimization and variational techniques (65K10)
- Stability of solutions to ordinary differential equations (34D20)
- Numerical methods for initial value problems involving ordinary differential equations (65L05)
Cites Work
- Solving zero-dimensional algebraic systems
- Stability theory by Liapunov's direct method
- Sparse spatial autoregressions
- A priori neural networks versus a posteriori MOOD loop: a high accuracy 1D FV scheme testing bed
- Understanding the acceleration phenomenon via high-resolution differential equations
- Machine learning design of volume of fluid schemes for compressible flows
- The modern mathematics of deep learning
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- Stability theory for ordinary differential equations
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Fuzzy neural network structure identification based on soft competitive learning
- Numerical integrators based on modified differential equations
- A variational perspective on accelerated methods in optimization
- On the Scope of the Method of Modified Equations
- Convergence and Dynamical Behavior of the ADAM Algorithm for Nonconvex Stochastic Optimization
- Solving systems of algebraic equations by using Gröbner bases
- Some methods of speeding up the convergence of iteration methods
- Modern Error Analysis
- Goal-oriented sensitivity analysis of hyperparameters in deep learning