Homotopy relaxation training algorithms for infinite-width two-layer ReLU neural networks
MaRDI QID: Q6665313
DOI: 10.1007/s10915-024-02761-5
Authors: Yahong Yang, Qipin Chen, Wenrui Hao
Publication date: 17 January 2025
Published in: Journal of Scientific Computing
Mathematics Subject Classification:
- Artificial neural networks and deep learning (68T07)
- Parallel algorithms in computer science (68W10)
- Numerical methods for mathematical programming, optimization and variational techniques (65K99)
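For context on the title's topic, here is a minimal illustrative sketch of the general homotopy-relaxation idea for a two-layer ReLU network: the ReLU activation is relaxed toward a linear map via a homotopy parameter `t` that is driven from 0 to 1 over training. This is a toy sketch with an assumed linear continuation schedule and plain gradient descent, not the specific HRTA algorithm of the paper, whose details are not contained in this record.

```python
import numpy as np

def relaxed_relu(z, t):
    # Homotopy-relaxed activation: identity map at t=0, exact ReLU at t=1.
    return (1.0 - t) * z + t * np.maximum(z, 0.0)

def train_two_layer(X, y, width=64, steps=300, lr=0.05, seed=0):
    """Toy homotopy-continuation training of f(x) = relaxed_relu(x W) a."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(scale=1.0 / np.sqrt(d), size=(d, width))
    a = rng.normal(scale=1.0 / np.sqrt(width), size=width)
    for k in range(steps):
        # Assumed continuation schedule: t ramps linearly to 1 by mid-training.
        t = min(1.0, k / (steps // 2))
        Z = X @ W
        H = relaxed_relu(Z, t)
        err = H @ a - y                       # residual of the prediction
        grad_a = H.T @ err / n                # gradient of 0.5 * mean squared error
        dH = t * (Z > 0) + (1.0 - t)          # derivative of the relaxed activation
        grad_W = X.T @ ((err[:, None] * a[None, :]) * dH) / n
        a -= lr * grad_a
        W -= lr * grad_W
    return W, a
```

At `t = 0` the model is linear and the loss landscape is benign; as `t` increases the network is continuously deformed into the target ReLU architecture, which is the continuation idea shared by the homotopy training references cited below.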
Cites Work
- High-dimensional probability. An introduction with applications in data science
- Why does unsupervised pre-training help deep learning?
- Computing all solutions to polynomial systems using homotopy continuation
- The Numerical Solution of Systems of Polynomials Arising in Engineering and Science
- The deep Ritz method: a deep learning-based numerical algorithm for solving variational problems
- Scaling Limit of the Stein Variational Gradient Descent: The Mean Field Regime
- Accelerated optimization with orthogonality constraints
- A homotopy method for parameter estimation of nonlinear differential equations with multiple optima
- Sobolev training of thermodynamic-informed neural networks for interpretable elasto-plasticity models with level set hardening
- A homotopy training algorithm for fully connected neural networks
- Adaptive activation functions accelerate convergence in deep and physics-informed neural networks
- An adaptive homotopy tracking algorithm for solving nonlinear parametric systems with applications in nonlinear ODEs
- Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks
- Side effects of learning from low-dimensional data embedded in a Euclidean space
- Greedy training algorithms for neural networks and applications to PDEs
- Title not available
- Neural tangent kernel: convergence and generalization in neural networks (invited paper)
Cited In (1)