Regularization tools for training large feed-forward neural networks using automatic differentiation
DOI: 10.1080/10556789808805701
zbMath: 0913.68177
MaRDI QID: Q4227926
Per Lindström, Mårten Gulliksson, Jerry Eriksson, Per-Åke Wedin
Publication date: 25 May 1999
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556789808805701
68T05: Learning and adaptive systems in artificial intelligence
68W10: Parallel algorithms in computer science
Related Items
- Local results for the Gauss-Newton method on constrained rank-deficient nonlinear least squares
- Variable projections neural network training
- KKT conditions for rank-deficient nonlinear least-square problems with rank-deficient nonlinear constraints
Uses Software
Cites Work
- An implicit shift bidiagonalization algorithm for ill-posed systems
- A new linesearch algorithm for nonlinear least squares problems
- Inexact Newton Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Restart procedures for the conjugate gradient method