Nonmonotone BFGS-trained recurrent neural networks for temporal sequence processing
DOI: 10.1016/j.amc.2010.12.012 · zbMATH Open: 1207.92004 · OpenAlex: W2009170651 · MaRDI QID: Q628894
George D. Magoulas, Chun-Cheng Peng
Publication date: 8 March 2011
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2010.12.012
Recommendations
- Nonmonotone learning of recurrent neural networks in symbolic sequence processing applications
- scientific article; zbMATH DE number 7255038
- Recurrent Neural Networks Training Using Derivative Free Nonlinear Bayesian Filters
- A conjugate gradient learning algorithm for recurrent neural networks
- On the improvement of the real time recurrent learning algorithm for recurrent neural networks
Keywords: quasi-Newton methods; recurrent neural networks; temporal sequence; BFGS updates; nonmonotone methods; second-order training algorithms
MSC: Learning and adaptive systems in artificial intelligence (68T05); Applications of mathematical programming (90C90); Neural networks for/in biological studies, artificial life and related topics (92B20)
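The keywords describe the paper's core technique: training recurrent networks with BFGS quasi-Newton updates combined with a nonmonotone line search, which accepts a step if it improves on the worst of the last few function values rather than only the current one. A minimal illustrative sketch of that combination (a Grippo–Lampariello–Lucidi-style nonmonotone Armijo test with standard BFGS inverse-Hessian updates; this is an assumption-based sketch, not the authors' implementation, and all function and parameter names are hypothetical):

```python
import numpy as np

def nonmonotone_bfgs(f, grad, x0, M=10, c1=1e-4, tol=1e-6, max_iter=200):
    """BFGS where the Armijo sufficient-decrease test compares against the
    maximum of the last M function values (nonmonotone acceptance)."""
    x = x0.astype(float)
    n = x.size
    H = np.eye(n)                  # inverse-Hessian approximation
    g = grad(x)
    history = [f(x)]               # recent function values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                 # quasi-Newton search direction
        f_ref = max(history[-M:])  # nonmonotone reference value
        alpha = 1.0
        while f(x + alpha * p) > f_ref + c1 * alpha * (g @ p):
            alpha *= 0.5           # backtracking
            if alpha < 1e-12:
                break
        s = alpha * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:             # curvature condition: apply BFGS inverse update
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
        history.append(f(x))
    return x

# Example on the Rosenbrock function (stand-in for a network loss surface)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_star = nonmonotone_bfgs(f, grad, np.array([-1.2, 1.0]))
```

The nonmonotone reference `max(history[-M:])` lets the iteration accept occasional increases in the objective, which is the mechanism the paper's keywords associate with escaping the narrow valleys typical of recurrent-network error surfaces.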
Cites Work
- Variable Metric Method for Minimization
- A Rapidly Convergent Descent Method for Minimization
- Title not available
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- Title not available
- A Nonmonotone Line Search Technique for Newton’s Method
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Title not available
- Self-Scaling Variable Metric (SSVM) Algorithms
- Global convergence of nonmonotone descent methods for unconstrained optimization problems
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- How bad are the BFGS and DFP methods when the objective function is quadratic?
- Title not available
- On the alleviation of the problem of local minima in back-propagation
- Analysis of a self-scaling quasi-Newton method
- Training the random neural network using quasi-Newton methods
- A quasi-discrete Newton algorithm with a nonmonotone stabilization technique
- Spatiotemporal connectionist networks: A taxonomy and review
- Adaptive algorithms for neural network supervised learning: a deterministic optimization approach
- Title not available
- Title not available
- Title not available
Cited In (2)