Nonmonotone BFGS-trained recurrent neural networks for temporal sequence processing
Publication: 628894
DOI: 10.1016/j.amc.2010.12.012
zbMath: 1207.92004
OpenAlex: W2009170651
MaRDI QID: Q628894
George D. Magoulas, Chun-Cheng Peng
Publication date: 8 March 2011
Published in: Applied Mathematics and Computation
Full work available at URL: https://doi.org/10.1016/j.amc.2010.12.012
Keywords: quasi-Newton methods; recurrent neural networks; temporal sequence; BFGS updates; nonmonotone methods; second-order training algorithms
Related Items (1)
Cites Work
- Analysis of a self-scaling quasi-Newton method
- Global convergence of nonmonotone descent methods for unconstrained optimization problems
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- Training the random neural network using quasi-Newton methods
- A quasi-discrete Newton algorithm with a nonmonotone stabilization technique
- Spatiotemporal Connectionist Networks: A Taxonomy and Review
- ADAPTIVE ALGORITHMS FOR NEURAL NETWORK SUPERVISED LEARNING: A DETERMINISTIC OPTIMIZATION APPROACH
- How bad are the BFGS and DFP methods when the objective function is quadratic?
- Variable Metric Method for Minimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- On the alleviation of the problem of local minima in back-propagation
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- A Rapidly Convergent Descent Method for Minimization
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Conditioning of Quasi-Newton Methods for Function Minimization