Nonmonotone BFGS-trained recurrent neural networks for temporal sequence processing
From MaRDI portal
Recommendations
- Nonmonotone learning of recurrent neural networks in symbolic sequence processing applications
- scientific article; zbMATH DE number 7255038
- Recurrent Neural Networks Training Using Derivative Free Nonlinear Bayesian Filters
- A conjugate gradient learning algorithm for recurrent neural networks
- On the improvement of the real time recurrent learning algorithm for recurrent neural networks
Cites work
- scientific article; zbMATH DE number 51537
- scientific article; zbMATH DE number 1243473
- scientific article; zbMATH DE number 1179315
- scientific article; zbMATH DE number 1843095
- scientific article; zbMATH DE number 938958
- scientific article; zbMATH DE number 3294173
- scientific article; zbMATH DE number 3308852
- A Family of Variable-Metric Methods Derived by Variational Means
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- A Rapidly Convergent Descent Method for Minimization
- A new approach to variable metric algorithms
- A quasi-discrete Newton algorithm with a nonmonotone stabilization technique
- ADAPTIVE ALGORITHMS FOR NEURAL NETWORK SUPERVISED LEARNING: A DETERMINISTIC OPTIMIZATION APPROACH
- Analysis of a self-scaling quasi-Newton method
- Conditioning of Quasi-Newton Methods for Function Minimization
- Global convergence of nonmonotone descent methods for unconstrained optimization problems
- How bad are the BFGS and DFP methods when the objective function is quadratic?
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- On the alleviation of the problem of local minima in back-propagation
- Self-Scaling Variable Metric (SSVM) Algorithms
- Spatiotemporal connectionist networks: A taxonomy and review
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- Training the random neural network using quasi-Newton methods
- Variable Metric Method for Minimization
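Several of the works cited above (the Broyden/Fletcher/Goldfarb/Shanno variable-metric papers and the Grippo et al. nonmonotone line search articles) together describe the technique named in the publication's title. As a rough illustration only, here is a minimal sketch of BFGS minimization combined with a nonmonotone Armijo rule; the function names, tolerances, and the Rosenbrock test problem are my own choices for the example, not taken from the publication.

```python
import numpy as np

def bfgs_nonmonotone(f, grad, x0, M=10, gamma=1e-4, max_iter=200, tol=1e-6):
    """Sketch of BFGS with a Grippo-style nonmonotone Armijo rule:
    a step alpha is accepted when f(x + alpha*d) is at most the maximum
    of the last M objective values plus gamma*alpha*g.d, rather than
    the usual monotone reference value f(x_k)."""
    n = len(x0)
    x = x0.astype(float)
    H = np.eye(n)                      # inverse-Hessian approximation
    g = grad(x)
    history = [f(x)]                   # recent f-values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # quasi-Newton search direction
        fref = max(history[-M:])       # nonmonotone reference value
        alpha, gd = 1.0, g @ d
        while f(x + alpha * d) > fref + gamma * alpha * gd:
            alpha *= 0.5               # backtracking
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        ys = y @ s
        if ys > 1e-12:                 # curvature condition; skip update otherwise
            rho = 1.0 / ys
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
        history.append(f(x))
    return x

# Usage: minimize the Rosenbrock function from the standard starting point
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_g = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                              200*(x[1] - x[0]**2)])
x_star = bfgs_nonmonotone(rosen, rosen_g, np.array([-1.2, 1.0]))
```

In the training setting of the cited publication, `f` and `grad` would instead be the recurrent network's error function and its gradient over the weight vector; the nonmonotone test lets early iterations accept occasional increases in the error, which the cited articles argue helps escape shallow local minima.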
Cited in (5)
- Multi-stage ordinal optimization based approach for job shop scheduling problems
- A novel method for speed training acceleration of recurrent neural networks
- Nonmonotone learning of recurrent neural networks in symbolic sequence processing applications
- Function Space BFGS Quasi-Newton Learning Algorithm for Time-Varying Recurrent Neural Networks
- An augmented Lagrangian method for training recurrent neural networks
This page was built for publication: Nonmonotone BFGS-trained recurrent neural networks for temporal sequence processing
(MaRDI item Q628894)