Minimization algorithms based on supervisor and searcher cooperation
From MaRDI portal
Publication:5956435
DOI: 10.1023/A:1011986402461
zbMath: 1032.90020
MaRDI QID: Q5956435
Publication date: 2001
Published in: Journal of Optimization Theory and Applications
Related Items
- Alternate step gradient method
- Adaptive finite element methods for the identification of elastic constants
- A descent algorithm without line search for unconstrained optimization
- Novel algorithms for noisy minimization problems with applications to neural networks training
- On the asymptotic behaviour of some new gradient methods
Uses Software
Cites Work
- A stochastic quasigradient algorithm with variable metric
- Stochastic approximation methods for constrained and unconstrained systems
- Optimization algorithm with probabilistic estimation
- Recent progress in unconstrained nonlinear optimization without derivatives
- Optimization via simulation: A review
- A method of trust region type for minimizing noisy functions
- R-linear convergence of the Barzilai and Borwein gradient method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- Testing Unconstrained Optimization Software
- Nelder-Mead Simplex Modifications for Simulation Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- On the Barzilai and Borwein choice of steplength for the gradient method
- Stochastic Estimation of the Maximum of a Regression Function
- A Stochastic Approximation Method
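Several of the cited works (Barzilai and Borwein's "Two-Point Step Size Gradient Methods" and the follow-up convergence papers) revolve around the BB step length, in which the step size is chosen from the two most recent iterates rather than by line search. As background only, and not as a description of the paper's supervisor-searcher scheme, a minimal sketch of the BB1 step on a toy strictly convex quadratic (the objective, starting point, and tolerances here are illustrative assumptions):

```python
def grad(x):
    # Gradient of the toy objective f(x) = 0.5 * (3*x0**2 + x1**2).
    return [3.0 * x[0], x[1]]

def bb_minimize(x, alpha0=0.1, tol=1e-10, max_iter=500):
    """Gradient descent with the Barzilai-Borwein (BB1) step length:
    alpha_k = (s.s) / (s.y), where s = x_k - x_{k-1}, y = g_k - g_{k-1}."""
    g = grad(x)
    alpha = alpha0  # the very first step uses a fixed guess
    for _ in range(max_iter):
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]
        y = [a - b for a, b in zip(g_new, g)]
        sy = sum(si * yi for si, yi in zip(s, y))
        if abs(sy) > 1e-16:
            alpha = sum(si * si for si in s) / sy  # BB1 step length
        x, g = x_new, g_new
        if max(abs(gi) for gi in g) < tol:
            break
    return x

x_star = bb_minimize([1.0, 1.0])
print(x_star)  # close to the minimizer [0, 0]
```

Note that BB iterations are typically nonmonotone in the objective value, which is why they are often paired with a nonmonotone line search such as the one in the cited Grippo-Lampariello-Lucidi technique.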