Minimization algorithms based on supervisor and searcher cooperation
Publication: 5956435
DOI: 10.1023/A:1011986402461
zbMATH Open: 1032.90020
MaRDI QID: Q5956435
Authors: Wenbin Liu, Yuhong Dai
Publication date: 2001
Published in: Journal of Optimization Theory and Applications
Recommendations
- Novel supervisor-searcher cooperation algorithms for minimization problems with strong noise
- Novel algorithms for noisy minimization problems with applications to neural networks training
- A stochastic steepest-descent algorithm
- Publication:3481506
- Stochastic approximation algorithm for minimax problems
Cites Work
- Recent progress in unconstrained nonlinear optimization without derivatives
- Testing Unconstrained Optimization Software
- Title not available
- Stochastic approximation methods for constrained and unconstrained systems
- Title not available
- A Stochastic Approximation Method
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- A Nonmonotone Line Search Technique for Newton’s Method
- Title not available
- Title not available
- Title not available
- Stochastic Estimation of the Maximum of a Regression Function
- \(R\)-linear convergence of the Barzilai and Borwein gradient method
- Nelder-Mead Simplex Modifications for Simulation Optimization
- On the Barzilai and Borwein choice of steplength for the gradient method
- Title not available
- Optimization via simulation: A review
- Optimization algorithm with probabilistic estimation
- A method of trust region type for minimizing noisy functions
- A stochastic quasigradient algorithm with variable metric
Cited In (16)
- On the Barzilai-Borwein basic scheme in FFT-based computational homogenization
- Prediction-correction method with BB step sizes
- On the asymptotic behaviour of some new gradient methods
- A new spectral method for \(l_1\)-regularized minimization
- Novel supervisor-searcher cooperation algorithms for minimization problems with strong noise
- Alternate step gradient method
- Novel algorithms for noisy minimization problems with applications to neural networks training
- Batch gradient method with smoothing \(L_{1/2}\) regularization for training of feedforward neural networks
- Adaptive finite element methods for the identification of elastic constants
- A descent algorithm without line search for unconstrained optimization
- LMBOPT: a limited memory method for bound-constrained optimization
- An active set method for bound-constrained optimization
- Fast methods for computing centroidal Laguerre tessellations for prescribed volume fractions with applications to microstructure generation of polycrystalline materials
- Inertial projection and contraction algorithms with larger step sizes for solving quasimonotone variational inequalities
- The Uzawa-MBB type algorithm for nonsymmetric saddle point problems
- Stable equilibrium configuration of two bar truss by an efficient nonmonotone global Barzilai-Borwein gradient method in a fuzzy environment