Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
From MaRDI portal
Recommendations
- Global Random Optimization by Simultaneous Perturbation Stochastic Approximation
- Global optimization by random perturbation of the gradient method with a fixed parameter
- A Stochastic Method for Constrained Global Optimization
- Stochastic Methods for Global Optimization
- A stochastic algorithm for constrained global optimization
- Stochastic methods for practical global optimization
- scientific article; zbMATH DE number 3924519
- scientific article; zbMATH DE number 3912117
- scientific article; zbMATH DE number 899666
Cites work
- scientific article; zbMATH DE number 2068079
- scientific article; zbMATH DE number 914364
- scientific article; zbMATH DE number 3278849
- A CARTopt method for bound-constrained global optimization
- A class of gradient unconstrained minimization algorithms with adaptive stepsize
- A conjugate gradient method with descent direction for unconstrained optimization
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- A modified PRP conjugate gradient method
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems
- A survey of nonlinear conjugate gradient methods
- A three-parameter family of nonlinear conjugate gradient methods
- A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations
- Algorithm 851
- Benchmarking optimization software with performance profiles
- Conjugate Directions without Linear Searches
- Conjugate gradient algorithms in nonconvex optimization
- Continuous global optimization through the generation of parametric curves
- Convergence of conjugate gradient methods with constant stepsizes
- Convergence of descent method without line search
- Differential evolution -- a simple and efficient heuristic for global optimization over continuous spaces
- Experimental testing of advanced scatter search designs for global optimization of multimodal functions
- Global convergence of conjugate gradient methods without line search
- Global optimization by random perturbation of the gradient method with a fixed parameter
- Global optimization. Scientific and engineering case studies
- Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems
- Nonlinear Programming
- Speeding up continuous GRASP
Cited in (10)
- A new modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent property and its global convergence
- A mixed algorithm for smooth global optimization
- A new family of hybrid three-term conjugate gradient methods with applications in image restoration
- Global optimization by random perturbation of the gradient method with a fixed parameter
- Global optimization using diffusion perturbations with large noise intensity
- A hybrid CG algorithm for nonlinear unconstrained optimization with application in image restoration
- A new modified three-term conjugate gradient method with sufficient descent property and its global convergence
- Implementation of reduced gradient with bisection algorithms for non-convex optimization problem via stochastic perturbation
- Using estimated gradients in bound-constrained global optimization
- A deterministic method for continuous global optimization using a dense curve
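The publication's core idea, restarting a Polak-Ribière conjugate gradient descent from randomly perturbed points so the iteration can escape local minima, can be sketched as follows. This is an illustrative reconstruction, not the paper's exact algorithm: the perturbation scheme, line search, restart count, and all parameter names (`n_restarts`, `sigma`, etc.) are assumptions.

```python
import numpy as np

def perturbed_pr_cg(f, grad, x0, n_restarts=20, max_iter=200,
                    sigma=0.5, tol=1e-8, seed=0):
    """Sketch: Polak-Ribiere CG local descent, restarted from
    Gaussian perturbations of the current best point to probe for
    better basins of attraction (illustrative, not the paper's method)."""
    rng = np.random.default_rng(seed)
    best_x, best_f = x0.copy(), f(x0)
    for _ in range(n_restarts):
        # stochastic perturbation of the current best point
        x = best_x + sigma * rng.standard_normal(x0.shape)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            # simple Armijo backtracking line search
            alpha, fx = 1.0, f(x)
            while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
                alpha *= 0.5
            x_new = x + alpha * d
            g_new = grad(x_new)
            if np.linalg.norm(g_new) < tol:
                x, g = x_new, g_new
                break
            # Polak-Ribiere coefficient, clipped at zero (the PR+ variant)
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))
            d = -g_new + beta * d
            if g_new @ d >= 0:
                # safeguard: fall back to steepest descent if d is not a descent direction
                d = -g_new
            x, g = x_new, g_new
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f
```

On a multimodal objective the repeated perturbations give the method a chance to leave a local minimizer, while each inner loop is a plain PR+ conjugate gradient descent.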
This page was built for publication: Global optimization through a stochastic perturbation of the Polak-Ribière conjugate gradient method
MaRDI item Q508045