CGRS -- an advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method
From MaRDI portal
Publication:679580
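The title describes a hybrid scheme coupling random search (for global exploration) with the conjugate gradient method (for local refinement). The page itself gives no algorithmic details, so the following is only a generic multistart sketch of that idea — Fletcher–Reeves nonlinear CG with a backtracking line search, restarted from random points — and all names, parameters, and tolerances here are illustrative assumptions, not the paper's actual CGRS procedure:

```python
import random

def grad(f, x, h=1e-6):
    """Central-difference gradient estimate (assumes f is smooth near x)."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def cg_minimize(f, x0, iters=100, tol=1e-8):
    """Fletcher-Reeves nonlinear CG with Armijo backtracking line search."""
    x = list(x0)
    g = grad(f, x)
    d = [-gi for gi in g]
    for _ in range(iters):
        gg = sum(gi * gi for gi in g)
        if gg < tol:          # gradient nearly zero: stationary point
            break
        fx = f(x)
        slope = sum(gi * di for gi, di in zip(g, d))
        t = 1.0
        # Backtrack until the Armijo sufficient-decrease condition holds.
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(f, x)
        beta = sum(gi * gi for gi in g_new) / gg   # Fletcher-Reeves update
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x, f(x)

def cgrs(f, bounds, restarts=20, seed=0):
    """Multistart: random points in the box, each refined by CG; keep the best."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(restarts):
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        x, fx = cg_minimize(f, x0)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

The random restarts supply the global-search component while CG does the fast local descent; the actual CGRS method in the cited publication couples the two phases more tightly than this independent-restart sketch does.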
Recommendations
- Continuous GRASP with a local active-set method for bound-constrained global optimization
- Global optimization by continuous GRASP
- Two descent hybrid conjugate gradient methods for optimization
- Genetic and Nelder--Mead algorithms hybridized for a more accurate global optimization of continuous multiminima functions
- Global convergence of a hybrid conjugate gradient method
Cites work
- Scientific article, zbMATH DE number 1728811 (no title available)
- Scientific article, zbMATH DE number 50672 (no title available)
- Scientific article, zbMATH DE number 3526471 (no title available)
- Scientific article, zbMATH DE number 1215248 (no title available)
- Scientific article, zbMATH DE number 1356716 (no title available)
- Scientific article, zbMATH DE number 1356719 (no title available)
- Scientific article, zbMATH DE number 2113770 (no title available)
- Scientific article, zbMATH DE number 775283 (no title available)
- Scientific article, zbMATH DE number 851215 (no title available)
- Scientific article, zbMATH DE number 3274494 (no title available)
- Scientific article, zbMATH DE number 3278849 (no title available)
- Scientific article, zbMATH DE number 961607 (no title available)
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A New Method of Constrained Optimization and a Comparison With Other Methods
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A class of globally convergent conjugate gradient methods
- A collection of test problems for constrained global optimization algorithms
- A combined global \& local search (CGLS) approach to global optimization
- A hybrid algorithm for identifying global and local minima when optimizing functions with many minima
- A hybrid descent method for global optimization
- A hybrid evolutionary algorithm for global optimization
- A new family of conjugate gradient methods
- A novel hybrid algorithm based on particle swarm and ant colony optimization for finding the global minimum
- A particle swarm pattern search method for bound constrained global optimization
- A review of recent advances in global optimization
- A survey of nonlinear conjugate gradient methods
- Adaptive Numerical Differentiation
- Algorithm 851
- An efficient algorithm for large scale global optimization of continuous functions
- Computing Forward-Difference Intervals for Numerical Optimization
- Conjugate gradient algorithms in nonconvex optimization
- Convergence Conditions for Ascent Methods
- Convergence guarantees for generalized adaptive stochastic search methods for continuous global optimization
- DESA: a new hybrid global optimization method and its application to analog integrated circuit sizing
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- Evaluating Derivatives
- Evolution strategies. A comprehensive introduction
- Function minimization by conjugate gradients
- Global convergence of general derivative-free trust-region algorithms to first- and second-order critical points
- Global convergence of the DY conjugate gradient method with Armijo line search for unconstrained optimization problems
- Global optimization based on local searches
- Handbook of global optimization. Vol. 2
- Hybrid simulated annealing and direct search method for nonlinear unconstrained global optimization
- Improving hit-and-run for global optimization
- Introduction to Derivative-Free Optimization
- Introduction to Stochastic Search and Optimization
- Methods of conjugate gradients for solving linear systems
- Minimization by Random Search Techniques
- Note on the Convergence of Simulated Annealing Algorithms
- Numerical Differentiation of Analytic Functions
- On the Convergence and Applications of Generalized Simulated Annealing
- On the Convergence of Pattern Search Algorithms
- On trust region methods for unconstrained minimization without derivatives
- Optimization by simulated annealing
- Optimization. Algorithms and consistent approximations
- Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
- Stochastic global optimization methods part II: Multi level methods
- The Limited Memory Conjugate Gradient Method
- The Theory and Practice of Simulated Annealing
- Trust-region methods without using derivatives: worst case complexity and the nonsmooth case
Cited in (2)