CGRS -- an advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method
DOI: 10.1016/J.CAM.2017.10.018 · zbMATH Open: 1390.90442 · OpenAlex: W2765173154 · MaRDI QID: Q679580
Authors: Christian Gnandt, R. Callies
Publication date: 11 January 2018
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/j.cam.2017.10.018
Recommendations
- Continuous GRASP with a local active-set method for bound-constrained global optimization
- Global optimization by continuous grasp
- Two descent hybrid conjugate gradient methods for optimization
- Genetic and Nelder--Mead algorithms hybridized for a more accurate global optimization of continuous multiminima functions.
- Global convergence of a hybrid conjugate gradient method
Keywords: global optimization; conjugate gradient method; convergence in probability; hybrid approach; random search; distribution-based region control
MSC classification: Approximation methods and heuristics in mathematical programming (90C59); Nonconvex programming, global optimization (90C26)
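The title and keywords describe a two-phase hybrid: a stochastic global phase (random search) that proposes candidate points, and a deterministic local phase (nonlinear conjugate gradients) that refines them. The sketch below illustrates this general class of hybrid in Python; it is not the CGRS algorithm of the paper, which couples the two phases far more closely and adds distribution-based region control. All function names and parameters here are illustrative assumptions.

```python
import random
import math

def num_grad(f, x, h=1e-6):
    """Central-difference gradient approximation."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def cg_refine(f, x0, iters=100, step=1e-2):
    """Nonlinear conjugate gradients (Fletcher-Reeves) with a crude
    backtracking line search; stands in for the local gradient phase."""
    x = list(x0)
    g = num_grad(f, x)
    d = [-gi for gi in g]
    for _ in range(iters):
        t, fx = step, f(x)
        # halve the step until a strict decrease is found
        while t > 1e-12 and f([xi + t * di for xi, di in zip(x, d)]) >= fx:
            t *= 0.5
        if t <= 1e-12:
            break
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = num_grad(f, x)
        beta = sum(gi * gi for gi in g_new) / max(sum(gi * gi for gi in g), 1e-30)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

def hybrid_minimize(f, bounds, n_starts=30, seed=0):
    """Global phase: random start points; local phase: CG refinement."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_starts):
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        x = cg_refine(f, x0)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# A standard multimodal test function (global minimum 0 at the origin).
def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

x_star, f_star = hybrid_minimize(rastrigin, [(-5.12, 5.12)] * 2)
```

A pure multistart scheme like this restarts CG from independent uniform samples; methods of the kind the title describes instead adapt the sampling distribution using information from earlier local searches, which is what makes the coupling "close".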
Cites Work
- Algorithm 851
- Stochastic global optimization methods part II: Multi level methods
- Optimization by simulated annealing
- Function minimization by conjugate gradients
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Convergence Conditions for Ascent Methods
- Methods of conjugate gradients for solving linear systems
- Introduction to Stochastic Search and Optimization
- Optimization. Algorithms and consistent approximations
- Evaluating Derivatives
- Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization
- A collection of test problems for constrained global optimization algorithms
- Handbook of global optimization. Vol. 2
- Evolution strategies. A comprehensive introduction
- A particle swarm pattern search method for bound constrained global optimization
- Global convergence of general derivative-free trust-region algorithms to first- and second-order critical points
- On the Convergence of Pattern Search Algorithms
- Introduction to Derivative-Free Optimization
- A survey of nonlinear conjugate gradient methods
- A review of recent advances in global optimization
- The Limited Memory Conjugate Gradient Method
- Derivative-free optimization: a review of algorithms and comparison of software implementations
- A combined global \& local search (CGLS) approach to global optimization
- Minimization by Random Search Techniques
- The Theory and Practice of Simulated Annealing
- A New Method of Constrained Optimization and a Comparison With Other Methods
- Convergence guarantees for generalized adaptive stochastic search methods for continuous global optimization
- Hybrid simulated annealing and direct search method for nonlinear unconstrained global optimization
- A class of globally convergent conjugate gradient methods
- A hybrid descent method for global optimization
- A new family of conjugate gradient methods
- Conjugate gradient algorithms in nonconvex optimization
- On trust region methods for unconstrained minimization without derivatives
- Improving hit-and-run for global optimization
- Note on the Convergence of Simulated Annealing Algorithms
- Numerical Differentiation of Analytic Functions
- A hybrid evolutionary algorithm for global optimization
- Global optimization based on local searches
- DESA: a new hybrid global optimization method and its application to analog integrated circuit sizing
- On the Convergence and Applications of Generalized Simulated Annealing
- Computing Forward-Difference Intervals for Numerical Optimization
- An efficient algorithm for large scale global optimization of continuous functions
- A hybrid algorithm for identifying global and local minima when optimizing functions with many minima.
- A novel hybrid algorithm based on particle swarm and ant colony optimization for finding the global minimum
- Trust-region methods without using derivatives: worst case complexity and the nonsmooth case
- Adaptive Numerical Differentiation
- Global convergence of the DY conjugate gradient method with Armijo line search for unconstrained optimization problems
Cited In (2)
Uses Software