Gradient-based cuckoo search for global optimization (Q1718536)
From MaRDI portal
Property / full work available at URL: https://doi.org/10.1155/2014/493740
Property / OpenAlex ID: W1995335159
Property / cites work: A review of recent advances in global optimization
Property / cites work: Derivative-free optimization: a review of algorithms and comparison of software implementations
Property / cites work: Engineering optimisation by cuckoo search
Language | Label | Description | Also known as
---|---|---|---
English | Gradient-based cuckoo search for global optimization | scientific article |
Statements
Gradient-based cuckoo search for global optimization (English)
8 February 2019
Summary: One of the major advantages of stochastic global optimization methods is that they do not require the gradient of the objective function. However, in some cases this gradient is readily available and can be used to improve the numerical performance of stochastic optimization methods, especially the quality and precision of the global optimal solution. In this study, we propose a gradient-based modification to the cuckoo search algorithm, a nature-inspired, swarm-based stochastic global optimization method. We introduce the gradient-based cuckoo search (GBCS) and evaluate its performance against the original algorithm on twenty-four benchmark functions. GBCS improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems. GBCS proved to be a strong candidate for solving difficult optimization problems for which the gradient of the objective function is readily available.
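The record carries only the abstract, not the algorithm itself. As a rough illustration of the idea described in the summary (injecting gradient information into the Lévy-flight moves of cuckoo search), here is a minimal Python sketch. The function names (`gbcs`, `levy_step`), the parameter defaults, and the particular sign-based gradient bias are illustrative assumptions, not the authors' implementation from the paper.

```python
import math
import numpy as np


def levy_step(dim, beta=1.5, rng=None):
    """Draw a Lévy-flight step via Mantegna's algorithm."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)


def gbcs(f, grad_f, bounds, n_nests=25, pa=0.25, alpha=0.01, max_iter=500, seed=0):
    """Cuckoo search with a gradient-biased Lévy-flight step (illustrative sketch).

    f      : objective, 1-D array -> float
    grad_f : gradient of f, 1-D array -> 1-D array
    bounds : (lower, upper) arrays defining box constraints
    """
    rng = np.random.default_rng(seed)
    lo, hi = (np.asarray(b, dtype=float) for b in bounds)
    dim = lo.size
    nests = rng.uniform(lo, hi, (n_nests, dim))
    fitness = np.apply_along_axis(f, 1, nests)

    for _ in range(max_iter):
        best = nests[np.argmin(fitness)].copy()
        # Lévy-flight phase: propose a new solution for each nest.
        for i in range(n_nests):
            step = alpha * levy_step(dim, rng=rng) * (nests[i] - best)
            # Hypothetical gradient bias: keep the random magnitude but point
            # each component downhill according to the sign of the gradient.
            step = -np.sign(grad_f(nests[i])) * np.abs(step)
            cand = np.clip(nests[i] + step, lo, hi)
            f_cand = f(cand)
            if f_cand < fitness[i]:
                nests[i], fitness[i] = cand, f_cand
        # Abandonment phase: a fraction pa of nests is replaced by random mixing.
        d1, d2 = rng.permutation(n_nests), rng.permutation(n_nests)
        new = np.clip(nests + rng.random((n_nests, 1)) * (nests[d1] - nests[d2]), lo, hi)
        for i in np.where(rng.random(n_nests) < pa)[0]:
            f_new = f(new[i])
            if f_new < fitness[i]:
                nests[i], fitness[i] = new[i], f_new

    i_best = np.argmin(fitness)
    return nests[i_best], fitness[i_best]


if __name__ == "__main__":
    # Demo on the sphere function, whose gradient is available in closed form.
    sphere = lambda x: float(np.sum(x ** 2))
    sphere_grad = lambda x: 2.0 * x
    x_best, f_best = gbcs(sphere, sphere_grad,
                          (np.full(5, -10.0), np.full(5, 10.0)))
    print(x_best, f_best)
```

The gradient enters only through its sign here, so the stochastic character of the Lévy flights is preserved while each proposed move is nudged downhill; this is one plausible reading of the summary, not the specific modification defined in the article.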