Gradient-based cuckoo search for global optimization (Q1718536)

From MaRDI portal
Property / Wikidata QID: Q59067828
Property / describes a project that uses: MultiMin
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1155/2014/493740
Property / OpenAlex ID: W1995335159
Property / cites work: A review of recent advances in global optimization
Property / cites work: Derivative-free optimization: a review of algorithms and comparison of software implementations
Property / cites work: Engineering optimisation by cuckoo search

Latest revision as of 02:28, 18 July 2024

Label (English): Gradient-based cuckoo search for global optimization
Description: scientific article
    Statements

    Title: Gradient-based cuckoo search for global optimization (English)
    Publication date: 8 February 2019
    Summary: A major advantage of stochastic global optimization methods is that they do not require the gradient of the objective function. In some cases, however, this gradient is readily available and can be used to improve the numerical performance of stochastic optimization methods, especially the quality and precision of the global optimal solution. In this study, we proposed a gradient-based modification to the cuckoo search algorithm, a nature-inspired, swarm-based stochastic global optimization method. We introduced the gradient-based cuckoo search (GBCS) and evaluated its performance against the original algorithm on twenty-four benchmark functions. GBCS improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems, and it proved to be a strong candidate for solving difficult optimization problems for which the gradient of the objective function is readily available.
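    The idea summarized above — standard cuckoo search with Lévy flights, plus a gradient-informed step for the best nest — can be sketched as follows. This is a minimal illustrative implementation, not the authors' exact GBCS algorithm: the parameter values, the Mantegna Lévy-flight form, and the way the gradient's sign orients the best nest's step are assumptions made for the sketch.

    ```python
    import numpy as np
    from math import gamma, sin, pi

    def levy(shape, beta=1.5, rng=None):
        # Mantegna's algorithm for Levy-distributed step lengths
        rng = rng or np.random.default_rng()
        sigma = (gamma(1 + beta) * sin(pi * beta / 2)
                 / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
        u = rng.normal(0.0, sigma, shape)
        v = rng.normal(0.0, 1.0, shape)
        return u / np.abs(v) ** (1 / beta)

    def gbcs(f, grad, lo, hi, n=25, pa=0.25, alpha=0.01, iters=800, seed=1):
        """Cuckoo search with a gradient-guided step on the best nest (sketch)."""
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(lo, float), np.asarray(hi, float)
        d = lo.size
        nests = rng.uniform(lo, hi, (n, d))
        fits = np.array([f(x) for x in nests])
        for _ in range(iters):
            best = nests[fits.argmin()].copy()
            # Levy-flight exploration biased toward the current best nest
            cand = np.clip(nests + alpha * levy((n, d), rng=rng) * (nests - best), lo, hi)
            cf = np.array([f(x) for x in cand])
            imp = cf < fits
            nests[imp], fits[imp] = cand[imp], cf[imp]
            # gradient-based modification (assumed form): step the best nest
            # downhill, using the gradient's sign to orient a Levy-sized step
            i = fits.argmin()
            trial = np.clip(nests[i] - alpha * np.sign(grad(nests[i]))
                            * np.abs(levy(d, rng=rng)), lo, hi)
            ft = f(trial)
            if ft < fits[i]:
                nests[i], fits[i] = trial, ft
            # abandon a fraction pa of nest components, keeping the elite
            elite, ef = nests[fits.argmin()].copy(), fits.min()
            mask = rng.random((n, d)) < pa
            nests = np.where(mask, rng.uniform(lo, hi, (n, d)), nests)
            fits = np.array([f(x) for x in nests])
            w = fits.argmax()
            if ef < fits[w]:
                nests[w], fits[w] = elite, ef
        i = fits.argmin()
        return nests[i], fits[i]
    ```

    On a smooth test problem such as the sphere function, the gradient step steadily refines the best nest while the Lévy flights keep exploring, which mirrors the reliability-plus-precision argument in the summary.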
