Stochastic multiple chaotic local search-incorporated gradient-based optimizer (Q2065423)

From MaRDI portal
 
Cites work

    Differential evolution -- a simple and efficient heuristic for global optimization over continuous spaces
    A hybrid evolution strategy for the open vehicle routing problem
    A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm
    GSA: A gravitational search algorithm
    Balancing exploration and exploitation in multiobjective evolutionary optimization
    Exploration and exploitation in evolutionary algorithms
    Gravitational search algorithm combined with chaos for unconstrained numerical optimization
    Gradient-based optimizer: a new metaheuristic optimization algorithm
    Fundamentals of synchronization in chaotic systems, concepts, and applications
    Gaussian MAP Filtering Using Kalman Optimization
    A new neuron model under electromagnetic field

Latest revision as of 16:04, 27 July 2024

scientific article

    Language: English
    Label: Stochastic multiple chaotic local search-incorporated gradient-based optimizer
    Description: scientific article

    Statements

    Stochastic multiple chaotic local search-incorporated gradient-based optimizer (English)
    7 January 2022
    Summary: In this study, a hybrid metaheuristic algorithm, the chaotic gradient-based optimizer (CGBO), is proposed. The gradient-based optimizer (GBO) is a novel metaheuristic inspired by Newton's method that relies on two search strategies to ensure strong performance: the gradient search rule (GSR) and the local escaping operation (LEO). The GSR uses gradient information to enhance exploitation ability and convergence rate, while the LEO employs random operators to escape local optima. It has been verified that gradient-based metaheuristic algorithms have clear shortcomings in exploration. Meanwhile, chaotic local search (CLS) is an efficient search strategy with randomicity and ergodicity that is often used to improve global optimization algorithms. Accordingly, we incorporate CLS into GBO to strengthen its exploration ability and maintain high population diversity. In this study, CGBO is tested on the 30 CEC2017 benchmark functions and on a parameter optimization problem of the dendritic neuron model (DNM). Experimental results indicate that CGBO outperforms other state-of-the-art algorithms in terms of effectiveness and robustness.
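The summary above describes incorporating chaotic local search (CLS) into a base optimizer to preserve diversity. As a rough illustration of the general idea only (not the paper's exact CGBO scheme), the following sketch perturbs a current best solution with a logistic-map chaotic sequence and greedily accepts improvements; all names, parameters, and the shrinking-radius rule here are illustrative assumptions.

```python
import random

def logistic_map(x, mu=4.0):
    # Classic logistic map; mu = 4 gives chaotic, ergodic behavior on (0, 1)
    return mu * x * (1.0 - x)

def chaotic_local_search(start, bounds, objective, iters=50):
    """Illustrative CLS sketch: perturb the incumbent solution with a
    chaotic sequence, greedily keeping any improvement found."""
    random.seed(1)  # fixed seed so the toy run is reproducible
    dim = len(start)
    # seed the chaotic variables away from the map's fixed points (0, 0.5, 1)
    z = [random.uniform(0.01, 0.49) for _ in range(dim)]
    best = list(start)
    best_f = objective(best)
    for t in range(iters):
        z = [logistic_map(zi) for zi in z]
        radius = 1.0 - t / iters  # shrink the perturbation over time
        cand = [
            # move toward a chaotic point mapped into [lo, hi], then clamp
            min(max(b + radius * (lo + zi * (hi - lo) - b), lo), hi)
            for b, zi, (lo, hi) in zip(best, z, bounds)
        ]
        f = objective(cand)
        if f < best_f:  # greedy acceptance
            best, best_f = cand, f
    return best, best_f

# toy usage: refine a point under the sphere function
sphere = lambda x: sum(v * v for v in x)
sol, val = chaotic_local_search([0.5, -0.3], [(-1, 1), (-1, 1)], sphere)
```

Because the logistic map with mu = 4 is ergodic on (0, 1), the chaotic sequence visits the scaled search range densely, which is the randomicity-plus-ergodicity property the summary attributes to CLS.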
