Stagnation detection with randomized local search
Abstract: Recently a mechanism called stagnation detection was proposed that automatically adjusts the mutation rate of evolutionary algorithms when they encounter local optima. The so-called SD-(1+1) EA introduced by Rajabi and Witt (GECCO 2020) adds stagnation detection to the classical (1+1) EA with standard bit mutation, which flips each bit independently with some mutation rate, and raises the mutation rate when the algorithm is likely to have encountered local optima. In this paper, we investigate stagnation detection in the context of the \(k\)-bit flip operator of randomized local search, which flips \(k\) bits chosen uniformly at random, and let stagnation detection adjust the parameter \(k\). We obtain improved runtime results compared to the SD-(1+1) EA, amounting to a speed-up of up to a factor of \(e \approx 2.72\). Moreover, we propose additional schemes that prevent infinite optimization times even if the algorithm misses a working choice of \(k\) due to unlucky events. Finally, we present an example where standard bit mutation still outperforms the local \(k\)-bit flip with stagnation detection.
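The core idea described in the abstract can be illustrated with a minimal sketch: flip exactly \(k\) uniformly chosen bits, count consecutive non-improving steps, and raise \(k\) once the counter exceeds a threshold tied to the success probability at the current strength. The function name `sd_rls`, the default for `R`, and the exact threshold formula below are illustrative assumptions, not the paper's precise scheme, and the paper's "additional schemes" (reverting strategies that prevent infinite optimization times) are omitted here.

```python
import math
import random

def sd_rls(fitness, n, max_evals=100_000, R=None):
    """Randomized local search with stagnation detection (sketch).

    Flips exactly k bits chosen uniformly at random; when no strict
    improvement occurs for roughly C(n, k) * ln(R) steps, a k-bit
    improvement is unlikely to exist, so the strength k is raised.
    Illustrative only: the paper's robust variants additionally revert
    k to avoid infinite runs when a working strength is skipped.
    """
    if R is None:
        R = n ** 3  # controls the failure probability; illustrative default
    x = [random.randint(0, 1) for _ in range(n)]
    fx = fitness(x)
    k, fails = 1, 0
    for _ in range(max_evals):
        y = x[:]
        for i in random.sample(range(n), k):  # flip exactly k distinct bits
            y[i] ^= 1
        fy = fitness(y)
        if fy > fx:  # strict improvement: accept and reset the strength
            x, fx, k, fails = y, fy, 1, 0
        else:
            fails += 1
            # after ~C(n, k) * ln(R) failures, escalate to (k+1)-bit flips
            if fails > math.comb(n, k) * math.log(R):
                k, fails = min(k + 1, n), 0
    return x, fx
```

On a unimodal function such as OneMax (`fitness = sum`), strength 1 suffices and the counter rarely triggers; on a function with a local optimum at Hamming distance \(m\) from improvements, the counter escalates the strength until \(k = m\), which is the source of the speed-up over raising the mutation rate of standard bit mutation.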
Recommendations
- Principles of Stochastic Local Search
- A stochastic local search algorithm for constrained continuous global optimization
- Detection and Remediation of Stagnation in the Nelder--Mead Algorithm Using a Sufficient Decrease Condition
- Stopping and restarting strategy for stochastic sequential search in global optimization
- Stagnation detection meets fast mutation
- Sequential Stopping Rules for Random Optimization Methods with Applications to Multistart Local Search
- Stochastic local search. Foundations and applications.
Cites work
- Optimal static and self-adjusting parameter choices for the \((1+(\lambda,\lambda))\) genetic algorithm
- Randomized local search, evolutionary algorithms, and the minimum spanning tree problem
- The \((1+\lambda)\) evolutionary algorithm with self-adjusting mutation rate
- The impact of random initialization on the runtime of randomized search heuristics
- Theory of evolutionary computation. Recent developments in discrete optimization
Cited in (10)
- A rigorous runtime analysis of the \((1 + (\lambda, \lambda))\) GA on jump functions
- Lazy parameter tuning and control: choosing all parameters randomly from a power-law distribution
- Lower bounds from fitness levels made easy
- An extended jump functions benchmark for the analysis of randomized search heuristics
- Self-adjusting evolutionary algorithms for multimodal optimization
- When move acceptance selection hyper-heuristics outperform metropolis and elitist evolutionary algorithms and when not
- Do additional target points speed up evolutionary algorithms?
- How majority-vote crossover and estimation-of-distribution algorithms cope with fitness valleys
- Stagnation detection meets fast mutation
This page was built for publication: Stagnation detection with randomized local search (MaRDI item Q2233520)