The \((1+\lambda)\) evolutionary algorithm with self-adjusting mutation rate
From MaRDI portal
Abstract: We propose a new way to self-adjust the mutation rate in population-based evolutionary algorithms in discrete search spaces. Roughly speaking, it consists of creating half the offspring with a mutation rate that is twice the current mutation rate and the other half with half the current rate. The mutation rate is then updated to the rate used in that subpopulation which contains the best offspring. We analyze how the \((1+\lambda)\) evolutionary algorithm with this self-adjusting mutation rate optimizes the OneMax test function. We prove that this dynamic version of the \((1+\lambda)\) EA finds the optimum in an expected optimization time (number of fitness evaluations) of \(O(n\lambda/\log\lambda + n\log n)\). This time is asymptotically smaller than the optimization time of the classic \((1+\lambda)\) EA. Previous work shows that this performance is best-possible among all \(\lambda\)-parallel mutation-based unbiased black-box algorithms. This result shows that the new way of adjusting the mutation rate can find optimal dynamic parameter values on the fly. Since our adjustment mechanism is simpler than the ones previously used for adjusting the mutation rate and does not have parameters itself, we are optimistic that it will find other applications.
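The two-rate mechanism from the abstract can be sketched in a few lines of Python. This is a simplified illustration, not the paper's exact algorithm: the function names are ours, the rate cap \([2, n/4]\) is a common choice assumed here, and the paper's variant additionally randomizes the rate update step.

```python
import random

def onemax(x):
    # OneMax fitness: the number of one-bits in the bit string
    return sum(x)

def mutate(parent, p):
    # Standard bit mutation: flip each bit independently with probability p
    return [1 - b if random.random() < p else b for b in parent]

def self_adjusting_ea(n, lam, max_evals=200_000):
    # Sketch of a (1+lambda) EA with the two-rate self-adjusting mutation
    # rate on OneMax. Half the offspring use half the current rate, the
    # other half use twice it; the rate then follows the best offspring.
    parent = [random.randint(0, 1) for _ in range(n)]
    fit = onemax(parent)
    r = 2.0                           # mutation strength; mutation rate is r/n
    evals = 0
    while fit < n and evals < max_evals:
        offspring = []
        for i in range(lam):
            # first half of the offspring: rate r/(2n); second half: 2r/n
            ri = r / 2 if i < lam // 2 else 2 * r
            child = mutate(parent, ri / n)
            offspring.append((onemax(child), ri, child))
            evals += 1
        best_fit, best_r, best_child = max(offspring, key=lambda t: t[0])
        if best_fit >= fit:           # elitist selection, ties accepted
            parent, fit = best_child, best_fit
        # adopt the strength used in the subpopulation containing the best
        # offspring, capped to [2, n/4] (an assumption of this sketch)
        r = min(max(best_r, 2.0), n / 4)
    return fit, evals
```

Note the update of `r` happens even in iterations where no offspring beats the parent; only the search point itself is kept elitist.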
Recommendations
- Runtime analysis for self-adaptive mutation rates
- Mutation rate control in the \((1+\lambda)\) evolutionary algorithm with a self-adjusting lower bound
- Optimal mutation rates for the \((1+\lambda)\) EA on OneMax through asymptotically tight drift analysis
- Self-adjusting mutation rates with provably optimal success rules
- The interplay of population size and mutation probability in the \((1+\lambda)\) EA on OneMax
Cites work
- scientific article; zbMATH DE number 1962832
- scientific article; zbMATH DE number 5686753
- \((1+1)\) EA on Generalized Dynamic OneMax
- A Remark on Stirling's Formula
- A runtime analysis of simple hyper-heuristics: to mix or not to mix operators
- Adaptive population models for offspring populations and parallel evolutionary algorithms
- An elementary analysis of the probability that a binomial random variable exceeds its expectation
- Analyzing evolutionary algorithms. The computer science perspective.
- Analyzing randomized search heuristics: tools from probability theory
- Automata, Languages and Programming
- Bioinspired computation in combinatorial optimization. Algorithms and their computational complexity
- Black-box Complexity of Parallel Search with Distributed Populations
- Concentrated hitting times of randomized search heuristics with variable drift
- From black-box complexity to designing new genetic algorithms
- Mean, Median and Mode in Binomial Distributions
- Multiplicative drift analysis
- Non-uniform mutation rates for problems with unknown solution lengths
- On the analysis of a dynamic evolutionary algorithm
- On the analysis of the \((1+1)\) evolutionary algorithm
- On the utility of the population size for inversely fitness proportional mutation rates
- Optimal parameter choices via precise black-box analysis
- Optimizing linear functions with the \((1 + \lambda)\) evolutionary algorithm -- different asymptotic runtimes for different instances
- Randomized local search, evolutionary algorithms, and the minimum spanning tree problem
- Runtime analysis for self-adaptive mutation rates
- Solving problems with unknown solution length at almost no extra cost
- The \((1+\lambda)\) evolutionary algorithm with self-adjusting mutation rate
- The interplay of population size and mutation probability in the \((1+\lambda)\) EA on OneMax
- Theoretical analysis of local search strategies to optimize network communication subject to preserving the total number of links
- Tight bounds for blind search on the integers and the reals
Cited in (24)
- A tight runtime analysis for the \((\mu + \lambda)\) EA
- The \((1+\lambda)\) evolutionary algorithm with self-adjusting mutation rate
- Optimal static and self-adjusting parameter choices for the \((1+(\lambda,\lambda))\) genetic algorithm
- Optimal mutation rates for the \((1+\lambda)\) EA on OneMax through asymptotically tight drift analysis
- Optimal parameter choices via precise black-box analysis
- Crossover can guarantee exponential speed-ups in evolutionary multi-objective optimisation
- The cost of randomness in evolutionary algorithms: crossover can save random bits
- The interplay of population size and mutation probability in the \((1+\lambda)\) EA on OneMax
- Self-adjusting mutation rates with provably optimal success rules
- Runtime analysis for self-adaptive mutation rates
- Static and self-adjusting mutation strengths for multi-valued decision variables
- Stagnation detection with randomized local search
- Choosing the right algorithm with hints from complexity theory
- Self-adjusting population sizes for the \((1,\lambda)\)-EA on monotone functions
- An experimental study of operator choices in the \((1+(\lambda,\lambda))\) genetic algorithm
- Self-adjusting population sizes for non-elitist evolutionary algorithms: why success rates matter
- Mutation rate control in the \((1+\lambda)\) evolutionary algorithm with a self-adjusting lower bound
- Self-adjusting evolutionary algorithms for multimodal optimization
- Hardest monotone functions for evolutionary algorithms
- Self-adjusting offspring population sizes outperform fixed parameters on the Cliff function
- The “One-fifth Rule” with Rollbacks for Self-Adjustment of the Population Size in the (1 + (λ,λ)) Genetic Algorithm
- Stagnation detection meets fast mutation
- On the impact of the mutation-selection balance on the runtime of evolutionary algorithms
This page was built for publication: The \((1+\lambda)\) evolutionary algorithm with self-adjusting mutation rate
MaRDI item Q1725645