Robustness of populations in stochastic environments (Q306488)
From MaRDI portal
scientific article

Language | Label | Description | Also known as
---|---|---|---
English | Robustness of populations in stochastic environments | scientific article |
Statements
Robustness of populations in stochastic environments (English)
31 August 2016
The efficiency of evolutionary algorithms (EAs) depends heavily on the exact set-up. The authors study in detail how EAs behave under noise: the fitness function, which measures the quality of a candidate solution, can be evaluated only with some uncertainty. One important aspect of an EA is its population size: since the fitness of all members must be evaluated, it is advantageous to keep the population small without sacrificing convergence.

The authors study this for the optimization of two well-known benchmark functions, LeadingOnes (which counts the length of the initial run of ones in a bit string) and OneMax (which counts the total number of ones, irrespective of their positions). Lower and upper bounds on the runtime are proved for different noise settings (e.g., varying noise distributions). One of the results of the paper is that LeadingOnes is much more sensitive to prior noise than OneMax. To give a flavour of the type of results: if the noise is sampled from a Gaussian distribution whose variance is less than the inverse of a logarithmic expression in \(n\), then optimization succeeds in polynomial time; if, however, the variance is larger (by a suitable constant factor), then optimization takes superpolynomial time.

Detailed proofs are given, which will be particularly interesting for readers who want to transfer the techniques to other problems. The paper contains 25 theorems and corollaries, which are difficult to summarize; the main result, in the authors' own words, states ``that populations are necessary for successful optimization for any substantial noise levels. The surprising result is that even very small populations (of size logarithmic in the problem size) already lead to very high robustness to noise.''
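To make the setting concrete, here is a minimal Python sketch of the two benchmark functions and a simple (μ+1)-style EA optimizing them under additive Gaussian noise on the fitness value. This is an illustration of the general framework only, not the authors' exact algorithm or noise model (the paper also treats prior noise and other distributions); the function names, selection scheme, and parameters are assumptions chosen for clarity.

```python
import random

def one_max(x):
    """OneMax: number of ones in the bit string, irrespective of position."""
    return sum(x)

def leading_ones(x):
    """LeadingOnes: length of the initial run of consecutive ones."""
    count = 0
    for bit in x:
        if bit == 0:
            break
        count += 1
    return count

def mu_plus_one_ea(f, n, mu, sigma, max_evals=100_000, seed=0):
    """(mu+1) EA sketch: keep mu bit strings; each step, mutate a random
    parent (flip each bit with probability 1/n), then discard the
    individual with the worst *noisy* fitness, where every evaluation
    is perturbed by Gaussian noise with standard deviation sigma.
    Returns the number of noisy evaluations used, or None on timeout."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    evals = 0
    while evals < max_evals:
        # True-fitness optimum check, for demonstration purposes only.
        if any(f(x) == n for x in pop):
            return evals
        parent = rng.choice(pop)
        child = [1 - b if rng.random() < 1.0 / n else b for b in parent]
        pop.append(child)
        # Re-evaluate everyone under noise and remove the worst.
        noisy_vals = [f(x) + rng.gauss(0.0, sigma) for x in pop]
        evals += len(pop)
        pop.pop(noisy_vals.index(min(noisy_vals)))
    return None
```

With sigma = 0 the sketch reduces to a standard noise-free (μ+1) EA; increasing sigma lets one observe experimentally the kind of noise sensitivity the paper analyzes rigorously, e.g. comparing `mu_plus_one_ea(one_max, ...)` against `mu_plus_one_ea(leading_ones, ...)` for growing sigma.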
Keywords: runtime analysis; stochastic fitness function; evolutionary algorithm; population size; robustness