Theory of Evolutionary Computation

Publication: 5241813

DOI: 10.1007/978-3-030-29414-4 · zbMath: 1429.68004 · OpenAlex: W4240139199 · MaRDI QID: Q5241813

No author found.

Publication date: 1 November 2019

Published in: Natural Computing Series

Full work available at URL: https://doi.org/10.1007/978-3-030-29414-4




Related Items

Tail bounds on hitting times of randomized search heuristics using variable drift analysis
A rigorous runtime analysis of the \((1 + (\lambda, \lambda))\) GA on jump functions
Tight bounds on the expected runtime of a standard steady state genetic algorithm
Does comma selection help to cope with local optima?
Self-adjusting evolutionary algorithms for multimodal optimization
Fast mutation in crossover-based algorithms
Fixed-target runtime analysis
The “One-fifth Rule” with Rollbacks for Self-Adjustment of the Population Size in the \((1 + (\lambda, \lambda))\) Genetic Algorithm
Runtime analysis for self-adaptive mutation rates
A tight runtime analysis for the \((\mu + \lambda)\) EA
Reversible random walks on dynamic graphs
When move acceptance selection hyper-heuristics outperform metropolis and elitist evolutionary algorithms and when not
Multi-objective evolutionary algorithms are generally good: maximizing monotone submodular functions over sequences
First Steps Towards a Runtime Analysis of Neuroevolution
Runtime Analysis of a Co-Evolutionary Algorithm
OneMax is not the easiest function for fitness improvements
The cost of randomness in evolutionary algorithms: crossover can save random bits
(1+1) genetic programming with functionally complete instruction sets can evolve Boolean conjunctions and disjunctions with arbitrarily small error
Bivariate estimation-of-distribution algorithms can find an exponential number of optima
Exact Markov chain-based runtime analysis of a discrete particle swarm optimization algorithm on sorting and OneMax
Lower bounds from fitness levels made easy
Lazy parameter tuning and control: choosing all parameters randomly from a power-law distribution
Self-adjusting population sizes for non-elitist evolutionary algorithms: why success rates matter
Focused jump-and-repair constraint handling for fixed-parameter tractable graph problems closed under induced subgraphs
An extended jump functions benchmark for the analysis of randomized search heuristics
Simulated annealing is a polynomial-time approximation scheme for the minimum spanning tree problem
Runtime analysis for permutation-based evolutionary algorithms
The average distance and the diameter of dense random regular graphs
Do additional target points speed up evolutionary algorithms?
Curing Ill-Conditionality via Representation-Agnostic Distance-Driven Perturbations
Fast Convergence of k-Opinion Undecided State Dynamics in the Population Protocol Model
How majority-vote crossover and estimation-of-distribution algorithms cope with fitness valleys
On the benefits of populations for the exploitation speed of standard steady-state genetic algorithms
Exponential upper bounds for the runtime of randomized search heuristics
A simplified run time analysis of the univariate marginal distribution algorithm on LeadingOnes
Runtime analysis of evolutionary algorithms via symmetry arguments
Stagnation detection with randomized local search
On negative dependence properties of Latin hypercube samples and scrambled nets
Multiplicative up-drift
The runtime of the compact genetic algorithm on jump functions
Self-adjusting mutation rates with provably optimal success rules
Time complexity analysis of randomized search heuristics for the dynamic graph coloring problem
Stagnation detection meets fast mutation
Mutation Rate Control in the \((1+\lambda)\) Evolutionary Algorithm with a Self-adjusting Lower Bound




This page was built for publication: Theory of Evolutionary Computation