Designing Superior Evolutionary Algorithms via Insights From Black-Box Complexity Theory

Abstract : It has been observed that the runtime of randomized search heuristics depends on one or more parameters. A number of results show an advantage of dynamic parameter settings, that is, settings in which the parameters of the algorithm are changed during its execution. In this work, we prove that the unary unbiased black-box complexity of the OneMax benchmark function class is $n \ln(n) - cn \pm o(n)$ for a constant $c$ between $0.2539$ and $0.2665$. This runtime can be achieved with a simple (1+1)-type algorithm using a fitness-dependent mutation strength. When translated into the fixed-budget perspective, our algorithm finds solutions which are roughly 13% closer to the optimum than those of the best previously known algorithms.

Based on the analyzed optimal mutation strength for OneMax, we show that a self-adjusting choice of the number of bits to be flipped attains the same runtime (apart from $o(n)$ lower-order terms) and the same (asymptotic) 13% fitness-distance improvement over randomized local search (RLS). The adjusting mechanism adaptively learns the currently optimal mutation strength from previous iterations. This aims at exploiting both that different problems may require different mutation strengths and that, for a fixed problem, different strengths may be optimal in different stages of the optimization process.

We then extend our self-adjusting strategy to population-based evolutionary algorithms in discrete search spaces. Roughly speaking, it consists of creating half the offspring with a mutation rate that is twice the current mutation rate and the other half with half the current rate. The mutation rate is then updated to the rate used in the subpopulation which contains the best offspring. We analyze how the $(1+\lambda)$ evolutionary algorithm with this self-adjusting mutation rate optimizes the OneMax test function. We prove that this dynamic version of the $(1+\lambda)$ EA finds the optimum in an expected optimization time (number of fitness evaluations) of $O(n\lambda/\log\lambda + n\log n)$. This time is asymptotically smaller than the optimization time of the classic $(1+\lambda)$ EA. Previous work shows that this performance is best possible among all $\lambda$-parallel mutation-based unbiased black-box algorithms.

We also propose and analyze a self-adaptive version of the $(1,\lambda)$ evolutionary algorithm in which the current mutation rate is part of the individual and thus also subject to mutation. A rigorous runtime analysis on the OneMax benchmark function reveals that a simple local mutation scheme for the rate leads to an expected optimization time of the best possible $O(n\lambda/\log\lambda + n\log n)$. Our result shows that self-adaptation in evolutionary computation can find complex optimal parameter settings on the fly. At the same time, it proves that a relatively complicated self-adjusting scheme for the mutation rate can be replaced by our simple endogenous scheme.
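To illustrate the two-rate self-adjusting mechanism described above, the following is a minimal Python sketch of a $(1+\lambda)$ EA on OneMax. It follows only the verbal description in the abstract; the initial rate, the bounds $[2, n/4]$ on the rate, and the purely greedy rate update are illustrative assumptions, not the exact scheme analyzed in the thesis.

import random

def onemax(x):
    # OneMax fitness: number of one-bits in the bit string
    return sum(x)

def two_rate_one_plus_lambda_ea(n, lam, max_evals=1_000_000):
    # Sketch of a two-rate self-adjusting (1+lambda) EA on OneMax.
    parent = [random.randint(0, 1) for _ in range(n)]
    rate = 2.0                      # current mutation rate r; each bit flips with probability r/n
    evals = 0
    while onemax(parent) < n and evals < max_evals:
        best_child, best_fit, best_rate = None, -1, rate
        for i in range(lam):
            # first half of the offspring mutates with rate 2r, second half with r/2
            r = 2 * rate if i < lam // 2 else rate / 2
            child = [1 - b if random.random() < r / n else b for b in parent]
            fit, evals = onemax(child), evals + 1
            if fit > best_fit:
                best_child, best_fit, best_rate = child, fit, r
        if best_fit >= onemax(parent):   # elitist (plus) selection
            parent = best_child
        # adopt the rate of the subpopulation that produced the best offspring,
        # kept inside the assumed bounds [2, n/4]
        rate = min(max(best_rate, 2.0), n / 4)
    return parent, evals

The update rule shown here always copies the rate of the best offspring; the scheme analyzed in the thesis may differ in such details.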
Document type :
Theses
Cited literature [74 references]

https://pastel.archives-ouvertes.fr/tel-01947767
Contributor : Abes Star
Submitted on : Tuesday, September 24, 2019 - 10:02:08 AM
Last modification on : Tuesday, September 24, 2019 - 10:02:08 AM
Long-term archiving on : Friday, March 8, 2019 - 2:09:55 PM

File

73685_YANG_2018_archivage.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id : tel-01947767, version 1

Citation

Jing Yang. Designing Superior Evolutionary Algorithms via Insights From Black-Box Complexity Theory. Computational Complexity [cs.CC]. Université Paris-Saclay, 2018. English. ⟨NNT : 2018SACLX054⟩. ⟨tel-01947767⟩

Metrics

  • Record views : 195
  • Files downloads : 75