TY - GEN
T1 - A new analysis method for evolutionary optimization of dynamic and noisy objective functions
AU - Dang-Nhu, Raphaël
AU - Dardinier, Thibault
AU - Doerr, Benjamin
AU - Izacard, Gautier
AU - Nogneng, Dorian
N1 - Publisher Copyright:
© 2018 Copyright held by the owner/author(s).
PY - 2018/7/2
Y1 - 2018/7/2
N2 - Evolutionary algorithms, being problem-independent and randomized heuristics, are generally believed to be robust to dynamic changes and noisy access to the problem instance. We propose a new method to obtain rigorous runtime results for such settings. In contrast to many previous works, our new approach mostly relies on general parameters of the dynamics or the noise models, such as the expected change of the dynamic optimum or the probability of a dynamic change in one iteration. Consequently, we obtain bounds which are valid for a wide variety of such models. Despite this generality, for almost all particular models studied previously, our bounds are stronger than those given in earlier works. As one particular result, we prove that the (1 + λ) EA can optimize the OneMax benchmark function efficiently despite a constant rate of 1-bit flip noise. For this, a logarithmic-size offspring population suffices (the previous-best result required a super-linear value of λ). Our results suggest that the typical way to find the optimum in such adverse settings is not via a steady approach of the optimum, but rather via an exceptionally fast approach after waiting for a rare phase of low dynamic changes or noise.
AB - Evolutionary algorithms, being problem-independent and randomized heuristics, are generally believed to be robust to dynamic changes and noisy access to the problem instance. We propose a new method to obtain rigorous runtime results for such settings. In contrast to many previous works, our new approach mostly relies on general parameters of the dynamics or the noise models, such as the expected change of the dynamic optimum or the probability of a dynamic change in one iteration. Consequently, we obtain bounds which are valid for a wide variety of such models. Despite this generality, for almost all particular models studied previously, our bounds are stronger than those given in earlier works. As one particular result, we prove that the (1 + λ) EA can optimize the OneMax benchmark function efficiently despite a constant rate of 1-bit flip noise. For this, a logarithmic-size offspring population suffices (the previous-best result required a super-linear value of λ). Our results suggest that the typical way to find the optimum in such adverse settings is not via a steady approach of the optimum, but rather via an exceptionally fast approach after waiting for a rare phase of low dynamic changes or noise.
KW - Dynamic optimization
KW - Noisy optimization
KW - Runtime analysis
UR - https://www.scopus.com/pages/publications/85050611612
U2 - 10.1145/3205455.3205563
DO - 10.1145/3205455.3205563
M3 - Conference contribution
AN - SCOPUS:85050611612
T3 - GECCO 2018 - Proceedings of the 2018 Genetic and Evolutionary Computation Conference
SP - 1467
EP - 1474
BT - GECCO 2018 - Proceedings of the 2018 Genetic and Evolutionary Computation Conference
PB - Association for Computing Machinery, Inc
T2 - 2018 Genetic and Evolutionary Computation Conference, GECCO 2018
Y2 - 15 July 2018 through 19 July 2018
ER -