TY - GEN
T1 - Theoretical Analyses of Multi-Objective Evolutionary Algorithms on Multi-Modal Objectives
AU - Doerr, Benjamin
AU - Zheng, Weijie
N1 - Publisher Copyright:
© 2021, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
PY - 2021/1/1
Y1 - 2021/1/1
N2 - Previous theory work on multi-objective evolutionary algorithms considers mostly easy problems that are composed of unimodal objectives. This paper takes a first step towards a deeper understanding of how evolutionary algorithms solve multi-modal multi-objective problems. We propose the ONEJUMPZEROJUMP problem, a bi-objective problem whose single objectives are isomorphic to the classic jump functions benchmark. We prove that the simple evolutionary multi-objective optimizer (SEMO) cannot compute the full Pareto front. In contrast, for all problem sizes n and all jump sizes k ∈ [4..n/2 - 1], the global SEMO (GSEMO) covers the Pareto front in Θ((n - 2k)n^k) iterations in expectation. To improve the performance, we combine the GSEMO with two approaches, a heavy-tailed mutation operator and a stagnation detection strategy, that showed advantages in single-objective multi-modal problems. Runtime improvements of asymptotic order at least k^Ω(k) are shown for both strategies. Our experiments verify the substantial runtime gains already for moderate problem sizes. Overall, these results show that the ideas recently developed for single-objective evolutionary algorithms can be effectively employed also in multi-objective optimization.
AB - Previous theory work on multi-objective evolutionary algorithms considers mostly easy problems that are composed of unimodal objectives. This paper takes a first step towards a deeper understanding of how evolutionary algorithms solve multi-modal multi-objective problems. We propose the ONEJUMPZEROJUMP problem, a bi-objective problem whose single objectives are isomorphic to the classic jump functions benchmark. We prove that the simple evolutionary multi-objective optimizer (SEMO) cannot compute the full Pareto front. In contrast, for all problem sizes n and all jump sizes k ∈ [4..n/2 - 1], the global SEMO (GSEMO) covers the Pareto front in Θ((n - 2k)n^k) iterations in expectation. To improve the performance, we combine the GSEMO with two approaches, a heavy-tailed mutation operator and a stagnation detection strategy, that showed advantages in single-objective multi-modal problems. Runtime improvements of asymptotic order at least k^Ω(k) are shown for both strategies. Our experiments verify the substantial runtime gains already for moderate problem sizes. Overall, these results show that the ideas recently developed for single-objective evolutionary algorithms can be effectively employed also in multi-objective optimization.
U2 - 10.1609/aaai.v35i14.17459
DO - 10.1609/aaai.v35i14.17459
M3 - Conference contribution
AN - SCOPUS:85128941592
T3 - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
SP - 12293
EP - 12301
BT - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
PB - Association for the Advancement of Artificial Intelligence
T2 - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
Y2 - 2 February 2021 through 9 February 2021
ER -