First Steps Toward a Runtime Analysis When Starting With a Good Solution

Research output: Contribution to journal › Article › peer-review

Abstract

The mathematical runtime analysis of evolutionary algorithms traditionally considers the time an algorithm needs to find a solution of a certain quality when initialized with a random population. In practical applications, however, it may be possible to guess solutions that are better than random ones. We start a mathematical runtime analysis for such situations. We observe that different algorithms profit to very different degrees from a better initialization. We also show that the optimal parameterization of an algorithm can depend strongly on the quality of the initial solutions. To overcome this difficulty, self-adjusting and randomized heavy-tailed parameter choices can be profitable. Finally, we observe a larger gap between the performance of the best evolutionary algorithm we found and the corresponding black-box complexity. This could suggest that evolutionary algorithms that better exploit good initial solutions are still to be found. These first findings stem from analyzing the performance of the (1+1) evolutionary algorithm and the static, self-adjusting, and heavy-tailed (1+(λ,λ)) genetic algorithms on the OneMax benchmark. We are optimistic that the question of how to profit from good initial solutions is interesting beyond these first examples.
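To make the setting concrete, the following is a minimal sketch (not the paper's exact algorithms or analysis) of the classic (1+1) evolutionary algorithm on the OneMax benchmark, which counts the number of one-bits in a bit string of length n. The parameters `start_ones`, `max_iters`, and `seed` are illustrative choices, not from the source; initializing with more one-bits models "starting with a good solution".

```python
import random


def onemax(x):
    """OneMax fitness: the number of one-bits (maximum n)."""
    return sum(x)


def one_plus_one_ea(n, start_ones, max_iters=100_000, seed=0):
    """(1+1) EA with standard bit mutation (rate 1/n) on OneMax.

    The initial solution has exactly `start_ones` one-bits, so its
    quality can be chosen better than that of a random string
    (which has about n/2 one-bits in expectation).
    Returns the best fitness found and the number of iterations used.
    """
    rng = random.Random(seed)
    x = [1] * start_ones + [0] * (n - start_ones)
    rng.shuffle(x)
    fx = onemax(x)

    iters = 0
    while fx < n and iters < max_iters:
        # Offspring: flip each bit independently with probability 1/n.
        y = [bit ^ 1 if rng.random() < 1 / n else bit for bit in x]
        fy = onemax(y)
        if fy >= fx:  # elitist selection: keep the offspring if not worse
            x, fx = y, fy
        iters += 1
    return fx, iters
```

Comparing, say, `one_plus_one_ea(100, 90)` with `one_plus_one_ea(100, 50)` over several seeds illustrates the basic phenomenon studied in the article: the closer the initial solution is to the optimum, the fewer iterations the algorithm typically needs.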

Original language: English
Article number: 14
Journal: ACM Transactions on Evolutionary Learning and Optimization
Volume: 5
Issue number: 2
Publication status: Published - 15 May 2025

Keywords

  • Runtime analysis
  • black-box complexity
  • genetic algorithms
  • initialization
  • reoptimization
