Optimizing monotone functions can be difficult

Benjamin Doerr, Thomas Jansen, Dirk Sudholt, Carola Winzen, Christine Zarges

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Extending previous analyses of function classes such as linear functions, we analyze how the simple (1+1) evolutionary algorithm optimizes pseudo-Boolean functions that are strictly monotone. Contrary to what one would expect, not all of these functions are easy to optimize. The choice of the constant c in the mutation probability p(n) = c/n can make a decisive difference. We show that if c < 1, then the (1+1) EA finds the optimum of every such function in Θ(n log n) iterations. For c = 1, we can still prove an upper bound of O(n^(3/2)). However, for c > 33, we present a strictly monotone function such that the (1+1) EA with overwhelming probability does not find the optimum within 2^(Ω(n)) iterations. This is the first time we observe that a constant-factor change of the mutation probability changes the runtime by more than a constant factor.
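The algorithm studied in the abstract can be sketched in a few lines. Below is a minimal (1+1) EA with standard bit mutation at rate p = c/n, run here on OneMax (the number of one-bits), a simple example of a strictly monotone pseudo-Boolean function. Function names and parameter defaults are illustrative, not taken from the paper.

```python
import random

def one_plus_one_ea(f, n, c=0.9, target=None, max_iters=200_000):
    """Minimal (1+1) EA sketch: keep a single bit string, flip each bit
    independently with probability p = c/n, and accept the offspring if
    its fitness is at least as good as the parent's."""
    p = c / n
    x = [random.randrange(2) for _ in range(n)]
    fx = f(x)
    for t in range(1, max_iters + 1):
        # Standard bit mutation: flip each bit independently with prob. p.
        y = [1 - bit if random.random() < p else bit for bit in x]
        fy = f(y)
        if fy >= fx:  # elitist selection, ties accepted
            x, fx = y, fy
        if target is not None and fx >= target:
            return t  # iterations until the optimum was found
    return None  # optimum not reached within the iteration budget

def onemax(x):
    """OneMax: f(x) = number of one-bits, a strictly monotone function."""
    return sum(x)
```

With c < 1 (as in the Θ(n log n) regime of the paper), this sketch reliably reaches the all-ones optimum of OneMax in a modest number of iterations; the paper's hard instance for c > 33 is a far more intricate monotone function, not shown here.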

Original language: English
Title of host publication: Parallel Problem Solving from Nature, PPSN XI - 11th International Conference, Proceedings
Pages: 42-51
Number of pages: 10
Edition: PART 1
DOIs
Publication status: Published - 12 Nov 2010
Externally published: Yes
Event: 11th International Conference on Parallel Problem Solving from Nature, PPSN 2010 - Krakow, Poland
Duration: 11 Sept 2010 - 15 Sept 2010

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 6238 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 11th International Conference on Parallel Problem Solving from Nature, PPSN 2010
Country/Territory: Poland
City: Krakow
Period: 11/09/10 - 15/09/10
