Abstract: Evolutionary algorithms (EAs) have been widely applied to multi-objective optimization due to their population-based nature. Population update, a key component in multi-objective EAs (MOEAs), is usually performed in a greedy, deterministic manner. However, recent studies have questioned this practice and shown that stochastic population update (SPU), which gives inferior solutions a chance to be preserved, can help MOEAs escape local optima more easily. While introducing randomness into the population update boosts the exploration ability of MOEAs, it has the drawback that the population may not always preserve the best solutions found so far, thus requiring a large population. Intuitively, a possible remedy is to introduce an archive that stores the best solutions ever found. In this paper, we theoretically show that using an archive allows a small population and can substantially accelerate the search of SPU-based MOEAs. Specifically, we analyze the expected running time of two well-established MOEAs, SMS-EMOA and NSGA-II, with SPU for solving the commonly studied bi-objective problem OneJumpZeroJump, and prove that using an archive can bring (even exponential) speedups. The comparison between SMS-EMOA and NSGA-II also suggests that the $(\mu+\mu)$ update mode may be more suitable for SPU than the $(\mu+1)$ update mode. Furthermore, our derived running time bounds for using SPU alone are significantly tighter than previously known ones. Our theoretical findings are also empirically validated on a well-known practical problem, the multi-objective traveling salesperson problem. We hope this work provides theoretical support for exploring different algorithm-design ideas in evolutionary multi-objective optimization.
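To make the mechanism sketched in this abstract concrete, the following is a minimal Python sketch of a $(\mu+1)$-style loop that combines a stochastic population update with an unbounded archive of non-dominated solutions. The subset-based removal rule and the placeholders evaluate, mutate and init_pop are illustrative assumptions, not the paper's exact procedure; the SPU variants of SMS-EMOA and NSGA-II analyzed in the paper may differ in detail.

import random

def dominates(a, b):
    # True if objective vector a Pareto-dominates b (maximization).
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def update_archive(archive, x, fx):
    # Unbounded archive: keep x if no archived solution dominates it,
    # and drop archived solutions that x dominates.
    if any(dominates(f, fx) for _, f in archive):
        return
    archive[:] = [(y, f) for y, f in archive if not dominates(fx, f)]
    archive.append((x, fx))

def spu_with_archive(evaluate, mutate, init_pop, iterations):
    # evaluate, mutate and init_pop are problem-specific placeholders (assumptions).
    pop = [(x, evaluate(x)) for x in init_pop]
    archive = []
    for x, fx in pop:
        update_archive(archive, x, fx)
    for _ in range(iterations):
        parent, _ = random.choice(pop)          # uniform mating selection
        child = mutate(parent)
        fchild = evaluate(child)
        update_archive(archive, child, fchild)  # archive keeps the best ever found
        pop.append((child, fchild))
        # Stochastic population update (one possible instantiation): the survival
        # competition is held only within a random half of the joint population,
        # so inferior solutions may survive and good ones may temporarily be lost
        # from the population -- but never from the archive.
        subset = random.sample(range(len(pop)), max(1, len(pop) // 2))
        worst = max(subset, key=lambda i: sum(dominates(pop[j][1], pop[i][1]) for j in subset))
        pop.pop(worst)
    return pop, archive

The archive update never discards a currently non-dominated solution, which is what allows the population itself to stay small while SPU freely randomizes survival.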
Abstract: Elitism, which constructs the new population by preserving the best solutions from the old population and the newly generated solutions, has been the default approach to population update since its introduction into multi-objective evolutionary algorithms (MOEAs) in the late 1990s. In this paper, we take the opposite perspective and perform population update in MOEAs by simply discarding elitism. That is, we treat the newly generated solutions directly as the new population, so that all selection pressure comes from mating selection. We propose a simple non-elitist MOEA (called NE-MOEA) that uses only Pareto dominance sorting to compare solutions, without involving any diversity-related selection criterion. Preliminary experimental results show that NE-MOEA can compete with well-known elitist MOEAs (NSGA-II, SMS-EMOA and NSGA-III) on several combinatorial problems. Lastly, we discuss the limitations of the proposed non-elitist algorithm and suggest possible future research directions.
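The non-elitist update described in this abstract can be pictured with a short Python sketch of one generation. The binary-tournament mating selection on non-dominated sorting ranks and the crossover/mutate placeholders are illustrative assumptions; NE-MOEA's concrete operators are specified in the paper.

import random

def dominates(a, b):
    # True if objective vector a Pareto-dominates b (minimization here).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nds_ranks(objs):
    # Plain non-dominated sorting: rank 0 = non-dominated front, rank 1 = next front, ...
    n = len(objs)
    ranks, remaining = [None] * n, set(range(n))
    r = 0
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(objs[j], objs[i]) for j in remaining if j != i)}
        for i in front:
            ranks[i] = r
        remaining -= front
        r += 1
    return ranks

def ne_moea_generation(pop, evaluate, crossover, mutate):
    # One generation of a non-elitist update: the offspring become the new
    # population directly; the only selection pressure is rank-based mating selection.
    objs = [evaluate(x) for x in pop]
    ranks = nds_ranks(objs)
    def tournament():
        i, j = random.randrange(len(pop)), random.randrange(len(pop))
        return pop[i] if ranks[i] <= ranks[j] else pop[j]
    offspring = []
    while len(offspring) < len(pop):
        offspring.append(mutate(crossover(tournament(), tournament())))
    return offspring  # no elitist survival selection, no diversity-related criterion

Since the whole old population is discarded every generation, good genetic material persists only by being chosen (possibly repeatedly) as a parent, which is exactly what the abstract means by all selection pressure coming from mating selection.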