Abstract: Variants of the GSEMO algorithm using multi-objective formulations have been successfully analyzed and applied to optimize chance-constrained submodular functions. However, due to the increasing population size of the GSEMO algorithm considered in these studies, the approach becomes ineffective if the number of trade-offs obtained grows quickly during the optimization run. In this paper, we apply the sliding window selection approach introduced in [21] to the optimization of chance-constrained monotone submodular functions. We theoretically analyze the resulting SW-GSEMO algorithm, which successfully limits the population size (a key factor impacting the runtime), and show that this allows it to obtain better runtime guarantees than the best ones currently known for the GSEMO. In our experimental study, we compare the performance of the SW-GSEMO to the GSEMO and NSGA-II on the maximum coverage problem under the chance constraint and show that the SW-GSEMO outperforms the other two approaches in most cases. To gain additional insights into the optimization behavior of the SW-GSEMO, we visualize its selection behavior during the optimization process and show that it obtains the highest-quality solutions among the compared algorithms across instances of varying size.