Abstract: In the field of evolutionary multi-objective optimization, the Pareto front (PF) is approximated by a collection of representative candidate solutions that exhibit desirable convergence and diversity. Although many multi-objective evolutionary algorithms (MOEAs) have been designed, they still have difficulty balancing the convergence and diversity of the population. To better solve multi-objective optimization problems (MOPs), this paper proposes a Two-stage Evolutionary Framework for Multi-objective Optimization (TEMOF). As the name suggests, the evolutionary process is divided into two stages to enhance the search capability of the population. During the first half of the evolution, parents are selected exclusively from the primary population. In addition to performing environmental selection on the current population, we maintain an external archive that stores individuals located on the first PF. In the second stage, parents are randomly chosen from either the population or the archive. In the experiments, one classic MOEA and two state-of-the-art MOEAs are integrated into the framework to form three new algorithms. The experimental results demonstrate the superior and robust performance of the proposed framework across a wide range of MOPs. Moreover, the best-performing of the three new algorithms is compared with several existing MOEAs and achieves better results. Finally, we analyze why the two-stage framework is effective on the existing benchmark functions.
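To make the two-stage parental selection described above concrete, the following is a minimal Python sketch under stated assumptions: the population and the external archive are plain lists of individuals, and the function and parameter names (select_mating_pool, gen, max_gen, pool_size) are hypothetical illustrations rather than the paper's actual interface or its precise selection and archive-update rules.

```python
import random

def select_mating_pool(population, archive, gen, max_gen, pool_size):
    """Sketch of two-stage parent selection.

    Stage 1 (first half of the run): parents are drawn only from the
    main population. Stage 2 (second half): each parent is drawn from
    either the population or the external archive of first-front
    individuals, with the source chosen at random.
    """
    pool = []
    for _ in range(pool_size):
        if gen < max_gen / 2 or not archive:
            # Stage 1: main population only (also used as a fallback
            # if the archive happens to be empty).
            pool.append(random.choice(population))
        else:
            # Stage 2: pick the source (population or archive) at
            # random, then pick a parent from that source.
            source = random.choice((population, archive))
            pool.append(random.choice(source))
    return pool
```

In this sketch, random selection within each source stands in for whatever mating-selection operator the embedded MOEA uses; the framework itself only governs which set the parents are drawn from in each stage.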