Abstract: Electric machine design optimization is a computationally expensive multi-objective optimization problem. While the objectives require time-consuming finite element analysis, the constraints can often be expressed through inexpensive mathematical expressions, such as geometric constraints. This article addresses this optimization problem of mixed computational expense by proposing a method built into a widely used evolutionary multi-objective optimization algorithm, NSGA-II. The proposed method exploits the inexpensiveness of the geometric constraints to generate feasible designs through a custom repair operator, and it addresses the time-consuming objective functions by incorporating surrogate models that predict machine performance. The article establishes the superiority of the proposed method over the conventional optimization approach. This study demonstrates how a complex engineering design can be optimized for multiple objectives and constraints with heterogeneous evaluation times, how the resulting optimal solutions can be analyzed to select a single preferred solution, and, importantly, how they can be harnessed to reveal vital design features common to optimal solutions as design principles.
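The repair idea described above can be sketched in a few lines. The geometry, parameter names, and constraint values below are hypothetical stand-ins, not the article's actual machine model; the point is only that closed-form geometric constraints let infeasible candidates be projected back into the feasible region before any expensive finite element analysis is run.

```python
# Conceptual sketch of a constraint-based repair operator (hypothetical
# parameter names and bounds; not the article's actual implementation).

def repair(design):
    """Project a candidate design back into the geometrically feasible
    region using cheap closed-form constraints, so that expensive finite
    element analysis is only performed on feasible designs."""
    d = dict(design)
    # Hypothetical constraint: the magnet must fit inside the rotor,
    # i.e. magnet_width <= 0.8 * rotor_diameter.
    max_magnet = 0.8 * d["rotor_diameter"]
    if d["magnet_width"] > max_magnet:
        d["magnet_width"] = max_magnet
    # Hypothetical constraint: the air gap must stay within [0.5, 2.0] mm.
    d["air_gap"] = min(max(d["air_gap"], 0.5), 2.0)
    return d

candidate = {"rotor_diameter": 100.0, "magnet_width": 95.0, "air_gap": 0.3}
print(repair(candidate))
# -> {'rotor_diameter': 100.0, 'magnet_width': 80.0, 'air_gap': 0.5}
```

Because the repair is purely algebraic, it costs essentially nothing compared to one finite element evaluation, which is what makes it worthwhile to run on every offspring.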
Abstract: Significant effort has been made to solve computationally expensive optimization problems in the past two decades, and various optimization methods incorporating surrogates have been proposed. However, most optimization toolboxes do not provide ready-to-run algorithms for computationally expensive problems, especially in combination with other key requirements, such as handling multiple conflicting objectives or constraints. This lack of appropriate software packages has become a bottleneck for solving real-world applications. The proposed framework, pysamoo, addresses these shortcomings of existing optimization frameworks and provides multiple optimization methods for handling problems involving time-consuming evaluation functions. The framework extends the functionality of pymoo, a popular and comprehensive toolbox for multi-objective optimization, and incorporates surrogates to support expensive function evaluations. The framework is available under the GNU Affero General Public License (AGPL) and is primarily designed for research purposes. For more information about pysamoo, readers are encouraged to visit: anyoptimization.com/projects/pysamoo.
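The general pattern that surrogate-assisted frameworks such as pysamoo automate can be illustrated with a deliberately minimal, stdlib-only sketch (this is the generic idea, not pysamoo's actual API): a cheap approximation model screens candidates so that only the most promising one is passed to the expensive evaluation function.

```python
# Generic surrogate-assisted screening pattern (illustrative only; not
# pysamoo's API). A nearest-neighbour lookup over an archive of exact
# evaluations serves as the cheapest possible surrogate model.

def expensive_f(x):
    """Stand-in for a time-consuming simulation."""
    return (x - 0.3) ** 2

# Archive of designs evaluated exactly so far.
archive = [(x, expensive_f(x)) for x in (0.0, 0.5, 1.0)]

def surrogate(x):
    """Predict f(x) with the closest archived point (1-nearest-neighbour)."""
    return min(archive, key=lambda p: abs(p[0] - x))[1]

candidates = [0.1, 0.2, 0.8]            # new designs proposed by the optimizer
best = min(candidates, key=surrogate)   # screened cheaply by the surrogate
archive.append((best, expensive_f(best)))  # only one exact evaluation spent
print(best)
```

Real frameworks replace the nearest-neighbour lookup with proper regression models and manage the archive, model retraining, and candidate generation automatically; the budget-saving structure of the loop is the same.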
Abstract: Significant effort has been made to solve computationally expensive optimization problems in the past two decades, and various optimization methods incorporating surrogates have been proposed. Most research focuses on either exploiting the surrogate by defining a utility optimization problem or customizing an existing optimization method to use one or multiple approximation models. However, little attention has been paid to generic concepts applicable to different types of algorithms and optimization problems simultaneously. This paper therefore proposes a generalized probabilistic surrogate-assisted framework (GPSAF), applicable to a broad category of unconstrained and constrained, single- and multi-objective optimization algorithms. The idea is to have surrogates assist an existing optimization method in two distinct phases, one facilitating exploration and the other exploiting the surrogates. The exploration and exploitation of the surrogates are balanced automatically by a probabilistic knockout tournament among different clusters of solutions. A study of multiple well-known population-based optimization algorithms, with and without the proposed surrogate assistance, is conducted on single- and multi-objective optimization problems with a maximum budget of 300 solution evaluations. The results indicate the effectiveness of applying GPSAF to an optimization algorithm and its competitiveness with other surrogate-assisted algorithms.
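The balancing mechanism can be sketched with a toy probabilistic knockout tournament (an illustrative simplification, not GPSAF's exact procedure): in each duel, the solution with the better surrogate-predicted value wins only with probability p, so p close to 1 exploits the surrogate while smaller p injects exploration that hedges against surrogate error.

```python
import random

# Toy probabilistic knockout tournament (illustrative; not GPSAF's exact
# procedure). `predicted` maps solution ids to surrogate-predicted values
# (lower is better); each duel trusts the surrogate with probability p.

def knockout(solutions, predicted, p=0.8, rng=random.Random(1)):
    pool = list(solutions)
    while len(pool) > 1:
        a, b = pool.pop(), pool.pop()
        better, worse = (a, b) if predicted[a] <= predicted[b] else (b, a)
        # With probability p the surrogate's ranking decides the duel;
        # otherwise the predicted-worse solution survives (exploration).
        pool.insert(0, better if rng.random() < p else worse)
    return pool[0]

predicted = {"s1": 0.9, "s2": 0.1, "s3": 0.5, "s4": 0.7}
winner = knockout(list(predicted), predicted)
print(winner)
```

With p = 1.0 the tournament degenerates to pure surrogate exploitation and the predicted-best solution always survives; with p near 0.5 the outcome is nearly random, which is the knob the framework tunes automatically.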
Abstract: In this paper, we propose a method to solve the BI-TTP, a bi-objective variant of the well-studied Traveling Thief Problem (TTP). The TTP is a multi-component problem that combines two classic combinatorial problems: the Traveling Salesman Problem (TSP) and the Knapsack Problem (KP). In the TTP, a thief has to visit each city exactly once and can pick up items along the journey, starting with an empty knapsack and traveling at a speed inversely proportional to the knapsack's weight. In the BI-TTP, the goal is to minimize the overall traveling time and to maximize the profit of the collected items. Our method is based on a genetic algorithm with customizations addressing the problem's characteristics. We incorporate domain knowledge by seeding the initial population with combinations of near-optimal solutions of each subproblem and by using a custom repair operation to avoid evaluating infeasible solutions. Moreover, the independent variables of the TSP and KP components are unified into a real-valued representation using a biased random-key approach. The bi-objective aspect of the problem is addressed through an elite population extracted based on the non-dominated rank and crowding distance of each solution. Furthermore, we provide a comprehensive study that shows the influence of the hyperparameters on the performance of our method and investigates the performance of each hyperparameter combination over time. Finally, we discuss the results of the BI-TTP competitions at the EMO-2019 and GECCO-2019 conferences, where our method won first and second place, respectively, demonstrating its ability to find high-quality solutions consistently.
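The core of the random-key representation can be shown in a few lines (a simplified sketch of the general biased random-key idea, not the paper's exact decoder): a single real-valued vector encodes both TTP components, so standard real-valued variation operators apply to the whole genome. The city keys are sorted to obtain a tour, and the item keys are thresholded to obtain a picking plan.

```python
# Simplified random-key decoder for a TTP-like genome (illustrative;
# not the paper's exact decoding scheme). The first n_cities keys encode
# the tour, the remaining keys encode the knapsack picking plan.

def decode(keys, n_cities):
    tour_keys, item_keys = keys[:n_cities], keys[n_cities:]
    # Sorting city indices by their key value yields a valid permutation.
    tour = sorted(range(n_cities), key=lambda i: tour_keys[i])
    # An item is picked if its key crosses the 0.5 threshold.
    picks = [k >= 0.5 for k in item_keys]
    return tour, picks

keys = [0.7, 0.1, 0.9, 0.4,   # city keys  -> tour
        0.8, 0.2, 0.6]        # item keys  -> picking plan
tour, picks = decode(keys, n_cities=4)
print(tour, picks)   # [1, 3, 0, 2] [True, False, True]
```

Because any real vector decodes to a valid permutation, crossover and mutation never produce an invalid tour, which is precisely why the random-key encoding is attractive for multi-component problems like the TTP.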
Abstract: Python has become the programming language of choice for research and industry projects in data science, machine learning, and deep learning. Since optimization is an inherent part of these fields, more optimization-related frameworks have arisen in the past few years, but only a few of them support optimization of multiple conflicting objectives at a time, and even fewer provide comprehensive tools for a complete multi-objective optimization task. To address this issue, we have developed pymoo, a multi-objective optimization framework in Python. We provide a guide to getting started with our framework by demonstrating the implementation of an exemplary constrained multi-objective optimization scenario. Moreover, we give a high-level overview of the architecture of pymoo to show its capabilities, followed by an explanation of each module and its corresponding sub-modules. The implementations in our framework are customizable, and algorithms can be modified or extended by supplying custom operators. Moreover, a variety of single-, multi-, and many-objective test problems are provided, and gradients can be retrieved out of the box by automatic differentiation. Also, pymoo addresses practical needs, such as the parallelization of function evaluations, methods to visualize low- and high-dimensional spaces, and tools for multi-criteria decision making. For more information about pymoo, readers are encouraged to visit: https://pymoo.org
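The notion at the heart of any multi-objective framework like pymoo is Pareto dominance. The stdlib-only sketch below illustrates that concept in isolation (pymoo's real API, which wraps this in full algorithms such as NSGA-II, is documented at https://pymoo.org; the function names here are illustrative): a solution is kept only if no other solution is at least as good in every objective and strictly better in one.

```python
# Stdlib-only sketch of Pareto non-dominated filtering, the core concept
# behind multi-objective optimization frameworks (illustrative names;
# see https://pymoo.org for pymoo's actual API).

def dominates(a, b):
    """a dominates b if a is no worse in every (minimized) objective
    and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep only the points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

points = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
print(pareto_front(points))   # [(1, 5), (2, 3), (4, 1)]
```

Here (3, 4) is dominated by (2, 3) and (5, 5) by (1, 5), so only the three trade-off solutions survive; full algorithms additionally rank the dominated points into successive fronts and spread solutions along each front.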