Abstract: We present two novel domain-independent genetic operators that harness the capabilities of deep learning: a crossover operator for genetic algorithms and a mutation operator for genetic programming. Deep Neural Crossover leverages the capabilities of deep reinforcement learning and an encoder-decoder architecture to select offspring genes. BERT mutation masks multiple GP-tree nodes and then replaces the masked nodes with ones that are most likely to improve the individual's fitness. We demonstrate the efficacy of both operators through experimentation.
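The masked-node mutation described above can be illustrated with a minimal toy sketch. This is not the authors' implementation: the GP tree is flattened to prefix form, the primitive set and arity table are invented for illustration, and a uniform distribution stands in for the trained BERT model that would score candidate replacements.

```python
import random

PRIMITIVES = ["add", "sub", "mul", "x", "y", "1"]       # toy primitive set
ARITY = {"add": 2, "sub": 2, "mul": 2, "x": 0, "y": 0, "1": 0}

def bert_style_mutation(tree, mask_rate=0.3, fill_model=None, rng=None):
    """Toy sketch of masked-node mutation on a GP tree in prefix form.

    `fill_model` stands in for the masked language model that scores
    candidate replacements; a uniform stand-in is used by default.
    Replacements preserve arity so the tree remains syntactically valid.
    """
    rng = rng or random.Random(0)
    if fill_model is None:
        # uniform stand-in for the learned fill-in distribution
        fill_model = lambda tree, pos, cands: [1.0 / len(cands)] * len(cands)
    mutated = list(tree)
    for pos, node in enumerate(tree):
        if rng.random() < mask_rate:                    # mask this node
            cands = [p for p in PRIMITIVES if ARITY[p] == ARITY[node]]
            probs = fill_model(mutated, pos, cands)     # score candidates
            mutated[pos] = rng.choices(cands, weights=probs)[0]
    return mutated
```

In the actual operator, the fill-in distribution comes from a BERT-style model trained so that high-probability replacements tend to improve fitness; the uniform stand-in only shows the control flow.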
Abstract: We present a novel multi-parent crossover operator for genetic algorithms (GAs) called ``Deep Neural Crossover'' (DNC). Unlike conventional GA crossover operators that rely on a random selection of parental genes, DNC leverages the capabilities of deep reinforcement learning (DRL) and an encoder-decoder architecture to select the genes. Specifically, we use DRL to learn a policy for selecting promising genes. The policy is stochastic, preserving the stochastic nature of GAs, and represents a distribution that assigns higher selection probability to genes more likely to improve fitness. Our architecture features a recurrent neural network (RNN) that encodes the parental genomes into latent memory states, and a decoder RNN that uses an attention-based pointing mechanism to generate a distribution over the next gene to select for the offspring. To reduce training time, we present a pre-training approach, wherein the architecture is initially trained on a single problem within a specific domain and then applied to other problems of the same domain. We compare DNC to known operators from the literature over two benchmark domains -- bin packing and graph coloring -- in both two- and three-parent settings, and it outperforms all baselines. DNC is domain-independent and can easily be applied to other problem domains.
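The gene-selection loop at the heart of DNC can be sketched with a minimal toy example. This is a sketch under stated assumptions, not the paper's implementation: the learned pointing policy (encoder-decoder RNN trained with DRL) is replaced by a uniform stand-in, and genomes are plain lists. It only shows how a stochastic policy over parents builds the offspring gene by gene.

```python
import random

def dnc_style_crossover(parents, policy=None, rng=None):
    """Toy sketch of policy-driven multi-parent gene selection.

    `policy` stands in for DNC's learned pointing distribution: given
    the partially built offspring and the current locus, it returns
    selection probabilities over the parents. A uniform stand-in is
    used by default (the real policy is a trained RNN decoder).
    """
    rng = rng or random.Random(0)
    k, n = len(parents), len(parents[0])
    if policy is None:
        # uniform stand-in for the learned stochastic policy
        policy = lambda offspring, locus: [1.0 / k] * k
    offspring = []
    for locus in range(n):
        probs = policy(offspring, locus)                # distribution over parents
        chosen = rng.choices(range(k), weights=probs)[0]
        offspring.append(parents[chosen][locus])        # take that parent's gene
    return offspring
```

With a uniform policy this degenerates to uniform multi-parent crossover; DNC's contribution is learning a policy whose distribution favors genes that improve fitness, while remaining stochastic.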