Abstract: We show that evolutionary computation can be implemented as standard Markov chain Monte Carlo (MCMC) sampling. With some care, `genetic algorithms' can be constructed as reversible Markov chains that satisfy detailed balance; it follows that the stationary distribution over populations is a Gibbs distribution with a simple factorised form. For some standard and popular nonparametric probability models, we exhibit Gibbs-sampling procedures that are plausible genetic algorithms. At mutation-selection equilibrium, a population of genomes is analogous to a sample from a Bayesian posterior, and the genomes are analogous to latent variables. We suggest that this is a general, tractable, and insightful formulation of evolutionary computation in terms of standard machine learning concepts and techniques. In addition, we show that evolutionary processes in which selection acts through differences in fecundity are not reversible, and that it is not possible to construct reversible evolutionary models in which each child is produced by only two parents.
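To make the central claim concrete, the following is a minimal sketch, not the paper's construction, of a "genetic algorithm" that is a reversible Markov chain over populations. It assumes a toy target in which the stationary distribution factorises as a product of per-genome fitnesses, and uses a Metropolis-Hastings correction so that each replace-one-individual move (pick a parent, mutate it, accept or reject) satisfies detailed balance. The fitness function, mutation rate, and population size are illustrative choices, not values taken from the paper.

```python
# Sketch: a reversible, detailed-balance "genetic algorithm" over bit-string populations.
# Target (assumed for illustration): pi(population) proportional to prod_i fitness(x_i).
import numpy as np

rng = np.random.default_rng(0)

L = 20     # genome length in bits (illustrative)
N = 50     # population size (illustrative)
MU = 0.02  # per-bit mutation probability (illustrative)

def fitness(x):
    """Toy fitness favouring genomes with many 1-bits (assumption, not from the paper)."""
    return np.exp(x.sum())

def mutation_prob(child, parent):
    """Probability that bitwise mutation of `parent` at rate MU yields `child`."""
    d = np.sum(child != parent)               # Hamming distance
    return MU**d * (1.0 - MU)**(L - d)

def proposal_prob(child, others):
    """Density of proposing `child`: choose a parent uniformly from `others`, then mutate."""
    return np.mean([mutation_prob(child, p) for p in others])

pop = rng.integers(0, 2, size=(N, L))          # initial population of bit-strings

for step in range(10_000):
    i = rng.integers(N)                        # individual to be replaced
    others = np.delete(pop, i, axis=0)         # candidate parents
    parent = others[rng.integers(N - 1)]
    child = np.where(rng.random(L) < MU, 1 - parent, parent)   # mutated copy

    # Metropolis-Hastings acceptance ratio; the proposal-density terms make each
    # move satisfy detailed balance, so the population chain is reversible and
    # its stationary distribution is the factorised Gibbs distribution above.
    num = fitness(child) * proposal_prob(pop[i], others)
    den = fitness(pop[i]) * proposal_prob(child, others)
    if rng.random() < min(1.0, num / den):
        pop[i] = child
```

Note that naive fecundity-based selection (copying fitter parents without an acceptance step) would break this reversibility, which is the contrast the abstract draws.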