Abstract:The pursuit of creating artificial intelligence (AI) mirrors our longstanding fascination with understanding our own intelligence. From the myths of Talos to Aristotelian logic and Heron's inventions, we have sought to replicate the marvels of the mind. While recent advances in AI hold promise, singular approaches often fall short of capturing the essence of intelligence. This paper explores how fundamental principles from biological computation--particularly context-dependent, hierarchical information processing, trial-and-error heuristics, and multi-scale organization--can guide the design of truly intelligent systems. By examining the nuanced mechanisms of biological intelligence, such as top-down causality and adaptive interaction with the environment, we aim to illuminate potential limitations in artificial constructs. Our goal is to provide a framework inspired by biological systems for designing more adaptable and robust artificially intelligent systems.
Abstract:Diffusion models represent a significant advancement in generative modeling, employing a dual-phase process that first degrades domain-specific information via Gaussian noise and then restores it through a trainable model. This framework enables pure noise-to-data generation and modular reconstruction of images or videos. Concurrently, evolutionary algorithms employ optimization methods inspired by biological principles to refine sets of numerical parameters encoding potential solutions to rugged objective functions. Our research reveals a fundamental connection between diffusion models and evolutionary algorithms through their shared underlying generative mechanisms: both methods generate high-quality samples via iterative refinement of random initial distributions. By employing deep learning-based diffusion models as generative models across diverse evolutionary tasks and iteratively refining those models on heuristically acquired databases, we can sample potentially better-adapted offspring parameters and integrate them into successive generations of the diffusion model. This approach achieves efficient convergence toward high-fitness parameters while maintaining explorative diversity. Diffusion models introduce enhanced memory capabilities into evolutionary algorithms, retaining historical information across generations and leveraging subtle data correlations to generate refined samples. We thus elevate evolutionary algorithms from procedures with shallow heuristics to frameworks with deep memory. By deploying classifier-free guidance for conditional sampling at the parameter level, we achieve precise control over evolutionary search dynamics to promote specific genotypic, phenotypic, or population-wide traits. Our framework marks a major heuristic and algorithmic transition, offering increased flexibility, precision, and control in evolutionary optimization processes.
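To make this loop concrete, the sketch below replaces the deep diffusion model with a diagonal Gaussian that is refit at every generation to an elite database and then sampled to propose offspring. The objective, population sizes, and the Gaussian stand-in are illustrative assumptions for brevity, not the method described above (which trains a denoising network and uses classifier-free guidance).

```python
# Toy sketch: an estimation-of-distribution-style loop in which a generative
# model is repeatedly refit to a heuristically acquired database of
# high-fitness parameters and then sampled to propose offspring. A diagonal
# Gaussian stands in for the deep diffusion model; everything below is an
# illustrative assumption, not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # Placeholder objective: maximize the negative sphere function.
    return -np.sum(x ** 2, axis=-1)

dim, pop_size, n_elite = 8, 64, 16
database = rng.normal(0.0, 3.0, size=(pop_size, dim))    # initial random "data"

for generation in range(50):
    # 1) "Refine the generative model" on the current database (here: fit a Gaussian).
    mu, sigma = database.mean(axis=0), database.std(axis=0) + 1e-3

    # 2) Sample offspring parameters from the model (noise shaped by the fit).
    offspring = mu + sigma * rng.normal(size=(pop_size, dim))

    # 3) Score offspring and keep the best ones as the next training database.
    scores = fitness(offspring)
    elite_idx = np.argsort(scores)[-n_elite:]
    database = offspring[elite_idx]

print("best fitness:", fitness(database).max())
```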
Abstract:In a convergence of machine learning and biology, we reveal that diffusion models are evolutionary algorithms. By considering evolution as a denoising process and reversed evolution as diffusion, we mathematically demonstrate that diffusion models inherently perform evolutionary algorithms, naturally encompassing selection, mutation, and reproductive isolation. Building on this equivalence, we propose the Diffusion Evolution method: an evolutionary algorithm utilizing iterative denoising -- as originally introduced in the context of diffusion models -- to heuristically refine solutions in parameter spaces. Unlike traditional approaches, Diffusion Evolution efficiently identifies multiple optimal solutions and outperforms prominent mainstream evolutionary algorithms. Furthermore, leveraging advanced concepts from diffusion models, namely latent space diffusion and accelerated sampling, we introduce Latent Space Diffusion Evolution, which finds solutions for evolutionary tasks in high-dimensional, complex parameter spaces while significantly reducing computational steps. This parallel between diffusion and evolution not only bridges two distinct fields but also opens new avenues for mutual enhancement, raising questions about open-ended evolution and the potential use of non-Gaussian or discrete diffusion models in the context of Diffusion Evolution.
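The following sketch illustrates the kind of iterative denoising update this equivalence implies: each individual's denoised target is a fitness- and proximity-weighted average of the population, followed by a DDIM-like step with fresh noise. The noise schedule, kernel, and constants are our own simplifications, not the published algorithm's exact formulation.

```python
# Toy Diffusion Evolution-style loop: denoising doubles as selection. Each
# individual's denoised estimate x0_hat is a fitness- and proximity-weighted
# average of the population; a DDIM-like step then moves the individual toward
# that estimate while re-injecting noise. Schedules and weights are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def fitness(x):
    # Two optima at (+1, +1) and (-1, -1): a simple multimodal landscape.
    return (np.exp(-4.0 * np.sum((x - 1.0) ** 2, axis=-1))
            + np.exp(-4.0 * np.sum((x + 1.0) ** 2, axis=-1)))

pop_size, dim, n_steps = 256, 2, 100
x = rng.normal(size=(pop_size, dim))                     # start from pure noise
alphas = np.linspace(0.05, 0.999, n_steps)               # signal level grows as we denoise

for t in range(n_steps):
    alpha, alpha_next = alphas[t], alphas[min(t + 1, n_steps - 1)]
    f = fitness(x) + 1e-9
    # Proximity kernel: individuals are mainly influenced by nearby peers, so
    # separate modes can persist (a crude form of reproductive isolation).
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    w = np.exp(-d2 / (2.0 * (1.0 - alpha) + 1e-3)) * f[None, :]
    w /= w.sum(axis=1, keepdims=True)
    x0_hat = w @ x                                        # fitness-weighted denoised estimate
    eps_hat = (x - np.sqrt(alpha) * x0_hat) / np.sqrt(1.0 - alpha)
    noise = rng.normal(size=x.shape)
    # DDIM-like step toward the estimate, with a little fresh noise for exploration.
    x = np.sqrt(alpha_next) * x0_hat + np.sqrt(1.0 - alpha_next) * (0.9 * eps_hat + 0.1 * noise)

print("mean fitness:", fitness(x).mean())                 # population should cluster near the optima
```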
Abstract:Developing methods to explore, predict, and control the dynamic behavior of biological systems, from protein pathways to complex cellular processes, is an essential frontier of research for bioengineering and biomedicine. Thus, significant effort has gone into computational inference and mathematical modeling of biological systems. This effort has resulted in the development of large collections of publicly available models, typically stored and exchanged on online platforms (such as the BioModels Database) using the Systems Biology Markup Language (SBML), a standard format for representing mathematical models of biological systems. SBMLtoODEjax is a lightweight library that automatically parses and converts SBML models into Python models written end-to-end in JAX, a high-performance numerical computing library with automatic differentiation capabilities. SBMLtoODEjax is targeted at researchers who aim to incorporate SBML-specified ordinary differential equation (ODE) models into their Python projects and machine learning pipelines, in order to perform efficient numerical simulation and optimization with only a few lines of code. SBMLtoODEjax is available at https://github.com/flowersteam/sbmltoodejax.
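For intuition about the kind of model such a pipeline produces, the sketch below hand-writes a tiny ODE system directly in JAX and differentiates a simulation objective with respect to a kinetic parameter. The reaction system and function names are hypothetical placeholders, not SBMLtoODEjax's actual API (consult the repository for that).

```python
# Illustrative only: a hand-written JAX ODE model of the kind SBMLtoODEjax
# generates from an SBML file. The reactions, parameter names, and solver are
# placeholders; the point is end-to-end differentiable simulation.
import jax
import jax.numpy as jnp

def dydt(y, params):
    # Toy two-species system: S -> P at rate k1*S, with P degraded at rate k2*P.
    s, p = y
    k1, k2 = params
    return jnp.array([-k1 * s, k1 * s - k2 * p])

def simulate(params, y0, dt=0.01, n_steps=5000):
    # Fixed-step Euler integration expressed with lax.scan so the whole
    # trajectory stays differentiable and jit-compilable.
    def step(y, _):
        y_next = y + dt * dydt(y, params)
        return y_next, y_next
    _, traj = jax.lax.scan(step, y0, None, length=n_steps)
    return traj

def loss(params):
    # Example objective: drive the final product concentration toward 0.5.
    traj = simulate(params, jnp.array([1.0, 0.0]))
    return (traj[-1, 1] - 0.5) ** 2

params = jnp.array([0.8, 0.3])
print(loss(params), jax.grad(loss)(params))   # simulation plus gradients w.r.t. kinetic parameters
```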
Abstract:The applicability of computational models to the biological world is an active topic of debate. We argue that a useful path forward results from abandoning hard boundaries between categories and adopting an observer-dependent, pragmatic view. Such a view dissolves the contingent dichotomies driven by human cognitive biases (e.g., the tendency to oversimplify) and prior technological limitations in favor of a more continuous, gradualist view necessitated by the study of evolution, developmental biology, and intelligent machines. Efforts to re-shape living systems for biomedical or bioengineering purposes require prediction and control of their function at multiple scales. This is challenging for many reasons, one of which is that living systems perform multiple functions in the same place at the same time. We refer to this as "polycomputing" - the ability of the same substrate to simultaneously compute different things. This ability is an important way in which living things are a kind of computer, but not the familiar, linear, deterministic kind; rather, living things are computers in the broad sense of computational materials, as reported in the rapidly growing physical computing literature. We argue that an observer-centered framework for the computations performed by evolved and designed systems will improve the understanding of meso-scale events, as it has already done at quantum and relativistic scales. Here, we review examples of biological and technological polycomputing, and develop the idea that overloading different functions on the same hardware is an important design principle that helps understand and build both evolved and designed systems. Learning to hack existing polycomputing substrates, as well as to evolve and design new ones, will have massive impacts on regenerative medicine, robotics, and computer engineering.
Abstract:All cognitive agents are composite beings. Specifically, complex living agents consist of cells, which are themselves competent sub-agents navigating physiological and metabolic spaces. Behavior science, evolutionary developmental biology, and the field of machine intelligence all seek an answer to the scaling of biological cognition: what evolutionary dynamics enable individual cells to integrate their activities so that a novel, higher-level intelligence emerges, one with goals and competencies that belong to it and not to its parts? Here, we report the results of simulations based on the TAME framework, which proposes that evolution pivoted the collective intelligence of cells during morphogenesis of the body into traditional behavioral intelligence by scaling up the goal states at the center of homeostatic processes. We tested the hypothesis that a minimal evolutionary framework is sufficient for small, low-level setpoints of metabolic homeostasis in cells to scale up into collectives (tissues) that solve a problem in morphospace: the organization of a body-wide positional information axis (the classic French Flag problem). We found that these emergent morphogenetic agents exhibit a number of predicted features, including the use of stress propagation dynamics to achieve their target morphology as well as the ability to recover from perturbation (robustness) and long-term stability (even though neither of these was directly selected for). Moreover, we observed unexpected behavior: sudden remodeling long after the system had stabilized. We tested this prediction in a biological system - regenerating planaria - and observed a very similar phenomenon. We propose that this system is a first step toward a quantitative understanding of how evolution scales minimal goal-directed behavior (homeostatic loops) into higher-level problem-solving agents in morphogenetic and other spaces.
Abstract:We show how any system with morphological degrees of freedom and locally limited free energy will, under the constraints of the free energy principle, evolve toward a neuromorphic morphology that supports hierarchical computations in which each level of the hierarchy enacts a coarse-graining of its inputs, and dually a fine-graining of its outputs. Such hierarchies occur throughout biology, from the architectures of intracellular signal transduction pathways to the large-scale organization of perception and action cycles in the mammalian brain. Formally, the close connections between cone-cocone diagrams (CCCDs) as models of quantum reference frames on the one hand, and between CCCDs and topological quantum field theories on the other, allow such computations to be represented in the fully general quantum-computational framework of topological quantum neural networks.
Abstract:A common view in the neuroscience community is that memory is encoded in the connection strength between neurons. This view led artificial neural network models to focus on connection weights as the key variables to modulate during learning. In this paper, we present a prototype for weightless spiking neural networks that can perform a simple classification task. The memory in this network is stored in the timing between neurons rather than in the strength of the connections, and it is trained using a Hebbian spike-timing-dependent plasticity (STDP) rule that modulates the delays of the connections.
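The sketch below caricatures the delay-based idea: an output unit responds when delayed input spikes coincide, and an STDP-like rule nudges each connection delay toward the value that aligns its spike with the group. The spike-time patterns, constants, and update rule are illustrative assumptions, not the paper's model.

```python
# Toy delay-coded classifier: memory lives in per-connection delays, not weights.
# Training repeatedly presents one spike-time pattern and applies an STDP-like
# update that shifts each delay so its spike arrives with the group average;
# recognition is then measured by how tightly the delayed spikes coincide.
import numpy as np

rng = np.random.default_rng(2)

n_inputs, lr, n_epochs = 5, 0.2, 200
target_pattern = np.array([0.0, 2.0, 4.0, 1.0, 3.0])     # input spike times (ms) to memorize
delays = rng.uniform(0.0, 5.0, size=n_inputs)            # trainable conduction delays (ms)

for _ in range(n_epochs):
    arrivals = target_pattern + delays                    # when each spike reaches the output
    # STDP-flavored rule: spikes arriving early relative to the output's (mean)
    # firing time get a longer delay, late ones a shorter delay.
    delays += lr * (arrivals.mean() - arrivals)
    delays = np.clip(delays, 0.0, 10.0)

def coincidence(pattern):
    # Smaller spread of delayed arrival times = stronger output response.
    return (pattern + delays).std()

novel_pattern = np.array([4.0, 0.0, 2.0, 3.0, 1.0])
print("trained pattern spread:", coincidence(target_pattern))   # near zero: recognized
print("novel pattern spread:  ", coincidence(novel_pattern))    # larger: not recognized
```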
Abstract:Robots deployed at size scales that differ by orders of magnitude, and that retain the same desired behavior at any of those scales, would greatly expand the environments in which the robots could operate. However, it is currently not known whether such robots exist, and, if they do, how to design them. Since self-similar structures in nature often exhibit self-similar behavior at different scales, we hypothesize that there may exist robot designs that have the same property. Here we demonstrate that this is indeed the case for some, but not all, modular soft robots: there are robot designs that exhibit a desired behavior at a small size scale, and, if copies of that robot are attached together to realize the same design at higher scales, those larger robots exhibit similar behavior. We show how to find such designs in simulation using an evolutionary algorithm. Further, when fractal attachment is not assumed and attachment geometries must thus be evolved along with the design of the base robot unit, scale-invariant behavior is not achieved, demonstrating that structural self-similarity, when combined with appropriate designs, is a useful path to realizing scale-invariant robot behavior. We validate our findings by demonstrating successful transfer of self-similar structure and behavior to pneumatically controlled soft robots. Finally, we show that biobots can spontaneously exhibit self-similar attachment geometries, suggesting that self-similar behavior via self-similar structure may be realizable across a wide range of robot platforms in the future.
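As a rough illustration of the search idea, the sketch below evolves a small binary voxel design with a mutation hill climber and scores it by how well a placeholder behavior descriptor is preserved when copies of the design are tiled into a larger, self-similar robot. The descriptor stands in for a real soft-body simulation, and all names and constants are assumptions, not the study's setup.

```python
# Toy sketch of multi-scale evaluation: evolve a base modular design and score
# it by how similar its (placeholder) behavior is after fractal tiling.
import numpy as np

rng = np.random.default_rng(3)

def behavior(design):
    # Placeholder "behavior descriptor" for a voxel design; a real study would
    # run a soft-robot simulation (e.g., displacement per actuation cycle).
    ys, xs = np.nonzero(design)
    return np.array([xs.mean() / design.shape[1], ys.mean() / design.shape[0]])

def scale_up(design, k=2):
    # Fractal attachment: tile k x k copies of the base design.
    return np.tile(design, (k, k))

def fitness(design):
    if design.sum() == 0:
        return -np.inf                                    # disallow the empty robot
    # Reward behavior that is preserved when the design is scaled up.
    return -np.linalg.norm(behavior(design) - behavior(scale_up(design)))

design = rng.integers(0, 2, size=(4, 4))
for _ in range(500):                                      # simple mutation hill climber
    child = design.copy()
    i, j = rng.integers(0, 4, size=2)
    child[i, j] ^= 1                                      # flip one voxel
    if fitness(child) >= fitness(design):
        design = child

print(design, fitness(design))
```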
Abstract:The field of basal cognition seeks to understand how adaptive, context-specific behavior occurs in non-neural biological systems. Embryogenesis and regeneration require plasticity in many tissue types to achieve structural and functional goals in diverse circumstances. Thus, advances in both evolutionary cell biology and regenerative medicine require an understanding of how non-neural tissues could process information. Neurons evolved from ancient cell types that used bioelectric signaling to perform computation. However, it has not been shown whether or how non-neural bioelectric cell networks can support computation. We generalize connectionist methods to non-neural tissue architectures, showing that a minimal non-neural Bio-Electric Network (BEN) model that utilizes the general principles of bioelectricity (electrodiffusion and gating) can compute. We characterize BEN behaviors ranging from elementary logic gates to pattern detectors, using both fixed and transient inputs to recapitulate various biological scenarios. We characterize the mechanisms of such networks using dynamical-systems and information-theory tools, demonstrating that logic can manifest in bidirectional, continuous, and relatively slow bioelectrical systems, complementing conventional neural-centric architectures. Our results reveal a variety of non-neural decision-making processes as manifestations of general cellular biophysical mechanisms and suggest novel bioengineering approaches to construct functional tissues for regenerative medicine and synthetic biology as well as new machine learning architectures.
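A deliberately minimal caricature of how voltage-gated coupling alone can act as a logic gate appears below: an output cell coupled to two input cells through sigmoidally gated junctions crosses threshold only when both inputs are depolarized. The equations, parameters, and threshold are our own simplifications, not the BEN model's.

```python
# Minimal bioelectric "AND" sketch: an output cell is coupled to two input
# cells through junctions whose conductance is gated by the input voltage.
# Only when both inputs are depolarized does enough current flow to push the
# output past threshold. Constants are made up for illustration.
import numpy as np

def gate(v, v_half=0.5, slope=0.05):
    # Voltage-dependent junction conductance (sigmoidal gating).
    return 1.0 / (1.0 + np.exp(-(v - v_half) / slope))

def output_voltage(v_in1, v_in2, g_leak=1.0, dt=0.01, n_steps=2000):
    v_out = 0.0
    for _ in range(n_steps):                              # relax to steady state
        i_junc = gate(v_in1) * (v_in1 - v_out) + gate(v_in2) * (v_in2 - v_out)
        v_out += dt * (-g_leak * v_out + i_junc)          # leak toward rest (0) plus coupling
    return v_out

for a in (0.0, 1.0):
    for b in (0.0, 1.0):
        v = output_voltage(a, b)
        print(f"inputs ({a:.0f}, {b:.0f}) -> V_out = {v:.2f} -> output {int(v > 0.6)}")
```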