Edgify
Abstract: Objects are a centerpiece of the mathematical realm and of our interaction with and reasoning about it, just as they are of the physical one (if not more so). And humans' mathematical reasoning must ultimately be grounded in our general intelligence. Yet in contemporary cognitive science and AI, the physical and mathematical domains are customarily explored separately, which allows assumptions about what objects are to be baked into the system, and misses potential connections. In this paper, I put the issue into its philosophical and cognitive context. I then describe an abstract theoretical framework for learning object representations that makes room for mathematical objects on a par with non-mathematical ones. Finally, I describe a case study that builds on this view to show how our general ability to integrate different aspects of objects affects our conception of the natural numbers.
Abstract: We tackle the problem of Federated Learning in the non-i.i.d. case, in which local models drift apart and inhibit learning. Building on an analogy with Lifelong Learning, we adapt a solution for catastrophic forgetting to Federated Learning. We add a penalty term to the loss function, compelling all local models to converge to a shared optimum. We show that this can be done communication-efficiently (adding no further privacy risks), and that it scales with the number of nodes in the distributed setting. Our experiments show that this method is superior to competing ones for image recognition on the MNIST dataset.
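The penalty term described in the second abstract can be sketched as follows. This is a minimal illustration, not the paper's actual formulation: the function name `penalized_loss`, the plain (unweighted) L2 penalty, and the coefficient `lam` are assumptions made here for concreteness; lifelong-learning methods for catastrophic forgetting often weight each parameter's penalty individually (e.g. by Fisher information).

```python
import numpy as np

def penalized_loss(task_loss, theta_local, theta_shared, lam=0.1):
    """Sketch of a node's local objective: its own task loss plus a
    quadratic penalty pulling its parameters toward the shared optimum.

    theta_local  -- this node's current model parameters
    theta_shared -- parameters of the shared (aggregated) model
    lam          -- penalty strength (hypothetical hyperparameter)
    """
    penalty = lam * np.sum((theta_local - theta_shared) ** 2)
    return task_loss + penalty

# Usage: a node whose parameters have drifted slightly from the shared ones.
theta_local = np.array([1.0, 2.0])
theta_shared = np.array([0.8, 2.1])
loss = penalized_loss(0.5, theta_local, theta_shared, lam=0.1)
```

As the local parameters drift further from the shared optimum, the penalty grows quadratically, discouraging the divergence of local models that the abstract identifies as the obstacle in the non-i.i.d. setting.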