Intelligent Personal Assistants (IPAs) are software agents that can perform tasks on behalf of individuals and assist them with many of their daily activities. IPAs' capabilities are expanding rapidly thanks to recent advances in areas such as natural language processing, machine learning, artificial cognition, and ubiquitous computing, which equip these agents with the ability to understand what users say, collect information from everyday ubiquitous devices (e.g., smartphones, wearables, tablets, laptops, cars, and household appliances), learn user preferences, deliver data-driven search results, and make decisions based on the user's context. Beyond the inherent complexity of building such IPAs, developers and researchers must address many critical architectural challenges (e.g., low latency, scalability, concurrency, ubiquity, code mobility, interoperability, and support for cognitive services and reasoning), which divert them from their main goal: building IPAs. Thus, our contribution in this paper is twofold: 1) we propose an architecture for a platform-agnostic, high-performance, ubiquitous, and distributed middleware that alleviates the burdensome task of dealing with low-level implementation details when building IPAs by adding multiple abstraction layers that hide the underlying complexity; and 2) we present an implementation of the middleware that realizes the proposed architecture and allows developers to focus on high-level capabilities while scaling the system to hundreds of thousands of IPAs with no extra effort. We demonstrate the power of our middleware by analyzing software metrics for complexity, effort, performance, cohesion, and coupling when developing a conversational IPA.