Scientific Machine Learning (ML) is gaining momentum as a cost-effective alternative to physics-based numerical solvers in many engineering applications. Indeed, scientific ML is currently used to build accurate and efficient surrogate models from high-fidelity numerical simulations, effectively encoding the parameterized temporal dynamics underlying Ordinary Differential Equations (ODEs), or even the spatio-temporal behavior underlying Partial Differential Equations (PDEs), in appropriately designed neural networks. We propose an extension of Latent Dynamics Networks (LDNets), namely Liquid Fourier LDNets (LFLDNets), to create parameterized space-time surrogate models for multiscale and multiphysics sets of highly nonlinear differential equations on complex geometries. LFLDNets employ a neurologically-inspired, sparse, liquid neural network for the temporal dynamics, removing the need for a numerical solver for time advancement and leading to superior performance in terms of tunable parameters, accuracy, efficiency, and learned trajectories with respect to neural ODEs based on feedforward fully-connected neural networks. Furthermore, in our implementation of LFLDNets, we use a Fourier embedding with a tunable kernel in the reconstruction network to learn high-frequency functions better and faster than when space coordinates are used directly as input. We challenge LFLDNets in the framework of computational cardiology and evaluate their capabilities on two three-dimensional test cases arising from multiscale cardiac electrophysiology and cardiovascular hemodynamics. This paper illustrates the capability to run Artificial Intelligence-based numerical simulations on single or multiple GPUs in a matter of minutes and represents a significant step forward in the development of physics-informed digital twins.
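To make the architecture concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of the two-network layout described above: a liquid-style recurrent cell that advances a low-dimensional latent state in time given the scenario parameters, and a reconstruction network that maps Fourier-embedded space coordinates, together with the latent state, to the physical field. All class and variable names (FourierEmbedding, LiquidCell, ReconstructionNet, latent_dim, sigma, dt) are hypothetical, and the liquid cell below follows a generic liquid-time-constant-style update rather than the specific variant used in LFLDNets.

```python
# Illustrative sketch only; names and layer sizes are assumptions, not the paper's code.
import math

import torch
import torch.nn as nn


class FourierEmbedding(nn.Module):
    """Random Fourier feature lifting of spatial coordinates x -> [cos(Bx), sin(Bx)]."""

    def __init__(self, in_dim: int, num_features: int, sigma: float = 1.0):
        super().__init__()
        # B acts as a trainable Gaussian kernel; sigma tunes the frequency content.
        self.B = nn.Parameter(sigma * torch.randn(in_dim, num_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        proj = 2.0 * math.pi * x @ self.B
        return torch.cat([torch.cos(proj), torch.sin(proj)], dim=-1)


class LiquidCell(nn.Module):
    """Continuous-time recurrent cell with state- and input-dependent time constants."""

    def __init__(self, latent_dim: int, param_dim: int, hidden: int = 64):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(latent_dim + param_dim, hidden), nn.Tanh(), nn.Linear(hidden, latent_dim)
        )
        self.tau = nn.Sequential(
            nn.Linear(latent_dim + param_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, latent_dim), nn.Softplus(),
        )

    def forward(self, s: torch.Tensor, mu: torch.Tensor, dt: float) -> torch.Tensor:
        z = torch.cat([s, mu], dim=-1)
        # Explicit update: the latent state relaxes toward f(z) at a learned, varying rate.
        return s + dt * (self.f(z) - s) / (self.tau(z) + 1e-3)


class ReconstructionNet(nn.Module):
    """Maps (Fourier-embedded coordinates, latent state) to the physical field value."""

    def __init__(self, space_dim: int, latent_dim: int, out_dim: int, num_features: int = 64):
        super().__init__()
        self.embed = FourierEmbedding(space_dim, num_features)
        self.mlp = nn.Sequential(
            nn.Linear(2 * num_features + latent_dim, 128), nn.Tanh(),
            nn.Linear(128, 128), nn.Tanh(),
            nn.Linear(128, out_dim),
        )

    def forward(self, x: torch.Tensor, s: torch.Tensor) -> torch.Tensor:
        return self.mlp(torch.cat([self.embed(x), s], dim=-1))


# Usage sketch: roll the latent state forward in time, then query the field anywhere in space.
latent_dim, param_dim, space_dim, out_dim = 8, 2, 3, 1
dyn = LiquidCell(latent_dim, param_dim)
rec = ReconstructionNet(space_dim, latent_dim, out_dim)

mu = torch.randn(1, param_dim)        # scenario parameters
s = torch.zeros(1, latent_dim)        # initial latent state
x = torch.rand(100, space_dim)        # query points on the geometry

for _ in range(50):                   # 50 explicit time steps of size dt
    s = dyn(s, mu, dt=0.01)
u = rec(x, s.expand(x.shape[0], -1))  # predicted field at the final time
```

In this sketch, the Fourier lifting trades raw coordinates for a richer frequency basis, which is the mechanism the abstract credits for learning high-frequency spatial features better and faster, while the liquid cell replaces the feedforward neural ODE for time advancement.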