We propose Geometric Neural Parametric Models (GNPM), a learned parametric model that exploits the local structure of the data to learn disentangled shape and pose latent spaces of 4D dynamics, using a geometry-aware architecture on point clouds. Temporally consistent 3D deformations are estimated without requiring dense correspondences at training time, by exploiting cycle consistency. In addition to learning dense correspondences, GNPMs enable latent-space manipulations such as interpolation and shape/pose transfer. We evaluate GNPMs on various datasets of clothed humans and show that they achieve performance comparable to state-of-the-art methods that require dense correspondences during training.