This paper addresses the problem of safety-critical control of autonomous robots under the ubiquitous uncertainties arising from unmodeled dynamics and noisy sensors. To account for these uncertainties, probabilistic state estimators are often deployed to obtain a belief over possible states. In particular, Particle Filters (PFs) can handle arbitrary, non-Gaussian distributions over the robot's state. In this work, we define the belief state and belief dynamics for continuous-discrete PFs and construct safe sets in the underlying belief space. We design a controller that provably keeps the robot's belief state within these safe sets. As a result, we ensure that the risk of the robot's unknown state violating a safety specification, such as avoiding a dangerous area, is bounded. We provide an open-source implementation as a ROS2 package and evaluate the solution in simulation and hardware experiments involving high-dimensional belief spaces.
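To make the bounded-risk idea concrete, a particle-filter belief approximates the probability that the true state violates a safety specification by the total weight of particles lying in the unsafe region. The sketch below illustrates only this generic approximation, not the paper's controller or belief dynamics; the function names, the disc-shaped danger zone, and the risk threshold are illustrative assumptions.

```python
import numpy as np

def risk_of_violation(particles, weights, is_unsafe):
    """Estimate the probability that the (unknown) true state lies in the
    unsafe region, given a particle-filter belief.

    particles : (N, d) array of state samples
    weights   : (N,) array of normalized importance weights (summing to 1)
    is_unsafe : callable mapping an (N, d) array of states to a boolean
                mask, True where the state violates the safety specification
    """
    return float(np.sum(weights[is_unsafe(particles)]))

# Illustrative example (not from the paper): a planar robot whose
# "dangerous area" is a disc of radius 1 centered at the origin.
rng = np.random.default_rng(0)
particles = rng.normal(loc=[2.0, 0.0], scale=0.5, size=(1000, 2))
weights = np.full(1000, 1.0 / 1000)          # uniform weights, e.g. after resampling

def in_danger_zone(x):
    return np.linalg.norm(x, axis=1) < 1.0   # inside the disc = unsafe

risk = risk_of_violation(particles, weights, in_danger_zone)
print(f"estimated violation risk: {risk:.3f}")

# A belief-space safe set could then be taken as the set of beliefs whose
# estimated risk stays below a chosen threshold, e.g. risk <= 0.05
# (the threshold value here is an assumption for illustration).
```

In this reading, a belief-space safe controller keeps the weighted fraction of unsafe particles below the chosen threshold at all times, which is what bounds the risk of the true (unknown) state violating the specification.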