Autonomy in robotic surgery is challenging in unstructured environments, especially when interacting with deformable soft tissue. This poses a problem for model-based control methods, which must account for deformation dynamics during tissue manipulation. Prior vision-based perception work can capture geometric changes in the scene; however, integrating these observations with dynamic properties to achieve accurate and safe model-based control has not been previously considered. Given the mechanical coupling between the robot and the environment, it is crucial to develop a registered, simulated dynamical model. In this work, we propose an online, continuous, real-to-sim registration method to bridge 3D visual perception with position-based dynamics (PBD) modeling of tissue. The PBD method simulates both soft tissue dynamics and rigid tool interactions for model-based control. Meanwhile, a vision-based strategy generates 3D reconstructed point cloud surfaces that are used to register and update the simulation, accounting for discrepancies between the simulation and the real world. To verify this real-to-sim approach, tissue manipulation experiments were conducted on the da Vinci Research Kit. Our real-to-sim approach successfully reduced registration errors online, which is especially important for safety during autonomous control. Moreover, the results show higher accuracy in occluded areas than fusion-based reconstruction.
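To make the two core ingredients concrete, the sketch below shows a minimal PBD time step with distance constraints, followed by a simple correction that nudges simulated surface particles toward an observed point cloud. This is an illustrative assumption of how such a pipeline could be structured, not the paper's implementation; the function names (`pbd_step`, `register_to_observation`) and the parameters (`stiffness`, `alpha`) are hypothetical.

```python
import numpy as np

def pbd_step(x, v, edges, rest_len, dt=1e-2, iters=10, stiffness=0.9):
    """One position-based dynamics step: predict, project constraints, update.
    x: (N,3) particle positions; v: (N,3) velocities;
    edges: list of (i, j) index pairs; rest_len: rest length per edge."""
    p = x + dt * v                          # unconstrained prediction (external forces omitted)
    for _ in range(iters):                  # Gauss-Seidel constraint projection
        for (i, j), l0 in zip(edges, rest_len):
            d = p[i] - p[j]
            dist = np.linalg.norm(d)
            if dist < 1e-9:
                continue
            # Distance constraint: move both endpoints (equal mass) toward rest length
            corr = stiffness * 0.5 * (dist - l0) * d / dist
            p[i] -= corr
            p[j] += corr
    v_new = (p - x) / dt                    # recover velocities from position change
    return p, v_new

def register_to_observation(p, surface_idx, observed_pts, alpha=0.3):
    """Stand-in for online real-to-sim registration: pull each simulated
    surface particle partway toward its nearest observed point."""
    for i in surface_idx:
        nearest = observed_pts[np.argmin(np.linalg.norm(observed_pts - p[i], axis=1))]
        p[i] += alpha * (nearest - p[i])    # partial correction for stability
    return p
```

In an online loop, one would alternate `pbd_step` with `register_to_observation` each time a new reconstructed point cloud arrives, so the simulation tracks the real tissue continuously; the partial correction factor `alpha` trades convergence speed against stability, in the spirit of the continuous registration the abstract describes.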