To meet 6G requirements, the radio access infrastructure will continue to densify. Cell-free (CF) networks offer extreme flexibility by coherently serving users from multiple Access Points (APs). This paradigm requires precise and stable phase synchronization. In this article, we adopt the standardized 5G NR setup (subcarrier spacing, OFDM symbol duration, and allocation) to investigate the effect of Phase Noise (PN) on the simulated performance of scalable CF networks. In contrast to the prior literature, which relies on the simplified model of a free-running oscillator following a Wiener process, we employ a realistic hardware-inspired phase noise model that reproduces the Local Oscillator (LO) phase drift. Our results demonstrate that even affordable LOs offer sufficient stability to ensure negligible loss of uplink Spectral Efficiency (SE) over the standardized 5G Transmission Time Interval of 1 ms. This study substantiates the feasibility of CF networks based on 5G standards.
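For context, the simplified baseline mentioned above, a free-running oscillator whose phase follows a Wiener process, can be sketched as below. This is an illustrative simulation only, not the hardware-inspired model used in the article; the linewidth parameterization of the increment variance and the numerical values (30 kHz subcarrier spacing, 28 symbols per 1 ms TTI, 100 Hz linewidth) are assumptions for the sketch.

```python
import numpy as np

def wiener_phase_noise(n_symbols: int, t_s: float, linewidth: float,
                       rng: np.random.Generator) -> np.ndarray:
    """Free-running-oscillator phase noise as a Wiener (random-walk) process.

    Each per-symbol increment is Gaussian with variance 2*pi*linewidth*t_s,
    where `linewidth` is the oscillator's 3-dB linewidth in Hz (a common
    parameterization of this simplified model; values here are illustrative).
    """
    var = 2.0 * np.pi * linewidth * t_s
    increments = rng.normal(0.0, np.sqrt(var), n_symbols)
    # Phase starts at 0 and drifts as the cumulative sum of increments.
    return np.concatenate(([0.0], np.cumsum(increments)))

# Assumed numerology: 30 kHz subcarrier spacing gives a ~35.7 us OFDM symbol
# (with cyclic prefix), so a 1 ms TTI spans 28 symbols.
rng = np.random.default_rng(0)
phi = wiener_phase_noise(n_symbols=28, t_s=35.7e-6, linewidth=100.0, rng=rng)
```

Under this model the phase variance grows linearly with time, which is why the loss it predicts accumulates over the TTI; the hardware-inspired model studied in the article captures a more realistic LO drift profile.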