Sparse arrays have emerged as a popular alternative to the conventional uniform linear array (ULA) owing to the enhanced degrees of freedom (DOF) and superior resolution they offer. In the passive setting, these advantages are realized by leveraging correlations between the signals received at different sensors. This has fostered the belief that sparse arrays require a large number of temporal measurements to reliably estimate parameters of interest from these correlations, and are therefore ill-suited to the sample-starved regime. In this paper, we debunk this myth through a rigorous non-asymptotic analysis of the coarray ESPRIT algorithm, showing that reliable direction-of-arrival estimation is possible with surprisingly few snapshots. This seemingly counter-intuitive result stems from the favorable scaling of the singular values of the coarray manifold, which compensates for the potentially large covariance estimation error in the limited-snapshot regime. Specifically, we show that for a nested array operating in the regime of fewer sources than sensors ($S=O(1)$), the matching-distance error between the estimated and true directions of arrival (DOAs) can be bounded by an arbitrarily small quantity $\epsilon$ with high probability, provided that (i) the number of temporal snapshots ($L$) scales only logarithmically with the number of sensors ($P$), i.e., $L=\Omega(\ln(P)/\epsilon^2)$, and (ii) a suitable separation condition is satisfied. Our results also formally establish the well-known empirical resolution benefits of sparse arrays: the minimum separation between sources can be $\Omega(1/P^2)$, as opposed to the $\Omega(1/P)$ separation required by a ULA with the same number of sensors. Our sample-complexity expression further reveals the dependence on other key model parameters, such as the SNR and the dynamic range of the source powers, enabling us to establish the superior noise resilience of nested arrays both theoretically and empirically.
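The pipeline analyzed above (coarray covariance augmentation on a nested array, followed by ESPRIT on the resulting virtual ULA) can be sketched as follows. This is a minimal illustrative implementation, not the paper's code: the array parameters ($N_1 = N_2 = 3$, so $P = 6$ physical sensors), the DOAs, the snapshot count $L$, and the SNR are all assumed values chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Nested array (illustrative parameters): inner ULA of N1 sensors at unit
# spacing, outer ULA of N2 sensors at (N1+1) spacing (in half-wavelengths).
N1, N2 = 3, 3
pos = np.concatenate([np.arange(1, N1 + 1),
                      (N1 + 1) * np.arange(1, N2 + 1)])
M = N2 * (N1 + 1) - 1            # difference coarray covers lags -M..M

doas = np.deg2rad([-20.0, 10.0, 35.0])   # S = 3 sources (assumed DOAs)
S = len(doas)
L = 2000                                  # temporal snapshots
snr_db = 10.0

# Physical-array steering matrix and simulated snapshots.
A = np.exp(1j * np.pi * np.outer(pos, np.sin(doas)))
sig = (rng.standard_normal((S, L)) + 1j * rng.standard_normal((S, L))) / np.sqrt(2)
npow = 10 ** (-snr_db / 10)
noise = np.sqrt(npow / 2) * (rng.standard_normal((len(pos), L))
                             + 1j * rng.standard_normal((len(pos), L)))
X = A @ sig + noise

R = X @ X.conj().T / L           # sample covariance from L snapshots

# Coarray augmentation: average covariance entries over each difference lag
# to obtain the virtual-ULA correlation sequence z[-M..M].
lags = pos[:, None] - pos[None, :]
z = np.array([R[lags == k].mean() for k in range(-M, M + 1)])

# Hermitian Toeplitz matrix of the virtual ULA: T[i, j] = z[i - j].
T = np.array([[z[M + i - j] for j in range(M + 1)] for i in range(M + 1)])

# ESPRIT on the coarray: principal subspace + shift invariance.
_, V = np.linalg.eigh(T)
Us = V[:, -S:]                   # S principal eigenvectors (signal subspace)
Phi = np.linalg.lstsq(Us[:-1], Us[1:], rcond=None)[0]
est = np.sort(np.rad2deg(np.arcsin(np.angle(np.linalg.eigvals(Phi)) / np.pi)))
print(est)                       # estimated DOAs in degrees
```

Note that the 6-sensor nested array yields a virtual ULA of $M + 1 = 12$ elements, illustrating the DOF and resolution gains over a 6-sensor physical ULA discussed above.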