Diverse experimental data suggest that the brain carries out probabilistic inference. Models that perform inference through sampling are particularly appealing since, instead of requiring networks to perform sophisticated mathematical operations, they can simply exploit the stochasticity of neuronal activity. However, sampling from complex distributions is a hard problem. In particular, mixing behavior is often very sensitive to the temperature parameter that controls the stochasticity of the sampler. We propose that background oscillations, a ubiquitous phenomenon throughout the brain, can mitigate this issue and thus implement the backbone for sampling-based computations in spiking neural networks. We first show that, in both current-based and conductance-based neuron models, the level of background activity effectively defines the sampling temperature of the network. This mechanism allows brain networks to flexibly control their sampling behavior, either favoring convergence to local optima or promoting mixing. We then demonstrate that background oscillations can thereby structure stochastic computations into discrete sampling episodes. In each such episode, candidate solutions are first explored at high temperatures before annealing to low temperatures drives convergence to a good solution.
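As a loose illustration of the temperature-annealing idea described above, the following Python sketch runs a Gibbs sampler over a toy Boltzmann machine while a periodic hot-to-cold temperature schedule mimics one background-oscillation cycle per sampling episode. The network size, weights, and schedule parameters are invented for illustration and are not taken from the paper's spiking-network models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Boltzmann machine: energy E(s) = -0.5 s^T W s - b^T s over s in {0,1}^n
n = 8
W = rng.normal(0.0, 1.0, (n, n))
W = (W + W.T) / 2.0          # symmetric couplings
np.fill_diagonal(W, 0.0)     # no self-connections
b = rng.normal(0.0, 0.5, n)

def gibbs_sweep(s, beta):
    """One Gibbs sweep at inverse temperature beta = 1/T."""
    for i in rng.permutation(n):
        drive = W[i] @ s + b[i]                  # local field on unit i
        p_on = 1.0 / (1.0 + np.exp(-beta * drive))
        s[i] = float(rng.random() < p_on)
    return s

# Each "episode" sweeps the temperature from T_hot (exploration / mixing)
# down to T_cold (convergence), standing in for one oscillation cycle.
T_hot, T_cold, period, n_cycles = 4.0, 0.25, 50, 5
s = rng.integers(0, 2, n).astype(float)
for t in range(period * n_cycles):
    t_in = t % period
    # Half-cosine ramp: T_hot at the start of a cycle, T_cold at its end.
    T = T_cold + 0.5 * (T_hot - T_cold) * (1 + np.cos(np.pi * t_in / (period - 1)))
    s = gibbs_sweep(s, beta=1.0 / T)
    if t_in == period - 1:                       # end of episode: annealed sample
        E = -0.5 * s @ W @ s - b @ s
        print(f"episode {t // period + 1}: state {s.astype(int)}, energy {E:.2f}")
```

Running this prints one low-energy state per cycle: early in each cycle the high temperature lets the sampler hop between modes, while the cold end of the cycle pins it to a locally good configuration, mirroring the episode structure the abstract describes.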