Wirelessly connected devices can collaboratively train a machine learning model using federated learning, where model updates are aggregated via over-the-air computation. Carrier frequency offset, caused by imprecise device clocks, makes the phase of the over-the-air channel drift randomly, so that symbols transmitted late in a coherence block arrive with lower quality than early symbols. To mitigate this degradation in symbol quality, we propose a scheme in which one of the permutations Roll, Flip, and Sort is applied to the gradients before transmission. Through simulations, we show that the permutations can both improve and degrade learning performance. Furthermore, we derive the expectation and variance of the gradient estimate; the variance is shown to grow exponentially with the number of symbols in a coherence block.
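
For concreteness, the sketch below illustrates how such permutations might act on a flattened gradient vector before transmission. It is a minimal NumPy sketch, not the paper's definitions: the roll offset and the magnitude-based sort key (sending large-magnitude entries on the early, higher-quality symbols) are illustrative assumptions.

```python
import numpy as np

def permute_gradient(grad: np.ndarray, scheme: str, shift: int = 0) -> np.ndarray:
    """Reorder a flattened gradient vector prior to over-the-air transmission.

    Hedged sketch of the three permutations named in the abstract; the exact
    sort key and roll offset are assumptions, not taken from the paper.
    """
    if scheme == "roll":
        # Cyclic shift: changes which entries land on the low-quality
        # late symbols of the coherence block (shift is an assumed parameter).
        return np.roll(grad, shift)
    if scheme == "flip":
        # Reverse the order, so formerly-late entries are sent first.
        return np.flip(grad)
    if scheme == "sort":
        # Assumed variant: transmit large-magnitude entries first, on the
        # early symbols that suffer less phase drift.
        order = np.argsort(-np.abs(grad))
        return grad[order]
    raise ValueError(f"unknown scheme: {scheme}")

# The receiver would apply the inverse permutation after aggregation, e.g.:
g = np.array([0.1, -2.0, 0.5, 1.5])
order = np.argsort(-np.abs(g))
sent = g[order]              # permuted for transmission (Sort variant)
inverse = np.argsort(order)
recovered = sent[inverse]    # undo the permutation at the receiver
assert np.allclose(recovered, g)
```

Note that Flip is its own inverse, while Roll is undone with `np.roll(grad, -shift)`; the Sort variant requires the receiver to know (or recompute) the ordering, which is why its inverse permutation is shown explicitly above.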