Recent advances in space technology have equipped low Earth orbit (LEO) satellites with the capability to perform complex functions and run AI applications. Federated learning (FL) on LEO satellites enables collaborative training of a global machine learning (ML) model without the need to share large datasets. However, intermittent connectivity between satellites and ground stations can lead to stale gradients and unstable learning, limiting model performance. In this paper, we propose FedGSM, a novel asynchronous FL algorithm that introduces a compensation mechanism to mitigate gradient staleness. FedGSM leverages the deterministic, time-varying topology of the orbits to offset the negative effects of staleness. Our simulation results show that FedGSM outperforms state-of-the-art algorithms on both IID and non-IID datasets. We also investigate the effect of key system parameters on its performance.
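
To make the staleness problem concrete, below is a minimal sketch of one generic staleness-compensated update for asynchronous FL. The function name `compensated_update` and the exponential decay schedule are illustrative assumptions for exposition only; they are not the paper's actual FedGSM compensation rule, which exploits orbital topology rather than a simple decay factor.

```python
import numpy as np

def compensated_update(w_global, grad, staleness, lr=0.1, decay=0.5):
    """Apply a client gradient to the global model, down-weighting it
    by a staleness-dependent factor.

    NOTE: the exponential decay schedule below is a generic placeholder,
    not FedGSM's compensation mechanism.
    """
    scale = decay ** staleness  # gradients from older model versions contribute less
    return w_global - lr * scale * grad

# Toy usage: a gradient computed against a model 3 aggregation rounds old.
w = np.zeros(4)
g = np.ones(4)
w = compensated_update(w, g, staleness=3)
print(w)  # [-0.0125 -0.0125 -0.0125 -0.0125]
```

Without such down-weighting (i.e., `scale = 1` regardless of staleness), gradients computed against outdated model versions are applied at full strength, which is the source of the instability the abstract describes.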