Fair principal component analysis (FPCA), a variant of the ubiquitous dimensionality reduction technique PCA in signal processing and machine learning, aims to find a low-dimensional representation of a high-dimensional dataset that is fair across subgroups of the data. The FPCA problem is a non-convex and non-smooth optimization problem over the Stiefel manifold. The state-of-the-art methods for solving it are subgradient methods and methods based on semidefinite relaxation. However, both types of methods have clear limitations and are efficient only in rather special scenarios. The goal of this paper is to develop efficient algorithms for solving the FPCA problem in general settings, especially the very high-dimensional setting. We first reformulate the problem as a smooth non-convex-concave minimax optimization problem over the Stiefel manifold. We then propose an alternating Riemannian gradient (ARG) algorithm for general non-convex-concave minimax problems over Riemannian manifolds, which performs a Riemannian gradient descent step and an ordinary gradient projection step at each iteration. We prove that ARG finds an $\varepsilon$-stationary point of such a problem within $O(\varepsilon^{-4})$ iterations. Simulation results show that the proposed ARG algorithm outperforms the state-of-the-art methods in both solution quality and speed on FPCA problems arising from signal processing and machine learning.
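To make the alternating structure concrete, the following is a minimal Python sketch of one plausible ARG iteration for a minimax reformulation of the form $\min_{U \in \mathrm{St}(n,r)} \max_{y \in \Delta_K} \sum_{k} y_k f_k(U)$, where $\Delta_K$ is the probability simplex. The group objectives $f_k(U) = -\operatorname{tr}(U^\top A_k U)$ with per-group covariances $A_k$, the QR retraction, the sort-based simplex projection, the step sizes, the update order, and all function names (`arg_fpca`, `simplex_proj`, etc.) are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def stiefel_tangent_proj(X, G):
    # Project a Euclidean gradient G onto the tangent space of the
    # Stiefel manifold at X:  G - X * sym(X^T G).
    XtG = X.T @ G
    return G - X @ (XtG + XtG.T) / 2.0

def qr_retraction(X, V):
    # Retract X + V back onto the Stiefel manifold via QR factorization,
    # fixing column signs so the retraction is well defined.
    Q, R = np.linalg.qr(X + V)
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def simplex_proj(y):
    # Euclidean projection of y onto the probability simplex
    # (standard sort-based algorithm).
    u = np.sort(y)[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(y) + 1)
    rho = np.nonzero(u + (1.0 - css) / idx > 0)[0][-1]
    theta = (1.0 - css[rho]) / (rho + 1)
    return np.maximum(y + theta, 0.0)

def arg_fpca(covs, r, alpha=1e-3, beta=1e-2, iters=2000, seed=0):
    # Illustrative ARG loop for min_U max_y sum_k y_k * f_k(U) with the
    # hypothetical choice f_k(U) = -tr(U^T A_k U), A_k the k-th group covariance.
    rng = np.random.default_rng(seed)
    n, K = covs[0].shape[0], len(covs)
    U = np.linalg.qr(rng.standard_normal((n, r)))[0]  # random Stiefel point
    y = np.full(K, 1.0 / K)                           # uniform simplex point
    for _ in range(iters):
        # Ordinary gradient projection (ascent) step on the concave variable y;
        # the objective is linear in y, so grad_y = (f_1(U), ..., f_K(U)).
        h = np.array([-np.trace(U.T @ A @ U) for A in covs])
        y = simplex_proj(y + beta * h)
        # Riemannian gradient descent step on U: the Euclidean gradient of
        # sum_k y_k * (-tr(U^T A_k U)) is -2 * sum_k y_k A_k U.
        G = -2.0 * sum(w * (A @ U) for w, A in zip(y, covs))
        U = qr_retraction(U, -alpha * stiefel_tangent_proj(U, G))
    return U, y
```

In this sketch, the gradient projection update on $y$ precedes the Riemannian descent update on $U$ within each iteration; the abstract does not pin down the order, and either ordering yields an alternating scheme of the kind described.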