Recently, federated learning (FL), as a promising distributed machine learning approach, has attracted substantial research attention. In FL, the parameter server and the mobile devices share the training parameters over wireless links, so reducing the communication overhead becomes one of the most critical challenges. Although various communication-efficient machine learning algorithms have been proposed in the literature, few existing works consider their implementation over wireless networks. In this work, the idea of SignSGD is adopted, and only the signs of the gradients are shared between the mobile devices and the parameter server. In addition, unlike most existing works, which assume Channel State Information (CSI) at both the transmitter side and the receiver side, only receiver-side CSI is assumed. In such a setting, an essential problem for the mobile devices is to select appropriate local processing and communication parameters. In particular, two tradeoffs are observed under a fixed total training time: (i) given the time for each communication round, the energy consumption versus the outage probability per communication round, and (ii) given the energy consumption, the number of communication rounds versus the outage probability per communication round. Two optimization problems regarding these tradeoffs are formulated and solved: the first minimizes the energy consumption given the outage probability (and therefore the learning performance) requirement, while the second optimizes the learning performance given the energy consumption requirement. Extensive simulations are performed to demonstrate the effectiveness of the proposed method.
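
To make the sign-only exchange concrete, the following is a minimal sketch of a SignSGD-style communication round with majority-vote aggregation, which is one common way such sign updates are combined; the device count, model dimension, local gradient oracle, and learning rate are illustrative assumptions, and wireless outages are not modeled here.

```python
# Minimal sketch of SignSGD-style sign exchange with majority-vote aggregation.
# All names and constants below are illustrative assumptions, not taken from the paper.
import numpy as np

DEVICES = 8          # assumed number of mobile devices
DIM = 5              # assumed model dimension
LEARNING_RATE = 0.01 # assumed step size
ROUNDS = 10          # assumed number of communication rounds

rng = np.random.default_rng(0)
model = np.zeros(DIM)

def local_gradient(w, rng):
    """Placeholder for a device's local stochastic gradient (assumed toy objective)."""
    return w - rng.normal(loc=1.0, size=w.shape)

for _ in range(ROUNDS):
    # Each device uploads only the signs of its local gradient (1 bit per coordinate).
    uploaded_signs = np.stack(
        [np.sign(local_gradient(model, rng)) for _ in range(DEVICES)]
    )

    # The parameter server aggregates by majority vote and broadcasts the result.
    aggregate_sign = np.sign(uploaded_signs.sum(axis=0))

    # Devices apply the sign-based update to their local model copy.
    model -= LEARNING_RATE * aggregate_sign
```

Because each coordinate is quantized to a single sign bit before transmission, the per-round uplink payload is reduced relative to sending full-precision gradients, which is the communication-overhead saving the abstract refers to.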