Teleoperation enables a user to perform tasks from a remote location, allowing interaction with a distant environment through the operation of a robotic system. Teleoperation is often required to perform dangerous tasks (e.g., work in disaster zones or chemical plants) while keeping the user out of harm's way. However, common approaches are often cumbersome and unnatural to use. In this letter, we propose TeleFMG, an approach for teleoperation of a multi-finger robotic hand through natural motions of the user's hand. Using a low-cost wearable Force-Myography (FMG) device, musculoskeletal activities on the user's forearm are mapped to hand poses which, in turn, are mimicked by a robotic hand. The mapping is performed by a data-based model that considers the spatial positions of the sensors on the forearm along with the temporal dependencies of the FMG signals. A set of experiments demonstrates the ability of a teleoperator to control a multi-finger hand through intuitive and natural finger motion. Furthermore, transfer to new users is demonstrated.
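To make the spatio-temporal mapping concrete, the sketch below shows one plausible way such a data-based model could be structured: a per-timestep spatial encoding over the array of forearm sensors followed by a recurrent layer over the FMG window, regressing to finger joint angles. This is not the authors' implementation; the sensor count, window length, joint count, and layer sizes are illustrative assumptions only.

```python
# Minimal sketch of an FMG-to-hand-pose regressor (assumed architecture,
# not the paper's model). Assumes PyTorch; all dimensions are illustrative.
import torch
import torch.nn as nn

class FMGToHandPose(nn.Module):
    def __init__(self, n_sensors=16, n_joints=20, hidden=64):
        super().__init__()
        # Spatial encoding: 1D convolutions across the forearm sensor array,
        # applied independently at each time step.
        self.spatial = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Flatten(),                       # -> (batch*T, 16 * n_sensors)
            nn.Linear(16 * n_sensors, hidden),
        )
        # Temporal encoding: LSTM over the window of spatial features.
        self.temporal = nn.LSTM(hidden, hidden, batch_first=True)
        # Regression head: finger joint angles for the robotic hand.
        self.head = nn.Linear(hidden, n_joints)

    def forward(self, x):
        # x: (batch, T, n_sensors) window of FMG readings
        b, t, s = x.shape
        feats = self.spatial(x.reshape(b * t, 1, s)).reshape(b, t, -1)
        out, _ = self.temporal(feats)
        return self.head(out[:, -1])            # pose estimate at the last step

# Example usage: a batch of 0.5 s windows (assumed 100 Hz, 16 sensors).
model = FMGToHandPose()
fmg_window = torch.randn(4, 50, 16)
joint_angles = model(fmg_window)                # (4, 20) predicted joint angles
```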