Autonomous robotic systems have attracted considerable attention in recent years. However, accurate prediction of robot motion in indoor environments with limited visibility remains challenging. While vision-based and light detection and ranging (LiDAR) sensors are commonly used for motion detection and localization of robotic arms, they are privacy-invasive and depend on a clear line-of-sight (LOS) for precise measurements. When additional sensors are unavailable or LOS cannot be maintained, these technologies may be unsuitable. This paper proposes a novel method that employs channel state information (CSI) from WiFi signals affected by robotic arm motion. We developed a convolutional neural network (CNN) model to classify four distinct activities of a Franka Emika robotic arm. The proposed method aims to predict robot motion accurately even in scenarios where the robot is obscured by obstacles, without relying on any attached or internal sensors.
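
To make the classification setup concrete, the following is a minimal sketch of a CNN that maps a window of CSI measurements to one of four activity classes. It is not the model described in the paper: the framework (PyTorch), the class name `CSIActivityCNN`, the input layout (antennas × subcarriers × time samples), and all layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn


class CSIActivityCNN(nn.Module):
    """Sketch of a CNN classifier for CSI windows.

    Assumed input shape: (batch, n_antennas, n_subcarriers, n_time_samples).
    Output: logits over four robot-arm activity classes.
    """

    def __init__(self, n_antennas: int = 3, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_antennas, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),  # collapse the subcarrier/time dimensions
            nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    # Hypothetical CSI window: 3 RX antennas, 30 subcarriers, 200 time samples.
    dummy = torch.randn(8, 3, 30, 200)
    model = CSIActivityCNN()
    logits = model(dummy)  # shape: (8, 4)
    print(logits.shape)
```

In such a setup, each CSI window would typically be built from amplitude (and possibly phase) values collected over a short time interval, and the network is trained with a standard cross-entropy loss over the four activity labels.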