Abstract: In human-robot collaborative interaction scenarios, nonverbal communication plays an important role. Signals sent by a human collaborator need to be identified and interpreted by the robotic system, and, conversely, signals sent by the robot need to be identified and interpreted by the human. In this paper, we focus on the latter. In a VR environment, we implemented nonverbal behaviour on an industrial robot to signal to the user that it is now their turn to proceed with a pick-and-place task. The signals were presented in four test conditions: no signal, robot arm gesture, light signal, and a combination of robot arm gesture and light signal. The test conditions were presented to the participants in two rounds. The qualitative analysis focused on (i) potential signals in human behaviour indicating why some participants immediately took over from the robot whereas others needed more time to explore, (ii) human reactions after the robot's nonverbal signal, and (iii) whether participants behaved differently in the different test conditions. We could not identify signals explaining why some participants were immediately successful and others were not. Participants showed a wide range of behaviours after the robot stopped working: for example, they rearranged the objects, looked at the robot or the object, or gestured for the robot to proceed. We found evidence that deictic robot gestures helped the humans correctly interpret what to do next. Moreover, there was a strong tendency for humans to interpret the light signal projected on the robot's gripper as a request to hand the object in focus to the robot, whereas a robot's pointing gesture at the object was a strong trigger for the humans to look at the object.
Abstract: With the increased use of collaborative robots (cobots) in industrial workplaces, the behavioural effects of human-cobot interactions need to be further investigated. This is of particular importance because the nonverbal behaviours of collaboration partners in human-robot teams significantly influence the experience of the human interaction partners and the success of the collaborative task. During the Ars Electronica 2020 Festival for Art, Technology and Society (Linz, Austria), we invited visitors to exploratively interact with an industrial robot that exhibited restricted interaction capabilities: extending and retracting its arm depending on the movements of the volunteer. For safety reasons, the movements of the arm were pre-programmed and teleoperated (which was not obvious to the participants). We recorded video data of these interactions and investigated the general nonverbal behaviours of the humans interacting with the robot, as well as the nonverbal behaviours of people in the audience. Our results showed that people were more interested in exploring the robot's action and perception capabilities than in simply reproducing the interaction game as introduced by the instructors. We also found that the majority of participants approached the robot up to a distance that would be perceived as threatening or intimidating if it were a human interaction partner. Regarding bystanders, we found examples of people making movements as if trying out variants of the current participant's behaviour.