Abstract: During a robot-to-human object handover, the human receiver may act on the held object in several intended or unintended ways: it may be pulled, pushed, bumped, or simply held. We show that these events can be distinguished using tactile sensing alone. Training data were recorded from tactile sensors while human subjects interacted with an object held by a 3-finger robotic hand. A Bag of Words approach was used to automatically extract effective features from the tactile data, and a Support Vector Machine distinguished the four events with over 95 percent average accuracy.
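A minimal sketch of the kind of pipeline the abstract describes (tactile time series, Bag of Words histogram features, SVM classifier), written with scikit-learn. This is not the authors' implementation: the synthetic data, window length, codebook size, and SVM kernel are illustrative assumptions.

```python
# Sketch only: synthetic tactile data and illustrative parameters,
# not the setup or values reported in the paper.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in for recorded tactile sequences:
# 200 trials, each 100 time steps of 16 taxel readings, 4 event classes.
n_trials, n_steps, n_taxels, n_classes = 200, 100, 16, 4
trials = rng.normal(size=(n_trials, n_steps, n_taxels))
labels = rng.integers(0, n_classes, size=n_trials)  # pull / push / bump / hold

# 1) Slice each trial into short windows of taxel frames ("words").
window = 10
def windows(trial):
    return np.array([trial[i:i + window].ravel()
                     for i in range(0, n_steps - window + 1, window)])

all_windows = np.vstack([windows(t) for t in trials])

# 2) Learn a codebook (the "vocabulary") by clustering the windows.
codebook_size = 32
codebook = KMeans(n_clusters=codebook_size, n_init=10, random_state=0)
codebook.fit(all_windows)

# 3) Represent each trial as a normalized histogram of codeword counts.
def bow_histogram(trial):
    words = codebook.predict(windows(trial))
    hist = np.bincount(words, minlength=codebook_size).astype(float)
    return hist / hist.sum()

X = np.array([bow_histogram(t) for t in trials])

# 4) Train and evaluate an SVM on the Bag of Words features.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, labels, test_size=0.25, random_state=0, stratify=labels)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```

With real tactile recordings in place of the random arrays, the same structure (windowing, codebook clustering, histogram features, SVM) yields a per-trial classification of the four handover events.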