Abstract: Bringing tactile sensation to robotic hands will allow for more effective grasping, along with the wide range of benefits of human-like touch. Here we present a 3D-printed, three-fingered tactile robot hand comprising an OpenHand Model~O customized to house a TacTip optical biomimetic tactile sensor in the distal phalanx of each finger. We expect that the grasping capabilities of the Model O, combined with the benefits of sophisticated tactile sensing, will result in an effective platform -- the tactile Model O (T-MO). Our current T-MO design uses three JeVois machine vision systems, each comprising a miniature camera in the tactile fingertip and a vision processing module in the base of the hand. To evaluate the capabilities of the T-MO, we benchmark its grasping performance using the Gripper Assessment Benchmark on the YCB object set. We then test its tactile sensing capabilities in two experiments: first, tactile object classification on a subset of objects that can be reliably grasped, and second, predicting whether a grasp will successfully lift one of these objects under randomly perturbed grasps that sometimes fail. In all cases, the results are consistent with the state of the art, taking advantage of advances in deep learning and convolutional neural networks from computer vision that apply to the tactile image outputs. Overall, this work demonstrates that the T-MO is an effective platform for robot hand research, and we expect it to open up a range of applications in autonomous object handling. Video: https://youtu.be/oZ41U5pyK6Y
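As a concrete illustration of the kind of pipeline this abstract alludes to, the following is a minimal sketch of a convolutional network classifying tactile images, assuming a single 64x64 grayscale frame from a fingertip camera; the architecture, input size, and class count are illustrative assumptions, not the network reported in the paper.

```python
# Minimal sketch of a CNN over tactile images (illustrative architecture;
# the paper's exact network, input size, and class count are assumptions).
import torch
import torch.nn as nn

class TactileCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Two conv/pool stages over a single-channel tactile image.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # 64x64 input -> 16x16 feature maps after two 2x2 poolings.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Frames from the three fingertips could instead be stacked as channels;
# here one 64x64 grayscale frame is assumed. For grasp-success prediction
# the same backbone would simply end in two classes (lift / fail).
model = TactileCNN(num_classes=10)
logits = model(torch.randn(1, 1, 64, 64))
print(logits.shape)  # torch.Size([1, 10])
```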
Abstract: Here we propose and investigate a novel vibrissal tactile sensor, the TacWhisker array, based on modifying a 3D-printed optical cutaneous (fingertip) tactile sensor, the TacTip. Two versions are considered: a static TacWhisker array analogous to immotile tactile vibrissae (e.g. rodent microvibrissae) and a dynamic TacWhisker array analogous to motile tactile vibrissae (e.g. rodent macrovibrissae). Performance is assessed on an active object localization task. The whisking motion of the dynamic TacWhisker array leads to millimetre-scale location perception, whereas perception with the static TacWhisker array is relatively poor when making dabbing contacts. The dynamic sensor output is dominated by a self-generated motion signal, which can be compensated for by comparing against a reference signal. Overall, the TacWhisker arrays give rise to a new class of tactile whiskered robots that benefit from being relatively inexpensive and customizable. Furthermore, the biomimetic basis of the TacWhiskers fits well with building an embodied model of the rodent sensory system for investigating animal perception. A video demonstrating this robot can be seen at https://www.youtube.com/watch?v=ksS177ep6yY
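The reference-signal compensation described above can be sketched as subtracting a contact-free whisk (recorded with no object present and assumed phase-aligned) from the measured whisk, leaving only contact-induced deflections; the array shapes and the synthetic signals below are assumptions for illustration, not the paper's implementation.

```python
# Sketch of reference-signal compensation for a whisking sensor:
# subtract a contact-free "reference" whisk from the measured whisk so
# that only contact-induced deflections remain. Shapes and signals are
# illustrative assumptions.
import numpy as np

def compensate(measured: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Remove the self-generated motion component from whisker signals.

    measured, reference: arrays of shape (T, N) -- T samples over one
    whisk cycle (assumed phase-aligned) for N whisker-tip markers.
    """
    return measured - reference

rng = np.random.default_rng(0)
T, N = 200, 19                     # samples per whisk cycle, whisker count
phase = np.linspace(0, 2 * np.pi, T)[:, None]
self_motion = np.sin(phase) * np.ones((1, N))  # motion common to all whiskers
contact = np.zeros((T, N))
contact[80:120, 3] = 0.5                       # brief contact on whisker 3

reference = self_motion                        # recorded with no object present
measured = self_motion + contact + 0.01 * rng.standard_normal((T, N))

residual = compensate(measured, reference)
print(int(np.abs(residual).mean(axis=0).argmax()))  # -> 3 (contacted whisker)
```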
Abstract: Tactile contact provides a wide range of features, each carrying different aspects of information that can be used for object grasping, manipulation, and perception. In this paper, we demonstrate inference of several key tactile features (tip displacement, contact location, and shear direction and magnitude) by introducing a novel method of transducing a third dimension to the sensor data via Voronoi tessellation. The inferred features are displayed throughout the work in a new visualisation mode derived from the Voronoi tessellation; these visualisations make it easier to interpret data from an optical tactile sensor that measures local shear from the displacement of internal pins (the TacTip). The output values of tip displacement and shear magnitude are calibrated to appropriate mechanical units, and the direction of shear inferred from the sensor is validated. We show that these methods can infer the direction of shear to $\sim$2.3$^{\circ}$ without the need for training a classifier or regressor. The approach demonstrated here will increase the versatility and generality of these sensors, allowing them to be used in more unstructured and unknown environments and improving their use in more complex systems such as robot hands.
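A minimal sketch of the Voronoi idea, assuming a regular pin grid standing in for the TacTip's internal pin layout: tessellate the pin positions, take each cell's area as the transduced third dimension (local spreading of pins under contact inflates nearby cells), and read shear direction straight from the mean pin displacement, with no trained classifier or regressor. The grid, contact model, and shear values are illustrative assumptions.

```python
# Sketch of Voronoi transduction: cell area as a depth-like third channel,
# shear direction from mean pin displacement. Grid and contact model are
# illustrative assumptions, not the paper's calibration.
import numpy as np
from scipy.spatial import Voronoi

def cell_areas(points: np.ndarray) -> np.ndarray:
    """Area of each bounded Voronoi cell (NaN for unbounded border cells)."""
    vor = Voronoi(points)
    areas = np.full(len(points), np.nan)
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if -1 in region or len(region) == 0:
            continue  # unbounded cell at the sensor border
        x, y = vor.vertices[region, 0], vor.vertices[region, 1]
        # Shoelace formula for the polygon area.
        areas[i] = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
    return areas

# Regular 7x7 pin grid standing in for the internal pin layout.
g = np.linspace(0, 1, 7)
pins_rest = np.array([(x, y) for x in g for y in g])

# Deformation = uniform shear at a known angle + local radial spread
# around a contact centre.
true_angle = np.deg2rad(30.0)
shear = 0.02 * np.array([np.cos(true_angle), np.sin(true_angle)])
centre = np.array([0.5, 0.5])
r = pins_rest - centre
dist = np.linalg.norm(r, axis=1, keepdims=True)
bump = 0.01 * np.exp(-(dist / 0.2) ** 2)
pins_def = pins_rest + shear + bump * r / np.maximum(dist, 1e-9)

# Shear direction from the mean pin displacement (radial terms cancel).
d = (pins_def - pins_rest).mean(axis=0)
print(np.rad2deg(np.arctan2(d[1], d[0])))          # ~30 degrees

# Cell-area change is the transduced third dimension; its peak locates
# the contact.
delta_area = cell_areas(pins_def) - cell_areas(pins_rest)
print(pins_rest[np.nanargmax(delta_area)])         # ~[0.5, 0.5]
```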