Abstract: Humans can steadily and gently grasp unfamiliar objects based on tactile perception. Robots still struggle to achieve similar performance because accurate grasp-force predictions and force control strategies are difficult to generalize from limited data. In this article, we propose an approach for learning grasping from ideal force control demonstrations, achieving performance comparable to human hands with a limited amount of data. Our approach uses objects with known contact characteristics to automatically generate reference force curves without human demonstrations. In addition, we design a dual convolutional neural network (Dual-CNN) architecture that incorporates a physics-based mechanics module to learn target grasping-force predictions from demonstrations. The described method can be applied effectively to vision-based tactile sensors and enables gentle, stable grasping of objects from the ground. The prediction model and grasping strategy were validated in offline evaluations and online experiments, demonstrating their accuracy and generalizability.
Abstract: For elastomer-based tactile sensors, represented by visuotactile sensors, routine calibration of mechanical parameters (Young's modulus and Poisson's ratio) has been shown to be important for force reconstruction. However, the reliance of existing in-situ calibration methods on accurate force measurements limits their cost-effective and flexible application. This article proposes a new in-situ calibration scheme that relies only on comparing contact deformations. Based on detailed derivations of normal contact and torsional contact theories, we designed a simple and low-cost calibration device, EasyCalib, and validated its effectiveness through extensive finite element analysis. We also evaluated the accuracy of EasyCalib in practical applications and demonstrated that accurate reconstruction of distributed contact forces can be achieved from the mechanical parameters obtained. EasyCalib balances low hardware cost, ease of operation, and low dependence on technical expertise, and is expected to provide the accuracy guarantees necessary for widespread application of visuotactile sensors in the wild.
Abstract: The importance of force perception in interacting with the environment has long been established. However, measuring the contact force distribution accurately in real time remains a challenge. To address this challenge, we propose a new vision-based tactile sensor, the Tac3D sensor, for measuring the three-dimensional contact surface shape and contact force distribution. In this work, virtual binocular vision is applied to a tactile sensor for the first time, allowing the Tac3D sensor to measure three-dimensional tactile information simply and efficiently, with the advantages of a simple structure, low computational cost, and low price. We then use the contact surface shape and force distribution to estimate the friction coefficient distribution in the contact region. Further, combined with the global position of the tactile sensor, a 3D model of the object annotated with the friction coefficient distribution is reconstructed. These reconstruction experiments not only demonstrate the excellent performance of the Tac3D sensor but also suggest the possibility of optimizing grasp action planning based on the friction coefficient distribution of the object.
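To illustrate the kind of computation the third abstract describes, the sketch below estimates a per-point lower bound on the local friction coefficient from a measured contact force distribution, under a simple Coulomb friction model: at slipping points the ratio of tangential to normal force equals the kinetic friction coefficient, while at sticking points it only bounds it from below. This is a minimal illustration under assumed conventions (forces as `[fx, fy, fz]` with `fz` normal to the surface), not the authors' actual estimation pipeline, and the function name is hypothetical.

```python
import numpy as np

def friction_coefficient_lower_bound(force_dist):
    """Per-point lower bound on the local friction coefficient.

    force_dist: (N, 3) array of contact forces [fx, fy, fz] at N contact
    points, with fz taken as the normal component (assumed convention).
    Under the Coulomb model, tangential/normal equals the friction
    coefficient at slipping points and lower-bounds it at sticking points.
    """
    f = np.asarray(force_dist, dtype=float)
    normal = np.abs(f[:, 2])
    tangential = np.linalg.norm(f[:, :2], axis=1)
    # Skip points with negligible normal load to avoid division by zero
    valid = normal > 1e-9
    mu = np.zeros(len(f))
    mu[valid] = tangential[valid] / normal[valid]
    return mu

# Example: two contact points
forces = np.array([[0.3, 0.4, 1.0],   # tangential 0.5, normal 1.0 -> mu >= 0.5
                   [0.0, 0.2, 2.0]])  # tangential 0.2, normal 2.0 -> mu >= 0.1
print(friction_coefficient_lower_bound(forces))  # [0.5 0.1]
```

In practice, such per-point estimates would be aggregated over the sensor's contact region (and over time, as slip onset is observed) to build the friction coefficient distribution mapped onto the reconstructed object model.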