Abstract: Grasping the same object in different postures is often necessary, especially when handling tools or stacked items. Because object properties are unknown and the grasping posture changes, the required grasping force is uncertain and variable. Traditional methods rely on real-time feedback to control the grasping force cautiously, aiming to prevent slipping or damage. However, they overlook reusable information from the initial grasp, treating each subsequent regrasping attempt as if it were the first, which significantly reduces efficiency. To improve this, we propose a method that uses perception from prior grasping attempts to predict the required grasping force, even when the grasping posture changes. We also introduce a calculation method that accounts for fingertip softness and object asymmetry. Theoretical analyses demonstrate the feasibility of predicting grasping forces across various postures after a single grasp. Experimental verification attests to the accuracy and adaptability of our prediction method. Furthermore, the results show that incorporating the predicted grasping force into feedback-based approaches significantly enhances grasping efficiency across a range of everyday objects.
Abstract: Object reorientation is a critical task for robotic grippers, especially when manipulating objects within constrained environments. The task poses significant challenges for motion planning because high-dimensional output actions must be derived from complex input information, including unknown object properties and nonlinear contact forces. Traditional approaches simplify the problem by reducing degrees of freedom, limiting contact forms, or acquiring environment/object information in advance, which significantly compromises adaptability. To address these challenges, we decompose the complex output actions into three fundamental types based on tactile sensing: task-oriented actions, constraint-oriented actions, and coordinating actions. These actions are then optimized online using gradient-based optimization to enhance adaptability. Key contributions include simplifying contact-state perception, decomposing complex gripper actions, and enabling online action optimization for handling unknown objects or environmental constraints. Experimental results demonstrate that the proposed method is effective across a range of everyday objects, with or without environmental contact. Additionally, the method exhibits robust performance even in the presence of unknown contacts and nonlinear external disturbances.
Abstract: Incipient slip detection provides critical feedback for robotic grasping and manipulation tasks. However, maintaining its adaptability under diverse object properties and complex working conditions remains challenging. This article highlights the importance of fully representing the spatio-temporal features of slip and proposes a novel approach for incipient slip modeling and detection. Based on an analysis of the localized displacement phenomenon, we establish the relationship between characteristic strain-rate extreme events and the local slip state. This approach enables detection of both the spatial distribution and the temporal dynamics of stick-slip regions. The proposed method is also applicable to strain-distribution sensing devices, such as vision-based tactile sensors. Simulations and prototype experiments validated the effectiveness of the approach under varying contact conditions, including different contact geometries, friction coefficients, and combined loads. The experiments demonstrated that this method not only delineates incipient slip accurately and reliably, but also facilitates friction parameter estimation and adaptive grasping control.