ISIR
Abstract: The human hand is an immensely sophisticated tool, adept at manipulating and grasping objects of unknown characteristics. Its capability lies in perceiving interaction dynamics through touch and adjusting the direction and magnitude of contact forces to ensure successful manipulation. Despite ongoing research and advances in control algorithms, sensing technologies, and compliance integration, precise finger force control for dexterous manipulation using tactile sensing remains relatively unexplored. In this work, we examine the challenges of individual finger contact force control and propose a method for directing such forces as perceived through tactile sensing. The proposed method is evaluated on an Allegro hand equipped with Xela tactile sensors. Results are presented and discussed, alongside considerations for future improvements.
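The per-finger force regulation described in this abstract can be sketched as a simple proportional contact-force controller mapped to joint torques through the finger Jacobian transpose. This is a minimal illustration, not the paper's actual controller: the Jacobian values, gains, and force readings below are all hypothetical stand-ins for quantities that would come from the Allegro hand model and the Xela tactile array.

```python
import numpy as np

def force_control_step(f_measured, f_desired, jacobian, kp=0.5):
    """One control step for a single finger: proportional correction of the
    contact force error in fingertip space, mapped to joint torques via
    the Jacobian transpose. All quantities are illustrative."""
    force_error = f_desired - f_measured      # (3,) contact force error
    f_cmd = f_desired + kp * force_error      # proportional correction
    return jacobian.T @ f_cmd                 # (n_joints,) joint torques

# Toy example with a hypothetical 3-joint finger Jacobian (meters/rad).
J = np.array([[0.05, 0.03, 0.01],
              [0.00, 0.04, 0.02],
              [0.02, 0.00, 0.03]])
f_measured = np.array([0.1, 0.0, 0.2])   # force sensed at the fingertip (N)
f_desired = np.array([0.5, 0.0, 0.3])    # desired contact force (N)
tau = force_control_step(f_measured, f_desired, J)
print(tau)  # torque command for the finger's three joints
```

In a real loop this step would run at the tactile sensor's sampling rate, with `f_measured` reconstructed from the taxel array at the current contact location.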
Abstract: Dexterous in-hand manipulation is a unique and valuable human skill requiring sophisticated sensorimotor interaction with the environment while respecting stability constraints. Satisfying these constraints with generated motions is essential for a robotic platform to achieve reliable in-hand manipulation skills. Explicitly modelling these constraints can be challenging, but they can be implicitly modelled and learned through experience or human demonstrations. We propose a learning and control approach based on dictionaries of motion primitives generated from human demonstrations. To achieve this, we defined an optimization process that combines motion primitives to generate robot fingertip trajectories for moving an object from an initial to a desired final pose. Based on our experiments, our approach allows a robotic hand to handle objects like humans, adhering to stability constraints without requiring explicit formalization. In other words, the proposed motion primitive dictionaries learn and implicitly embed the constraints crucial to the in-hand manipulation task.
Abstract: Robotic capacities in object manipulation are incomparable to those of humans. Besides years of learning, humans rely heavily on the richness of information from physical interaction with the environment. In particular, tactile sensing is crucial in providing such rich feedback. Despite its potential contributions to robotic manipulation, tactile sensing remains underexploited, mainly due to the complexity of the time series provided by tactile sensors. In this work, we propose a method for assessing grasp stability using tactile sensing. More specifically, we propose a methodology to extract task-relevant features and design efficient classifiers to detect object slippage with respect to individual fingertips. We compare two classification models: support vector machine and logistic regression. We use highly sensitive Uskin tactile sensors mounted on an Allegro hand to test and validate our method. Our results demonstrate that the proposed method effectively detects slippage online.
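The slippage-detection pipeline described here (feature extraction from tactile time series, then an SVM or logistic regression classifier) can be sketched as follows. The features, window sizes, and synthetic data are illustrative assumptions, not the features or data used in the paper; real input would be force windows from the Uskin taxel arrays.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def extract_features(window):
    """Hypothetical features for one (T, 3) tactile force window:
    per-axis mean and std, plus the peak derivative magnitude
    (slip tends to produce high-frequency force transients)."""
    diffs = np.diff(window, axis=0)
    return np.concatenate([window.mean(0), window.std(0),
                           [np.abs(diffs).max()]])

# Synthetic stand-in data: "slip" windows contain sporadic large transients.
rng = np.random.default_rng(0)
stable = rng.normal(0.0, 0.05, size=(200, 50, 3))
spikes = rng.normal(0.0, 0.4, size=(200, 50, 3)) * (rng.random((200, 50, 3)) > 0.8)
slip = rng.normal(0.0, 0.05, size=(200, 50, 3)) + spikes

X = np.array([extract_features(w) for w in np.concatenate([stable, slip])])
y = np.array([0] * 200 + [1] * 200)  # 0 = stable grasp, 1 = slippage
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

for clf in (SVC(kernel="rbf"), LogisticRegression(max_iter=1000)):
    acc = clf.fit(Xtr, ytr).score(Xte, yte)
    print(type(clf).__name__, "test accuracy:", round(acc, 2))
```

Online use would slide the feature window over the incoming tactile stream and classify each window as it completes.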
Abstract: Dexterous in-hand manipulation is a distinctive and valuable human skill. This ability requires coordinating multiple senses with hand motion while adhering to many constraints, which vary with object characteristics and the specific application. A key requirement for a robotic platform to implement reliable in-hand manipulation skills is the ability to integrate those constraints into its motion generation. These constraints can be modelled implicitly and learned through experience or human demonstrations. We propose a method based on motion primitive dictionaries to learn and reproduce in-hand manipulation skills. In particular, we focus on fingertip motions during manipulation and define an optimization process that combines motion primitives to reach specific fingertip configurations. The results of this work show that the proposed approach generates manipulation motions consistent with human ones and that manipulation constraints are inherited even without explicit formalization.
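The core idea shared by the two primitive-dictionary abstracts above, combining demonstrated motion primitives to reach a target fingertip configuration, can be sketched as a small regularized least-squares problem. The dictionary here is random synthetic data and the formulation (weighting each primitive and matching only the final displacement) is an illustrative assumption, not the paper's optimization.

```python
import numpy as np

# Hypothetical primitive dictionary: each primitive is a (T, 3) fingertip
# displacement trajectory; a new motion is a weighted sum of primitives.
rng = np.random.default_rng(1)
T, n_prim = 30, 5
dictionary = rng.normal(size=(n_prim, T, 3)).cumsum(axis=1) * 0.01

def combine_primitives(dictionary, target_displacement, reg=1e-6):
    """Solve for primitive weights so the combined trajectory's final
    displacement matches the target (ridge-regularized least squares)."""
    A = dictionary[:, -1, :].T                      # (3, n_prim): weights -> final displacement
    w = np.linalg.solve(A.T @ A + reg * np.eye(len(dictionary)),
                        A.T @ target_displacement)  # (n_prim,) weights
    return w, np.tensordot(w, dictionary, axes=1)   # weights, (T, 3) trajectory

target = np.array([0.02, -0.01, 0.015])  # desired fingertip displacement (m)
w, traj = combine_primitives(dictionary, target)
print(np.round(traj[-1], 4))  # final displacement, close to the target
```

Because the combined trajectory stays in the span of demonstrated primitives, constraints implicit in the demonstrations (e.g. feasible fingertip paths) are inherited by construction, which is the intuition behind the abstracts' claim that stability constraints need no explicit formalization.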