Abstract: Stiffness estimation is crucial for delicate object manipulation in robotic and prosthetic hands but remains challenging due to its dependence on force and displacement measurements and on real-time sensory integration. This study presents a piezoelectric sensing framework for stiffness estimation at first contact during pinch grasps, addressing the limitations of traditional force-based methods. A multimodal tactile sensor inspired by human skin, capturing both vibrational and force data, is developed and integrated into the fingertip of a prosthetic hand. Machine learning models, including support vector machines and convolutional neural networks, demonstrate that vibrational signals within the critical 15 ms after first contact reliably encode stiffness, achieving classification accuracies of up to 98.6\% and regression errors as low as 2.39 Shore A on real-world objects of varying stiffness. Inference times of less than 1.5 ms are significantly faster than the average grasp closure time (16.65 ms in our dataset), enabling real-time stiffness estimation before the object is fully grasped. By leveraging the transient asymmetry in grasp dynamics, where one finger contacts the object before the others, this method enables early grasp modulation, enhancing safety and intuitiveness in prosthetic hands while offering broad applications in robotics.
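A minimal sketch of the kind of classification pipeline the abstract describes: an SVM trained on features extracted from the 15 ms post-contact vibration window. The sampling rate, feature set, and toy data below are assumptions for illustration only; the paper's actual features, CNN architecture, and hyperparameters are not reproduced here.

```python
# Sketch only: classify stiffness from a 15 ms first-contact vibration window.
# Sampling rate, features, and data are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

FS = 10_000             # assumed piezoelectric sampling rate (Hz)
WINDOW_S = 0.015        # 15 ms post-contact window, as in the abstract
N = int(FS * WINDOW_S)

def features(window: np.ndarray) -> np.ndarray:
    """Simple time/frequency features from one vibration window (placeholder set)."""
    spectrum = np.abs(np.fft.rfft(window))
    return np.array([
        window.std(),                          # amplitude spread
        np.abs(np.diff(window)).mean(),        # mean absolute slope
        spectrum.argmax() * FS / len(window),  # dominant frequency (Hz)
        spectrum.max(),                        # peak spectral magnitude
    ])

# Toy data: each row is one 15 ms window, labels are stiffness classes (e.g., Shore A bins).
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((200, N))
y = rng.integers(0, 3, size=200)

X = np.stack([features(w) for w in X_raw])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print("predicted stiffness class:", clf.predict(X[:1]))
```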
Abstract: Humans have an exquisite sense of touch that robotic and prosthetic systems aim to recreate. We developed algorithms to create neuron-like (neuromorphic) spiking representations of texture that are invariant to the scanning speed and contact force applied in the sensing process. The spiking representations mimic the activity of mechanoreceptors in human skin and its subsequent processing along the pathway to the brain. The neuromorphic encoding process transforms analog sensor readings into speed- and force-invariant spiking representations in three sequential stages: the force invariance module (in the analog domain), the spiking activity encoding module (which transforms from the analog to the spiking domain), and the speed invariance module (in the spiking domain). The algorithms were tested on a tactile texture dataset collected in 15 speed-force conditions. An offline texture classification system built on the invariant representations achieves higher classification accuracy, improved computational efficiency, and increased capability to identify textures explored in novel speed-force conditions. The speed invariance algorithm was adapted to a real-time human-operated texture classification system. Here too, the invariant representations improved classification accuracy, computational efficiency, and the capability to identify textures explored in novel conditions. The invariant representation is even more crucial in this context because human imprecision appears to the classification system as a novel condition. These results demonstrate that invariant neuromorphic representations enable better-performing neurorobotic tactile sensing systems. Furthermore, because the neuromorphic representations are based on biological processing, this work can serve in the future as the basis for naturalistic sensory feedback for upper limb amputees.
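A minimal sketch of the three-stage encoding pipeline named in the abstract (force invariance, spike encoding, speed invariance). The specific transforms below, force normalization, delta-modulation spike encoding, and speed-rescaled spike times, are illustrative stand-ins; the abstract does not specify the actual algorithms, and they are not reproduced here.

```python
# Sketch only: three-stage neuromorphic encoding with placeholder transforms.
import numpy as np

def force_invariance(signal: np.ndarray, contact_force: float) -> np.ndarray:
    """Stage 1 (analog domain): scale the raw tactile signal by the applied force."""
    return signal / max(contact_force, 1e-6)

def encode_spikes(signal: np.ndarray, fs: float, threshold: float = 0.05) -> np.ndarray:
    """Stage 2 (analog -> spiking): emit a spike whenever the accumulated
    change in the signal exceeds a threshold (simple delta modulation)."""
    spike_times, acc, last = [], 0.0, signal[0]
    for i, x in enumerate(signal[1:], start=1):
        acc += abs(x - last)
        last = x
        if acc >= threshold:
            spike_times.append(i / fs)
            acc = 0.0
    return np.array(spike_times)

def speed_invariance(spike_times: np.ndarray, scan_speed: float,
                     reference_speed: float = 40.0) -> np.ndarray:
    """Stage 3 (spiking domain): rescale spike times so textures scanned at
    different speeds map onto a common spatial reference frame."""
    return spike_times * (scan_speed / reference_speed)

# Toy usage: a sinusoidal "texture" scanned at 60 mm/s under 2 N of contact force.
fs = 1000.0
t = np.arange(0.0, 1.0, 1 / fs)
raw = 2.0 * np.sin(2 * np.pi * 8 * t)                 # analog sensor reading
spikes = encode_spikes(force_invariance(raw, 2.0), fs)
aligned = speed_invariance(spikes, scan_speed=60.0)
print(f"{len(spikes)} spikes, first aligned spike at {aligned[0]:.3f} s")
```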
Abstract: High-speed tactile arrays are essential for real-time robotic control in unstructured environments, but high pixel counts limit the readout rates of most large tactile arrays to below 100 Hz. We introduce ACTS (adaptive compressive tactile subsampling), a method that efficiently samples tactile matrices and reconstructs interactions using sparse recovery and a learned tactile dictionary. Tested on a 1024-pixel sensor array (32×32), ACTS increased frame rates by 18× compared to raster scanning, with minimal error. For the first time in large-area tactile skin, we demonstrate rapid object classification within 20 ms of contact, high-speed projectile detection, ricochet angle estimation, and deformation tracking through enhanced spatiotemporal resolution. Our method can be implemented in firmware, upgrading existing low-cost, flexible, and robust tactile arrays into high-resolution systems for large-area spatiotemporal touch sensing.
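A minimal sketch of compressive subsampling with sparse recovery on a 32×32 frame, in the spirit of ACTS. The DCT dictionary, random sampling mask, and orthogonal matching pursuit solver below are stand-ins: the paper uses a learned tactile dictionary and an adaptive sampling pattern, neither of which is reproduced here, and those are what make aggressive subsampling work in practice.

```python
# Sketch only: subsample ~1/18 of a 32x32 tactile frame and reconstruct it
# via sparse recovery in a placeholder (DCT) dictionary.
import numpy as np
from scipy.fft import dct
from sklearn.linear_model import OrthogonalMatchingPursuit

N = 32 * 32                      # full tactile frame (1024 pixels)
M = N // 18                      # ~18x fewer measurements than a raster scan

# Sparsifying dictionary: separable 2D DCT basis as a placeholder for a learned one.
dct1d = dct(np.eye(32), norm="ortho", axis=0)
D = np.kron(dct1d, dct1d)        # (1024, 1024) orthonormal dictionary

# Random subsampling mask (the real method adapts the mask per frame).
rng = np.random.default_rng(0)
idx = rng.choice(N, size=M, replace=False)

# Synthetic frame: one localized contact patch on the sensor.
frame = np.zeros((32, 32))
frame[10:14, 20:24] = 1.0
x = frame.ravel()

y = x[idx]                       # subsampled pixel readings
A = D[idx, :]                    # dictionary restricted to the sampled pixels

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=16, fit_intercept=False)
omp.fit(A, y)                    # sparse coefficients explaining the measurements
x_hat = D @ omp.coef_            # reconstructed full-resolution frame
print("relative reconstruction error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```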