Abstract: Soft pneumatic actuators (SPAs) made from elastomeric materials can provide large strain and large force. The behavior of locally strain-restricted hyperelastic materials under inflation has been investigated thoroughly for shape reconfiguration, but it requires further investigation for trajectories involving external force. In this work, we model force-pressure-height relationships for a concentrically strain-limited class of soft pneumatic actuators and demonstrate the use of this model to design SPA response for object lifting. We predict these relationships under different loadings by solving energy minimization equations and verify the theory using an automated test rig to collect rich data for n = 22 Ecoflex 00-30 membranes. We collect these data with an active learning pipeline to efficiently model the design space. We show that this learned material model outperforms both the theory-based model and naive curve-fitting approaches. Finally, we use our model to optimize membrane design for different lift tasks and compare its performance to that of other designs. These contributions represent a step towards understanding the natural response of this class of actuator and embodying intelligent lifts in a single-pressure-input actuator system.
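As a rough illustration of the kind of active-learning loop this abstract describes, the sketch below performs uncertainty sampling with a Gaussian-process surrogate over a one-dimensional design variable. It is a minimal sketch under stated assumptions, not the paper's pipeline: the GP model, the strain-limiter-radius design variable, and the `measure_membrane()` stand-in for the automated test rig are all hypothetical.

```python
# Minimal sketch of an uncertainty-sampling active-learning loop over an SPA
# design space. The GP surrogate, 1-D design variable, and measure_membrane()
# are illustrative assumptions, not the paper's actual pipeline.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def measure_membrane(design_mm: float) -> float:
    """Stand-in for the automated test rig (e.g. lift force at a reference
    pressure). Synthetic response here so the sketch runs end-to-end."""
    return float(np.sin(design_mm / 4.0) + 0.05 * rng.standard_normal())

# Candidate designs: an illustrative sweep over strain-limiter radius (mm).
candidates = np.linspace(5.0, 25.0, 200).reshape(-1, 1)

X = [candidates[len(candidates) // 2]]  # seed with one design
y = [measure_membrane(X[0][0])]

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                              alpha=1e-3, normalize_y=True)
for _ in range(21):                      # e.g. up to n = 22 membranes total
    gp.fit(np.vstack(X), np.array(y))
    _, std = gp.predict(candidates, return_std=True)
    nxt = candidates[int(np.argmax(std))]  # query the most uncertain design
    X.append(nxt)
    y.append(measure_membrane(nxt[0]))
```

Uncertainty sampling is one common acquisition choice; swapping in an information-gain or expected-improvement criterion would change only the `argmax` line.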
Abstract: Soft robotic actuators are safe and adaptable devices with inherent compliance, which makes them attractive for manipulating delicate and complex objects. Researchers have integrated stiff materials into soft actuators to increase their force capacity and direct their deformation. However, these embedded materials have largely been pre-prescribed and static, which constrains the actuators to a predetermined range of motion. In this work, electroadhesive (EA) clutches integrated on a single-chamber soft pneumatic actuator (SPA) provide local, programmable stiffness modulation to control the actuator's deformation. We show that activating different clutch patterns inflates a silicone membrane into pyramidal, round, and plateau shapes. Curvatures from these shapes are combined during actuation to apply forces to both a 3.7 g and an 820 g object along five different degrees of freedom (DoF). The actuator workspace extends up to 12 mm for light objects. Clutch deactivation, which produces local elastomeric expansion, rapidly applies forces of up to 3.2 N to an object resting on the surface and launches a 3.7 g object in controlled directions. The actuator also rotates a heavier (820 g) object by 5 degrees and rapidly restores it to horizontal alignment after clutch deactivation. The actuator is fully powered by a 5 V battery, an AA battery, a DC-DC transformer, and a 4.5 V (63 g) DC air pump. These results demonstrate a first step towards realizing a soft actuator with high-DoF shape change that preserves the inherent benefits of pneumatic actuation while gaining the electrical controllability and strength of EA clutches. We envision such a system supplying human-contact forces in the form of a low-profile sit-to-stand assistance device, a bed-ridden patient manipulator, or another ergonomic mechanism. This technology was also demonstrated at ICRA 2022: https://www.youtube.com/watch?v=6Y6-iHWNi6s
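The control flow implied here (engage a clutch pattern, inflate, then deactivate clutches for a rapid force release) could look roughly like the sketch below. The abstract does not specify a control stack, so the `Clutch` and `Pump` interfaces, the four-clutch pattern, and the timing are all invented for illustration.

```python
# Hypothetical control sequence for a clutch-patterned SPA inflation and
# rapid-release "launch". Hardware interfaces and timings are assumptions.
import time

class Clutch:
    """Stand-in driver for one EA clutch (e.g. via a high-voltage switch)."""
    def __init__(self, channel: int):
        self.channel = channel
    def engage(self):
        print(f"clutch {self.channel}: engaged")
    def release(self):
        print(f"clutch {self.channel}: released")

class Pump:
    """Stand-in driver for the DC air pump."""
    def on(self):
        print("pump: on")
    def off(self):
        print("pump: off")

clutches = [Clutch(i) for i in range(4)]  # illustrative 4-clutch pattern
pump = Pump()

# 1) Engage a pattern to locally stiffen the membrane (e.g. pyramidal shape).
for c in clutches[:3]:
    c.engage()
# 2) Inflate: the unclutched regions expand into the programmed shape.
pump.on(); time.sleep(2.0); pump.off()
# 3) Deactivate clutches: local elastomeric expansion rapidly releases force.
for c in clutches[:3]:
    c.release()
```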
Abstract: The most common sensing modalities in a robot perception system are vision and touch, which together provide global and highly localized data for manipulation. However, these modalities often fail to adequately capture the behavior of a target object during the critical moments when it transitions from static, controlled contact with an end-effector to dynamic, uncontrolled motion. In this work, we present a novel multimodal visuotactile sensor that provides simultaneous visuotactile and proximity depth data. The sensor integrates an RGB camera and an air pressure sensor for touch sensing with an infrared time-of-flight (ToF) camera for proximity sensing, leveraging a selectively transmissive soft membrane to enable the dual sensing modalities. We present the sensor's mechanical design, fabrication techniques, algorithm implementations, and an evaluation of its tactile and proximity modalities. The sensor is demonstrated in three open-loop robotic tasks: approaching and contacting an object, catching, and throwing. The fusion of tactile and proximity data could be used to capture key information about a target object's transition behavior for sensor-based control in dynamic manipulation.
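One plausible way to fuse the two streams, sketched below purely as an assumption (the abstract does not detail the sensor's algorithms), is to report ToF proximity depth during approach and declare contact once membrane pressure rises past a baseline threshold. The driver functions, threshold value, and `SensorState` fields are all hypothetical.

```python
# Hypothetical tactile/proximity fusion: report ToF depth while approaching,
# switch to a contact state once membrane pressure exceeds a set threshold.
from dataclasses import dataclass

CONTACT_THRESHOLD_PA = 150.0  # assumed pressure rise over baseline at contact

@dataclass
class SensorState:
    in_contact: bool
    depth_mm: float      # nearest-point proximity depth from the ToF camera
    pressure_pa: float   # membrane air pressure above baseline

def read_pressure() -> float:
    """Stand-in for the air pressure sensor driver."""
    raise NotImplementedError

def read_tof_depth() -> float:
    """Stand-in for the ToF camera: nearest depth through the membrane (mm)."""
    raise NotImplementedError

def sense() -> SensorState:
    p = read_pressure()
    d = read_tof_depth()
    return SensorState(in_contact=p > CONTACT_THRESHOLD_PA,
                       depth_mm=d, pressure_pa=p)
```

A controller could poll `sense()` and switch from a proximity-driven approach phase to a tactile-driven grasp or release phase when `in_contact` flips, which matches the static-to-dynamic transition the abstract highlights.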