Abstract:Distributed tactile sensing for multi-force detection is crucial for various aerial robot interaction tasks. However, current contact sensing solutions for drones rely on a single end-effector sensor and cannot provide distributed multi-contact sensing. We propose an optical tactile sensor, designed to be easily mounted at the bottom of a drone, that features a large, curved soft sensing surface, a hollow structure, and a new illumination system. Even when contacts are spaced only 2 cm apart, our software pipeline can detect them simultaneously and provides real-world quantities of 3D contact locations (mm) and 3D force vectors (N), with accuracies of 1.5 mm and 0.17 N, respectively. We demonstrate the sensor's applicability and reliability onboard and in real time with two demos: i) estimating the compliance of different perches and subsequently re-aligning and landing on the stiffer one, and ii) mapping sparse obstacles. Our distributed tactile sensor represents a significant step towards attaining the full potential of drones as versatile robots capable of interacting with and navigating within complex environments.
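To make the abstract's pipeline concrete, the sketch below shows one plausible way to turn membrane marker displacements into per-contact 3D locations and force vectors. It is not the authors' pipeline: the greedy clustering rule, the `stiffness` calibration constant, and all thresholds are hypothetical assumptions.

```python
import numpy as np

# Minimal sketch (not the authors' pipeline): cluster displaced markers into
# separate contacts, then map each cluster to a 3D location and force vector
# with a hypothetical pre-calibrated linear model.

def detect_contacts(marker_xyz, marker_disp, disp_thresh=0.3, min_separation=20.0):
    """marker_xyz: (N,3) rest positions [mm]; marker_disp: (N,3) displacements [mm]."""
    moved = np.linalg.norm(marker_disp, axis=1) > disp_thresh
    clusters = []
    for i in np.flatnonzero(moved):          # naive greedy clustering by distance
        for c in clusters:
            if np.linalg.norm(marker_xyz[i] - marker_xyz[c[0]]) < min_separation:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters

def estimate_contact(marker_xyz, marker_disp, cluster, stiffness=0.8):
    """Return (location [mm], force [N]) for one cluster; `stiffness` is a
    hypothetical calibration constant (N per mm of summed displacement)."""
    pts, disp = marker_xyz[cluster], marker_disp[cluster]
    w = np.linalg.norm(disp, axis=1)
    location = (pts * w[:, None]).sum(0) / w.sum()   # displacement-weighted centroid
    force = stiffness * disp.sum(0)                  # linear elastic approximation
    return location, force
```

In a real system the displacement-to-force mapping would come from calibration against a force/torque reference; the linear elastic approximation above is only a stand-in.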
Abstract:This work presents the mechanical design and control of a novel small-size and lightweight Micro Aerial Vehicle (MAV) for aerial manipulation. To our knowledge, with a total take-off mass of only 2.0 kg, the proposed system is the most lightweight Aerial Manipulator (AM) with 8 independently controllable DOF: 5 for the aerial platform and 3 for the articulated arm. We designed the robot to be fully actuated in the body forward direction, which allows independent pitching and instantaneous force generation and improves the platform's performance during physical interaction. The robotic arm is an origami delta manipulator driven by three servomotors, enabling active motion compensation at the end-effector. Its composite multimaterial links reduce the weight, while their flexibility allows for compliant aerial interaction with the environment; in particular, the arm's stiffness changes with its configuration. We provide an in-depth discussion of the system design and characterize the stiffness of the delta arm. A control architecture that handles the platform's overactuation while exploiting the delta arm is presented. Its capabilities are experimentally illustrated both in free flight and in physical interaction, highlighting advantages and disadvantages of the origami folding mechanism.
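As a rough illustration of how a delta arm's stiffness can depend on its configuration, as the abstract notes, the sketch below maps an assumed joint-space stiffness to Cartesian end-effector stiffness through the manipulator Jacobian, K_x = J^{-T} K_q J^{-1}. The `delta_jacobian` placeholder and all numerical values are hypothetical and not taken from the paper.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's model): map joint-space
# stiffness of a delta arm to Cartesian end-effector stiffness at a given
# configuration via the manipulator Jacobian, K_x = J^{-T} K_q J^{-1}.

def delta_jacobian(q):
    """Hypothetical 3x3 Jacobian mapping actuated joint rates [rad/s] to
    end-effector linear velocity [m/s] at configuration q (rad).
    A real implementation would derive it from the closed-chain kinematics."""
    return np.diag(0.06 + 0.02 * np.cos(q))          # placeholder, [m/rad]

def cartesian_stiffness(q, k_joint=(2.0, 2.0, 2.0)):
    """Cartesian stiffness matrix [N/m] from joint stiffness [Nm/rad]."""
    J = delta_jacobian(q)
    K_q = np.diag(k_joint)
    J_inv = np.linalg.inv(J)
    return J_inv.T @ K_q @ J_inv

if __name__ == "__main__":
    for q in (np.zeros(3), np.full(3, 0.8)):
        print(q, np.diag(cartesian_stiffness(q)))    # stiffness changes with pose
```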
Abstract:Gesture-based interfaces are often used to achieve a more natural and intuitive teleoperation of robots. Yet gesture control sometimes requires postures or movements that cause significant fatigue to the user. In a previous user study, we demonstrated that naïve users can control a fixed-wing drone with torso movements while their arms are spread out. However, this posture induced significant arm fatigue. In this work, we present a passive arm support that compensates the arm weight with a mean torque error smaller than 0.005 N/kg for more than 97% of the range of motion used by subjects to fly, thereby reducing shoulder muscular fatigue by 58% on average. In addition, the arm support is designed to fit users from the body dimensions of the 1st-percentile female to those of the 99th-percentile male. The performance of the arm support is described with a mechanical model, and its implementation is validated with both a mechanical characterization and a user study that measures flight performance, shoulder muscle activity, and user acceptance.
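To illustrate the kind of static model that underlies a passive arm-weight compensator, the sketch below balances the gravity torque of an outstretched arm with an ideal zero-free-length spring and reports the residual torque error normalized by body mass. The geometry, arm mass, spring mismatch, and range of motion are all illustrative assumptions, not the paper's mechanism or values.

```python
import numpy as np

# Minimal sketch (not the paper's mechanism): static gravity balancing of an
# outstretched arm with an ideal zero-free-length spring between a post above
# the shoulder pivot and the upper arm. All parameters are assumptions.

G = 9.81                      # gravity [m/s^2]

def gravity_torque(theta, m_arm=3.5, l_com=0.30):
    """Shoulder torque [Nm] to hold the arm at elevation theta [rad,
    0 = horizontal], for arm mass m_arm [kg] with COM at l_com [m]."""
    return m_arm * G * l_com * np.cos(theta)

def spring_torque(theta, k, a=0.06, b=0.05):
    """Torque [Nm] from an ideal zero-free-length spring balancer with
    stiffness k [N/m] and attachment distances a, b [m]: tau = k*a*b*cos(theta)."""
    return k * a * b * np.cos(theta)

if __name__ == "__main__":
    a, b = 0.06, 0.05
    k_ideal = gravity_torque(0.0) / (a * b)        # spring that balances exactly
    theta = np.deg2rad(np.linspace(-30, 60, 91))   # assumed flight range of motion
    err = gravity_torque(theta) - spring_torque(theta, k=0.98 * k_ideal, a=a, b=b)
    body_mass = 70.0                               # hypothetical user [kg]
    print("mean |torque error| normalized by body mass:",
          np.mean(np.abs(err)) / body_mass)
```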
Abstract:Most human-robot interfaces, such as joysticks and keyboards, require training and constant cognitive effort and provide only limited awareness of the robot's state and its environment. Embodied interactions, rather than interfaces, could bridge the gap between humans and robots, allowing humans to naturally perceive and act through a distal robotic body. Establishing an embodied interaction and mapping human movements to a non-anthropomorphic robot is particularly challenging. In this paper, we describe a natural and immersive embodied interaction that allows users to control and experience drone flight with their own bodies. The setup uses a commercial flight simulator that tracks hand movements and provides haptic and visual feedback. The paper discusses how to integrate the simulator with a real drone, how to map body movement to drone motion, and how the resulting embodied interaction provides unskilled users with a more natural and immersive flight experience than a conventional RC remote controller.
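A minimal sketch of one possible body-to-drone mapping is given below: tracked body roll and pitch are passed through a deadband, scaled, and clamped into attitude setpoints for the drone. The gains, deadband, and limits are assumptions for illustration; the paper's actual mapping through the flight simulator may differ.

```python
import numpy as np

# Minimal sketch (not the paper's exact mapping): convert tracked body
# orientation (roll/pitch, rad) into attitude setpoints for the drone, with a
# small deadband to reject posture noise and limits for safety.

def deadband(x, width):
    """Zero small inputs, then shift larger ones so the mapping stays continuous."""
    return 0.0 if abs(x) < width else x - np.sign(x) * width

def body_to_drone_setpoint(body_roll, body_pitch,
                           gain=1.5, db=np.deg2rad(2), limit=np.deg2rad(25)):
    """Return (roll_sp, pitch_sp) in rad for the drone attitude controller."""
    roll_sp = np.clip(gain * deadband(body_roll, db), -limit, limit)
    pitch_sp = np.clip(gain * deadband(body_pitch, db), -limit, limit)
    return roll_sp, pitch_sp

# Example: a 10-degree lean to the right maps to a bounded roll command.
print(np.rad2deg(body_to_drone_setpoint(np.deg2rad(10), 0.0)))
```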