Abstract: This study evaluates the application of a discrete action space reinforcement learning method (Q-learning) to the continuous control problem of robot inverted pendulum balancing. To speed up the learning process and to overcome technical difficulties related to learning directly on the real robotic system, the learning phase is performed in a simulation environment. A mathematical model of the system dynamics, deduced by curve fitting on data acquired from the real system, is implemented. The proposed approach proved feasible, as demonstrated by its application on a real-world robot that learned to balance an inverted pendulum. This study also reinforces and demonstrates the importance of an accurate representation of the physical world in simulation for a more efficient implementation of reinforcement learning algorithms in the real world, even when a discrete action space algorithm is used to control a continuous action.
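For illustration, a minimal Python sketch of the kind of tabular Q-learning loop described above, assuming a toy stand-in simulator (`step`), an illustrative state discretization, a small set of discrete actions and placeholder hyperparameters; these are not the curve-fitted model or the settings used in the study.

```python
import numpy as np

# Illustrative discretization and dynamics; not the curve-fitted model from the study.
ANGLE_BINS = np.linspace(-0.3, 0.3, 13)             # pendulum angle (rad)
RATE_BINS  = np.linspace(-2.0, 2.0, 13)             # angular velocity (rad/s)
ACTIONS    = np.array([-0.5, -0.1, 0.0, 0.1, 0.5])  # discrete cart accelerations (m/s^2)

def discretize(angle, rate):
    return (np.digitize(angle, ANGLE_BINS), np.digitize(rate, RATE_BINS))

def step(angle, rate, action, dt=0.02, g=9.81, l=0.5):
    """Toy inverted-pendulum-on-cart dynamics used as a stand-in simulator."""
    rate += (g * np.sin(angle) - action * np.cos(angle)) / l * dt
    angle += rate * dt
    return angle, rate

q = np.zeros((len(ANGLE_BINS) + 1, len(RATE_BINS) + 1, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.99, 0.1   # placeholder learning rate, discount, exploration

for episode in range(5000):
    angle, rate = np.random.uniform(-0.05, 0.05, size=2)
    s = discretize(angle, rate)
    for t in range(500):
        a = np.random.randint(len(ACTIONS)) if np.random.rand() < eps else int(np.argmax(q[s]))
        angle, rate = step(angle, rate, ACTIONS[a])
        done = abs(angle) > 0.3                  # pendulum fell over
        r = -1.0 if done else 1.0                # reward for staying upright
        s2 = discretize(angle, rate)
        target = r if done else r + gamma * np.max(q[s2])
        q[s][a] += alpha * (target - q[s][a])    # Q-learning update
        s = s2
        if done:
            break
```

Training against the (cheap) simulated model in this way is what allows the learned Q-table to be transferred afterwards to the physical robot, which is the point the abstract makes about simulation fidelity.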
Abstract: This paper proposes a novel method for human hand tracking using data from an event camera. The event camera detects changes in brightness, measuring motion with low latency, no motion blur, low power consumption and high dynamic range. Captured frames are analysed using lightweight algorithms that report 3D hand position data. The chosen pick-and-place scenario serves as an example input for collaborative human-robot interactions and for obstacle avoidance in human-robot safety applications. Event data are pre-processed into intensity frames. The regions of interest (ROI) are defined through object edge event activity, reducing noise. ROI features are extracted for use in depth perception. Event-based tracking of the human hand proved feasible in real time and at a low computational cost. The proposed ROI-finding method reduces noise from intensity images, achieving up to 89% data reduction relative to the original while preserving the features. The depth estimation error in relation to ground truth (measured with wearables), evaluated using dynamic time warping and a single event camera, is from 15 to 30 millimetres, depending on the plane in which it is measured. The approach enables tracking of human hands in 3D space using data from a single event camera and lightweight algorithms to define ROI features.
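A minimal Python sketch of the ROI-finding idea described above, assuming events have already been timestamped as (x, y, polarity, t) tuples and are accumulated into an activity frame; the threshold, margin and synthetic events below are illustrative placeholders, not the exact pipeline or parameters of the paper.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate (x, y, polarity, t) events into a per-pixel activity frame."""
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, p, t in events:
        frame[y, x] += 1.0              # count events regardless of polarity
    return frame

def find_roi(frame, activity_threshold=3.0, margin=8):
    """Return a bounding box (x0, y0, x1, y1) around the most active region."""
    mask = frame >= activity_threshold  # keep only strong edge activity (moving hand)
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                     # no significant motion in this frame
    x0 = max(int(xs.min()) - margin, 0)
    y0 = max(int(ys.min()) - margin, 0)
    x1 = min(int(xs.max()) + margin, frame.shape[1] - 1)
    y1 = min(int(ys.max()) + margin, frame.shape[0] - 1)
    return x0, y0, x1, y1

# Example: synthetic burst of events around a "hand" centred at pixel (120, 80)
rng = np.random.default_rng(0)
events = [(120 + rng.integers(-10, 10), 80 + rng.integers(-10, 10), 1, i) for i in range(500)]
frame = events_to_frame(events, height=180, width=240)
print(find_roi(frame))
```

Restricting further processing to this box is what yields the reported data reduction: pixels with little event activity (mostly noise) are discarded before feature extraction and depth estimation.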
Abstract: Force and proximity sensors are key in robotics, especially when applied in collaborative robots that interact physically or cognitively with humans in real unstructured environments. However, most existing sensors for use in robotics are limited by: 1) their scope, measuring single parameters/events and often requiring multiple types of sensors; 2) their manufacturing cost, which restricts their use to where they are strictly necessary and often compromises redundancy; and 3) their null or reduced physical flexibility, which incurs further costs for adaptation to a variety of robot structures. This paper presents a novel mechanically flexible force and proximity hybrid sensor based on piezoresistive and self-capacitive phenomena. The sensor is inexpensive and easy to apply even on complex-shaped robot structures. The manufacturing process is described, including controlling circuits, mechanical design, and data acquisition. Experimental trials featuring the characterisation of the sensor were conducted, focusing on both the force-electrical resistance and the self-capacitive proximity response. The sensor's versatility, flexibility, thinness (1 mm thickness), accuracy (reduced drift) and repeatability demonstrate its applicability in several domains. Finally, the sensor was successfully applied in two distinct situations: hand guiding a robot (by touch commands), and human-robot collision avoidance (by proximity detection).
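A minimal Python sketch of how characterisation data for such a hybrid sensor could be used at run time, assuming a hypothetical power-law fit of the force-resistance response and a baseline-comparison rule for the self-capacitive channel; the coefficients, units and thresholds are placeholders, not the values obtained in the experimental trials.

```python
# Hypothetical characterisation: piezoresistive response fitted as a power law
# R = a * F**b  =>  F = (R / a)**(1 / b). Coefficients below are placeholders.
A_COEFF, B_COEFF = 50e3, -0.8

def force_from_resistance(resistance_ohm: float) -> float:
    """Invert the fitted force-resistance curve to estimate the applied force (N)."""
    return (resistance_ohm / A_COEFF) ** (1.0 / B_COEFF)

def proximity_detected(capacitance_pf: float, baseline_pf: float, threshold_pf: float = 0.5) -> bool:
    """Self-capacitance rises as a hand approaches: compare against a calibrated baseline."""
    return (capacitance_pf - baseline_pf) > threshold_pf

# Example readings (illustrative values only)
print(f"Estimated force: {force_from_resistance(20e3):.2f} N")
print("Proximity event:", proximity_detected(capacitance_pf=12.1, baseline_pf=11.4))
```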
Abstract: Human-robot collision avoidance is key in collaborative robotics and in the framework of Industry 4.0. It plays an important role in achieving safety criteria when humans and machines work side by side in unstructured and time-varying environments. This study brings on-line manipulator collision avoidance into a real industrial application, implementing typical sensors and a commonly used collaborative industrial manipulator, the KUKA iiwa. In the proposed methodology, the human co-worker and the robot are represented by geometric primitives (capsules). The minimum distance and relative velocity between them are calculated and, when humans/obstacles are nearby, the concept of hypothetical repulsion and attraction vectors is used. By coupling this concept with a mathematical representation of the robot's kinematics, task-level control with collision avoidance capability is achieved. Consequently, the off-line generated nominal path of the industrial task is modified on the fly so that the robot avoids collisions with the co-worker safely while still fulfilling the industrial operation. To guarantee motion continuity when switching between different tasks, the notion of repulsion-vector reshaping is introduced. Tests on an assembly robotic cell in the automotive industry show that the robot moves smoothly and avoids collisions successfully by adjusting the off-line generated nominal paths.
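A minimal Python sketch of the geometric core described above: the minimum distance between two capsules reduces to the minimum distance between their axis segments minus the two radii, and a hypothetical repulsion vector grows as the human capsule gets closer. The segment-segment routine follows the standard formulation from Ericson's Real-Time Collision Detection; the activation distance, gain and capsule dimensions are illustrative, not the values used in the industrial cell.

```python
import numpy as np

def segment_segment_distance(p1, q1, p2, q2, eps=1e-12):
    """Minimum distance (and closest points) between segments [p1,q1] and [p2,q2]."""
    d1, d2, r = q1 - p1, q2 - p2, p1 - p2
    a, e, f = d1 @ d1, d2 @ d2, d2 @ r
    if a <= eps and e <= eps:                       # both segments degenerate to points
        return np.linalg.norm(r), p1, p2
    if a <= eps:
        s, t = 0.0, np.clip(f / e, 0.0, 1.0)
    else:
        c = d1 @ r
        if e <= eps:
            t, s = 0.0, np.clip(-c / a, 0.0, 1.0)
        else:
            b = d1 @ d2
            denom = a * e - b * b
            s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > eps else 0.0
            t = (b * s + f) / e
            if t < 0.0:
                t, s = 0.0, np.clip(-c / a, 0.0, 1.0)
            elif t > 1.0:
                t, s = 1.0, np.clip((b - c) / a, 0.0, 1.0)
    c1, c2 = p1 + s * d1, p2 + t * d2
    return np.linalg.norm(c1 - c2), c1, c2

def repulsion_vector(robot_capsule, human_capsule, activation_dist=0.5, gain=1.0):
    """Hypothetical repulsion vector acting on the robot, scaled by proximity."""
    (p1, q1, r1), (p2, q2, r2) = robot_capsule, human_capsule
    dist, c_robot, c_human = segment_segment_distance(p1, q1, p2, q2)
    dist -= (r1 + r2)                               # capsule surface-to-surface distance
    if dist >= activation_dist:
        return np.zeros(3)                          # human far enough away: no repulsion
    direction = c_robot - c_human
    direction /= np.linalg.norm(direction)
    return gain * (activation_dist - dist) * direction

# Example: robot forearm capsule vs. human arm capsule (metres, illustrative values)
robot = (np.array([0.0, 0.0, 0.5]), np.array([0.0, 0.0, 1.0]), 0.07)
human = (np.array([0.3, 0.1, 0.8]), np.array([0.6, 0.1, 0.8]), 0.06)
print(repulsion_vector(robot, human))
```

In a scheme of this kind, the resulting vector would be mapped through the robot's kinematics into a joint-space correction that deforms the nominal path on the fly.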
Abstract: Collaborative robots are increasingly present in our lives. The KUKA LBR iiwa equipped with the KUKA Sunrise.OS controller is a good example of a collaborative/sensitive robot. This paper presents a MATLAB Toolbox, the KUKA Sunrise Toolbox (KST), to interface KUKA Sunrise.OS from MATLAB. The KST contains functionalities for networking, real-time control, point-to-point motion, setters and getters of parameters, and physical interaction. The KST includes more than 50 functions and runs on a remote computer connected to the KUKA Sunrise controller via Transmission Control Protocol/Internet Protocol (TCP/IP). The KST's potential is demonstrated in three use cases.
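A minimal Python sketch of the remote-computer-to-controller pattern described above (client application on one machine, robot controller reached over TCP/IP). The KST itself is a MATLAB toolbox with its own command protocol, so the IP address, port and command string below are hypothetical placeholders used only to illustrate the architecture, not the KST API.

```python
import socket

# Placeholder address and port for the robot controller; not the KST protocol.
CONTROLLER_IP = "192.168.1.10"
CONTROLLER_PORT = 30001

def send_command(command: str) -> str:
    """Open a TCP connection to the controller, send one command line, return the reply."""
    with socket.create_connection((CONTROLLER_IP, CONTROLLER_PORT), timeout=5.0) as sock:
        sock.sendall(command.encode("ascii") + b"\n")
        return sock.recv(1024).decode("ascii").strip()

if __name__ == "__main__":
    # Hypothetical point-to-point motion request (seven joint angles in radians).
    reply = send_command("movePTP 0.0 0.3 0.0 -1.2 0.0 1.5 0.0")
    print("Controller replied:", reply)
```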