Abstract: As the manufacturing industry shifts from mass production to mass customization, there is a growing emphasis on adopting agile, resilient, and human-centric methodologies in line with the directives of Industry 5.0. Central to this transformation is the deployment of digital twins, a technology that digitally replicates manufacturing assets to enable enhanced process optimization, predictive maintenance, synthetic data generation, and accelerated customization and prototyping. This chapter delves into the technologies underpinning the creation of digital twins specifically tailored to agile manufacturing scenarios within the realm of robotic automation. It explores the transfer of trained policies and process optimizations from simulated settings to real-world applications through advanced techniques such as domain randomization, domain adaptation, curriculum learning, and model-based system identification. The chapter also examines various industrial manufacturing automation scenarios, including bin-picking, part inspection, and product assembly, under Sim2Real conditions. The performance of digital twin technologies in these scenarios is evaluated using practical metrics, including data latency, adaptation rate, and simulation fidelity, providing a comprehensive assessment of their efficacy and potential impact on modern manufacturing processes.
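To make the Sim2Real techniques named above concrete, the following is a minimal, illustrative sketch of domain randomization: simulation parameters are resampled from broad ranges before each training episode so that a policy trained across many such draws is more likely to transfer to the physical cell. The parameter names, ranges, and the `SimParams` / `randomize_domain` helpers are hypothetical and not taken from the chapter.

```python
import random
from dataclasses import dataclass


@dataclass
class SimParams:
    """Physical and visual parameters of a hypothetical simulated robotic cell."""
    friction: float
    object_mass_kg: float
    camera_noise_std: float
    light_intensity: float


def randomize_domain(rng: random.Random) -> SimParams:
    """Sample a fresh set of simulation parameters from broad uniform ranges
    (domain randomization); the ranges here are placeholders for illustration."""
    return SimParams(
        friction=rng.uniform(0.2, 1.2),
        object_mass_kg=rng.uniform(0.05, 0.8),
        camera_noise_std=rng.uniform(0.0, 0.03),
        light_intensity=rng.uniform(0.5, 1.5),
    )


if __name__ == "__main__":
    rng = random.Random(42)
    for episode in range(3):
        params = randomize_domain(rng)
        # In an actual pipeline these parameters would be pushed into the
        # simulator before each training episode; here we only print them.
        print(f"episode {episode}: {params}")
```

In practice the same resampling idea extends to sensor latency, actuator delays, and visual textures, which is how the chapter's Sim2Real scenarios (bin-picking, inspection, assembly) typically narrow the reality gap.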
Abstract: Designing an efficient and resilient human-robot collaboration strategy that not only upholds the safety and ergonomics of the shared workspace but also enhances the performance and agility of the collaborative setup presents significant challenges for environment perception and robot control. In this research, we introduce a novel approach to collaborative environment monitoring and robot motion regulation that addresses this multifaceted problem. Our study proposes a novel computation and division of safety monitoring zones, adhering to the ISO 13855 and ISO/TS 15066 standards, utilizing 2D laser scanner information. These zones are not only configured in the standard three-layer arrangement but are also expanded into two adjacent quadrants, thereby enhancing system uptime and preventing unnecessary deadlocks. Moreover, we leverage 3D visual information to track dynamic human articulations and extended intrusions. Drawing upon the fused sensory data from the 2D and 3D perceptual spaces, our proposed hierarchical controller stably regulates robot velocity, with stability validated using the LaSalle invariance principle. Empirical evaluations demonstrate that our approach significantly reduces task execution time and system response delay, resulting in improved efficiency and resilience in collaborative settings.
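As background for the zone computation referenced above, the sketch below shows the basic ISO 13855 minimum-distance relation, S = K * T + C, on which such safety-zone layouts are commonly built. The function name, the numeric example values, and the concentric zone layout in the comments are illustrative assumptions, not the authors' actual computation; real deployments must take K, T, and C from measured stopping performance and the applicable clauses of ISO 13855 and ISO/TS 15066.

```python
def minimum_protective_distance(K_mm_s: float, stop_time_s: float, C_mm: float) -> float:
    """ISO 13855 minimum separation distance S = K * T + C, where
    K: assumed approach speed of the human body in mm/s (e.g. 1600 mm/s for walking approach),
    T: overall stopping time in seconds (sensor response + controller + robot stopping),
    C: intrusion allowance in mm, taken from the relevant clause of the standard."""
    return K_mm_s * stop_time_s + C_mm


if __name__ == "__main__":
    # Illustrative values only; they do not correspond to a specific robot or scanner.
    S = minimum_protective_distance(K_mm_s=1600.0, stop_time_s=0.45, C_mm=850.0)
    print(f"Minimum protective separation distance: {S:.0f} mm")
    # A layered monitoring scheme could then place, for example, an outer
    # slow-down band beyond S and an inner stop band at S; this particular
    # three-layer layout is hypothetical and only meant to illustrate the idea.
```

Extending such bands into adjacent quadrants, as the abstract describes, lets the robot keep operating on the side of the workspace away from the detected intrusion instead of halting the whole cell.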