Abstract: The use of autonomous systems in medical evacuation (MEDEVAC) scenarios is promising, but existing implementations overlook key insights from human-robot interaction (HRI) research. Studies on human-machine teams demonstrate that human perceptions of a machine teammate are critical in governing the machine's performance. Here, we present a mixed factorial design to assess human perceptions of a MEDEVAC robot in a simulated evacuation scenario. Participants were assigned to the role of casualty (CAS) or bystander (BYS) and experienced three within-subjects conditions based on the MEDEVAC robot's operating mode: autonomous-slow (AS), autonomous-fast (AF), and teleoperation (TO). During each trial, the MEDEVAC robot navigated an 11-meter path, acquiring a casualty and transporting them to an ambulance exchange point while avoiding an idle bystander. Following each trial, subjects completed a questionnaire measuring their emotional states, perceived safety, and social compatibility with the robot. Results indicate a consistent main effect of operating mode on reported emotional states and perceived safety. Pairwise analyses suggest that the AF operating mode negatively impacted perceptions along these dimensions. There were no persistent differences between casualty and bystander responses.
Abstract: A method for finding and classifying components and objects in a design diagram, drawing, or planning layout is proposed. The method automatically detects the objects listed in a legend table and determines their positions, counts, and related information with the help of multiple deep neural networks. It is pre-trained on several drawings or design templates to learn a feature set that helps represent new templates; for a previously unseen template, no training on a template-specific dataset is required. The proposed method may be useful in multiple industry applications, such as design validation, object counting, and checking the connectivity of components. The method is generic and domain-independent.
Abstract: This paper proposes a framework for measuring key metrics (throughput, delay, packet retransmissions, signal strength, etc.) to characterize the Wi-Fi network performance of mobile robots supported by the Robot Operating System (ROS) middleware. We analyze the bidirectional network performance of mobile robots and connected vehicles through an experimental setup in which a mobile robot streams vital sensor data, such as camera video and LiDAR scan values, to a command station while it navigates an indoor environment via teleoperated velocity commands received from that station. The experiments evaluate performance on 2.4 GHz and 5 GHz channels with different placements of Access Points (APs) and up to two network devices on each side. The discussions and insights from this study apply to general vehicular networks and to the field robotics community, where the wireless network plays a key role in enabling the success of robotic missions.