Abstract: Industrial robots are becoming increasingly prevalent, creating a growing need for intuitive, comfortable human-robot collaboration. We present a user-aware robotic system that adapts to operator behavior in real time while non-intrusively monitoring physiological signals to create a more responsive and empathetic environment. Our prototype dynamically adjusts robot speed and movement patterns while measuring operator pupil dilation and proximity. Our user study compares this adaptive system to a non-adaptive counterpart and demonstrates that the adaptive system significantly reduces both perceived and physiologically measured cognitive load while enhancing usability. Participants reported increased feelings of comfort, safety, and trust, and a stronger sense of collaboration when working with the adaptive robot. These results highlight the potential of integrating real-time physiological data into human-robot interaction paradigms: the approach enables more intuitive and collaborative industrial environments in which robots effectively 'read' and respond to human cognitive states. We release all data and code for future use.
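To make the adaptation mechanism concrete, the following minimal Python sketch shows one way such a control loop could couple pupil dilation and operator proximity to robot speed. All names (read_pupil_diameter, read_operator_distance, adapted_speed), thresholds, and scaling rules are hypothetical illustrations, not the authors' implementation.

```python
# Minimal sketch of an adaptive speed-control loop (hypothetical, not the
# authors' implementation): slow the robot down as cognitive load (pupil
# dilation) rises or as the operator moves inside a comfort distance.
import random
import time

BASELINE_PUPIL_MM = 3.5   # hypothetical resting pupil diameter (mm)
SAFE_DISTANCE_M = 1.2     # hypothetical comfort distance (m)

def read_pupil_diameter() -> float:
    """Stub eye tracker: returns pupil diameter in mm."""
    return random.uniform(3.0, 5.0)

def read_operator_distance() -> float:
    """Stub proximity sensor: returns operator distance in meters."""
    return random.uniform(0.5, 2.0)

def adapted_speed(pupil_mm: float, distance_m: float, max_speed: float = 1.0) -> float:
    """Scale speed down when the pupil is dilated beyond baseline
    or the operator is closer than the comfort distance."""
    load_factor = min(1.0, BASELINE_PUPIL_MM / pupil_mm)       # more dilation -> slower
    proximity_factor = min(1.0, distance_m / SAFE_DISTANCE_M)  # closer -> slower
    return max_speed * load_factor * proximity_factor

if __name__ == "__main__":
    for _ in range(5):
        speed = adapted_speed(read_pupil_diameter(), read_operator_distance())
        print(f"commanded speed: {speed:.2f} (fraction of maximum)")
        time.sleep(0.1)
```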
Abstract: Electromechanical systems manage physical processes through a network of interconnected components. Today, programming the interactions required to coordinate these components is largely a manual process. This process is time-consuming and requires manual adaptation when system features change. To overcome this issue, we use autonomous software agents that process semantic descriptions of the system to determine coordination requirements and constraints; on this basis, they then interact with one another to control the system in a decentralized and coordinated manner. Our core insight is that coordination requirements between individual components are ultimately due largely to underlying physical interdependencies between the components, which can be (and, in many cases, already are) semantically modeled in automation projects. Agents then use hypermedia to discover, at run time, the plans and protocols required for enacting the coordination. A key novelty of our approach is this hypermedia-driven interaction: it reduces coupling in the system and enables run-time adaptation as features change.
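As an illustration of the core insight, the toy Python sketch below derives a coordination order from a hypothetical semantic model of physical interdependencies, then discovers each component's operations through hypermedia-style links rather than hard-coded endpoints. The component names, link structure, and ordering logic are assumptions for illustration only, not the paper's system.

```python
# Toy sketch (illustrative only): coordination constraints derived from a
# semantic model of physical interdependencies, with operations discovered
# via hypermedia-style links at run time.
SEMANTIC_MODEL = {
    # component -> components it physically depends on (hypothetical plant)
    "conveyor": ["gripper"],  # conveyor must not move while the gripper holds a part
    "gripper": [],
}

HYPERMEDIA = {
    # each "resource" advertises links to the operations it affords
    "conveyor": {"start": "/conveyor/start", "stop": "/conveyor/stop"},
    "gripper": {"open": "/gripper/open", "close": "/gripper/close"},
}

def coordination_order(model: dict) -> list:
    """Topologically order components so that dependencies act first."""
    ordered, seen = [], set()

    def visit(component: str) -> None:
        if component in seen:
            return
        seen.add(component)
        for dep in model[component]:
            visit(dep)
        ordered.append(component)

    for component in model:
        visit(component)
    return ordered

if __name__ == "__main__":
    for component in coordination_order(SEMANTIC_MODEL):
        # discover affordances at run time instead of hard-coding endpoints
        links = HYPERMEDIA[component]
        print(f"{component}: discovered operations {list(links)}")
```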
Abstract: Hypermedia APIs enable the design of reusable hypermedia clients that discover and exploit affordances on the Web. However, the reusability of such clients remains limited since they cannot plan and reason about interaction. This paper provides a conceptual bridge between hypermedia-driven affordance exploitation on the Web and the methods for representing and reasoning about actions that have been extensively explored in Multi-Agent Systems (MAS) and, more broadly, Artificial Intelligence. We build on concepts and methods from Affordance Theory and Human-Computer Interaction that support interaction efficiency in open and evolvable environments to introduce signifiers as a first-class abstraction in Web-based MAS: signifiers are designed with respect to the agent-environment context of their usage and enable agents with heterogeneous abilities to act and to reason about action. We define a formal model for the contextual exposure of signifiers in hypermedia environments that aims to drive affordance exploitation. We demonstrate our approach with a prototypical Web-based MAS in which two agents with different reasoning abilities proactively discover how to interact with their environment by perceiving only the signifiers that fit their abilities. We show that signifier exposure can be managed dynamically based on the agent-environment context, facilitating effective and efficient interactions on the Web.
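The following toy Python sketch illustrates the idea of contextual signifier exposure: each signifier declares the abilities an agent needs to exploit the underlying affordance, and the environment exposes only the signifiers matching a given agent's ability profile. All names and ability tags (e.g. "pddl-planning") are hypothetical and do not reflect the paper's formal model.

```python
# Toy sketch of contextual signifier exposure (names are illustrative, not
# taken from the paper's formal model).
from dataclasses import dataclass

@dataclass(frozen=True)
class Signifier:
    affordance: str
    required_abilities: frozenset  # abilities needed to exploit the affordance

@dataclass
class Agent:
    name: str
    abilities: frozenset

SIGNIFIERS = [
    Signifier("turnOnLamp", frozenset({"http-request"})),
    Signifier("planManipulation", frozenset({"http-request", "pddl-planning"})),
]

def expose(signifiers: list, agent: Agent) -> list:
    """Expose only the signifiers this agent is able to exploit."""
    return [s for s in signifiers if s.required_abilities <= agent.abilities]

if __name__ == "__main__":
    simple = Agent("reactive-agent", frozenset({"http-request"}))
    planner = Agent("planning-agent", frozenset({"http-request", "pddl-planning"}))
    for agent in (simple, planner):
        visible = [s.affordance for s in expose(SIGNIFIERS, agent)]
        print(f"{agent.name} perceives: {visible}")
```

In this sketch the reactive agent perceives only the signifier it can act on, while the planning agent perceives both; exposure is thus driven by the agent-environment context rather than by a fixed interface.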