Abstract: An end-to-end hardware-software pipeline is introduced to automate ergonomics assessment in industrial workplaces. The proposed modular solution can interoperate with commercial systems throughout all phases of the ergonomics assessment process. The pipeline includes custom-designed Inertial Measurement Unit (IMU) sensors, two real-time worker movement acquisition tools, inverse kinematics processing, and Rapid Upper Limb Assessment (RULA) report generation. It is based on free tools such as Unity3D and OpenSim to avoid the problems derived from using proprietary technologies, such as security decisions being made under "black box" conditions. Experiments were conducted in an automotive factory, in a workplace with risk of work-related musculoskeletal disorders (WMSDs) among workers. The proposed solution obtained results comparable to a gold-standard solution: measured joint angles reached a 0.95 cross-correlation between both systems, with a Root Mean Square Error (RMSE) below 10° for the elbows and 12° for the shoulders. In addition, the global RULA score difference between both systems is below 5%. This work provides a low-cost solution for WMSD risk assessment in the workplace, helping to reduce musculoskeletal disorders and associated sick leave in industry and benefiting workers' health in the long term. Our study can ease further research and popularize the use of wearable systems for ergonomics analysis, allowing these workplace prevention systems to reach diverse industrial environments.
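As a rough illustration of the agreement metrics reported above, the following minimal sketch computes a zero-lag normalized cross-correlation and an RMSE between two time-aligned joint-angle series, one per system. The array names, synthetic traces, and the assumption that both series are already synchronized and sampled identically are illustrative only; the pipeline's actual comparison code is not described here.

```python
import numpy as np

def compare_joint_angles(angles_a, angles_b):
    """Compare two time-aligned joint-angle series (degrees) from two systems.

    Returns the zero-lag normalized cross-correlation and the RMSE.
    Assumes both series share the same sampling rate and length.
    """
    a = np.asarray(angles_a, dtype=float)
    b = np.asarray(angles_b, dtype=float)

    # Normalized (Pearson-style) cross-correlation at zero lag.
    a_c = a - a.mean()
    b_c = b - b.mean()
    xcorr = float(np.dot(a_c, b_c) / (np.linalg.norm(a_c) * np.linalg.norm(b_c)))

    # Root Mean Square Error between the two measurements.
    rmse = float(np.sqrt(np.mean((a - b) ** 2)))
    return xcorr, rmse

# Example with synthetic elbow-flexion traces (degrees); values are illustrative.
t = np.linspace(0, 10, 500)
custom_elbow = 45 + 30 * np.sin(t)                                   # custom IMU pipeline
reference_elbow = 45 + 30 * np.sin(t) + np.random.normal(0, 3, t.size)  # gold standard
print(compare_joint_angles(custom_elbow, reference_elbow))
```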
Abstract: This study presents ME-WARD (Multimodal Ergonomic Workplace Assessment and Risk from Data), a novel system for ergonomic assessment and musculoskeletal risk evaluation that implements the Rapid Upper Limb Assessment (RULA) method. ME-WARD is designed to process joint angle data from motion capture systems, including inertial measurement unit (IMU)-based setups and deep learning human body pose tracking models. The tool's flexibility enables ergonomic risk assessment using any system capable of reliably measuring joint angles, extending the applicability of RULA beyond proprietary setups. To validate its performance, the tool was tested in an industrial setting during the assembly of conveyor belts, which involved high-risk tasks such as inserting rods and pushing conveyor belt components. The experiments leveraged gold-standard IMU systems alongside a state-of-the-art monocular 3D pose estimation system. The results confirmed that ME-WARD produces reliable RULA scores that closely align with IMU-derived metrics for flexion-dominated movements, and achieves comparable performance with the monocular system despite its limitations in tracking lateral and rotational motions. This work highlights the potential of integrating multiple motion capture technologies into a unified and accessible ergonomic assessment pipeline. By supporting diverse input sources, including low-cost video-based systems, the proposed multimodal approach offers a scalable, cost-effective solution for ergonomic assessments, paving the way for broader adoption in resource-constrained industrial environments.
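To make the scoring stage concrete, the sketch below shows how one RULA sub-score (the upper-arm score) could be derived from a shoulder flexion/extension angle, following the thresholds of the published RULA worksheet. This is only an illustrative sketch under those standard thresholds; ME-WARD's actual scoring logic, remaining sub-scores, and table lookups are not reproduced here.

```python
def rula_upper_arm_score(flexion_deg, shoulder_raised=False,
                         arm_abducted=False, arm_supported=False):
    """Base RULA upper-arm score from shoulder flexion(+)/extension(-) in degrees.

    Thresholds follow the published RULA worksheet; the boolean flags are the
    standard +1/-1 modifiers. Other RULA steps (lower arm, wrist, trunk,
    tables A/B/C) are omitted in this sketch.
    """
    if -20 <= flexion_deg <= 20:
        score = 1                                  # 20° extension to 20° flexion
    elif flexion_deg < -20 or flexion_deg <= 45:
        score = 2                                  # >20° extension or 20°-45° flexion
    elif flexion_deg <= 90:
        score = 3                                  # 45°-90° flexion
    else:
        score = 4                                  # >90° flexion

    score += 1 if shoulder_raised else 0
    score += 1 if arm_abducted else 0
    score -= 1 if arm_supported else 0             # arm supported or person leaning
    return max(score, 1)                           # clamp to the worksheet minimum

print(rula_upper_arm_score(60.0, arm_abducted=True))  # -> 4
```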
Abstract: Advances in machine learning and wearable sensors offer new opportunities for capturing and analyzing human movement outside specialized laboratories. Accurate assessment of human movement under real-world conditions is essential for telemedicine, sports science, and rehabilitation. This preclinical benchmark compares monocular video-based 3D human pose estimation models with inertial measurement units (IMUs), leveraging the VIDIMU dataset, which contains 13 clinically relevant daily activities captured using both commodity video cameras and five IMUs. In this initial study only healthy subjects were recorded, so the results cannot be generalized to pathological cohorts. Joint angles derived from state-of-the-art deep learning frameworks (MotionAGFormer, MotionBERT, MMPose 2D-to-3D pose lifting, and NVIDIA BodyTrack), following the Human3.6M 17-keypoint format, were evaluated against joint angles computed from IMU data using OpenSim inverse kinematics. Among them, MotionAGFormer demonstrated superior performance, achieving the lowest overall RMSE ($9.27^{\circ} \pm 4.80^{\circ}$) and MAE ($7.86^{\circ} \pm 4.18^{\circ}$), as well as the highest Pearson correlation ($0.86 \pm 0.15$) and the highest coefficient of determination $R^{2}$ ($0.67 \pm 0.28$). The results reveal that both technologies are viable for out-of-the-lab kinematic assessment. However, they also highlight key trade-offs between video- and sensor-based approaches, including cost, accessibility, and precision. This study clarifies where off-the-shelf video models already provide clinically promising kinematics in healthy adults and where they lag behind IMU-based estimates, while establishing valuable guidelines for researchers and clinicians seeking to develop robust, cost-effective, and user-friendly solutions for telehealth and remote patient monitoring.
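For context on how a joint angle can be obtained from the video-based models' output, the sketch below computes a right elbow flexion angle from three 3D keypoints (shoulder, elbow, wrist) as the angle between the upper-arm and forearm vectors. The keypoint indices correspond to a commonly used Human3.6M 17-joint ordering and are assumptions for illustration; the benchmark's exact joint-angle definitions and conventions may differ.

```python
import numpy as np

# Assumed Human3.6M-style 17-joint indices (illustrative; verify against the
# specific model's output convention before use).
R_SHOULDER, R_ELBOW, R_WRIST = 14, 15, 16

def elbow_flexion_deg(pose_3d):
    """Right elbow angle (degrees) from a (17, 3) array of 3D keypoints.

    180° corresponds to a fully extended arm; smaller values mean more flexion.
    """
    shoulder = pose_3d[R_SHOULDER]
    elbow = pose_3d[R_ELBOW]
    wrist = pose_3d[R_WRIST]

    upper_arm = shoulder - elbow   # vector from elbow to shoulder
    forearm = wrist - elbow        # vector from elbow to wrist

    cos_angle = np.dot(upper_arm, forearm) / (
        np.linalg.norm(upper_arm) * np.linalg.norm(forearm))
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

# Example with a random pose, just to exercise the function.
print(elbow_flexion_deg(np.random.rand(17, 3)))
```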
Abstract: Recent studies confirm the applicability of Inertial Measurement Unit (IMU)-based systems for human motion analysis. Nevertheless, high-end IMU-based commercial solutions are still too expensive and complex to democratize their use among a wide range of potential users. Entry-level commercial solutions with fewer features are being introduced to the market to fill this gap, but they still present limitations that need to be overcome. At the same time, a growing number of scientific papers use custom do-it-yourself IMU-based systems, rather than commercial ones, in medical and sports applications. Even though these solutions can help to popularize the use of this technology, they offer more limited features, and descriptions of how to design and build them from scratch are still scarce in the literature. The aim of this work is two-fold: (1) to prove the feasibility of building an affordable custom solution for simultaneous orientation tracking of multiple body parts, while providing a detailed bottom-up description of the required hardware, tools, and mathematical operations to estimate and represent 3D movement in real time; and (2) to show how the introduction of a custom 2.4 GHz communication protocol including a channel-hopping strategy can address some of the current communication limitations of entry-level commercial solutions. The proposed system can be used for wireless real-time orientation tracking of multiple human body parts with up to 10 custom sensors, at a rate of at least 50 Hz. In addition, it provides more reliable motion data acquisition in environments crowded with Bluetooth and Wi-Fi traffic, where the use of entry-level commercial solutions might be unfeasible. This system can be used as a groundwork for developing affordable human motion analysis solutions that do not require an accurate kinematic analysis.
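The channel-hopping idea in point (2) can be illustrated with a deterministic hop sequence over a shared list of 2.4 GHz channels that both the receiver and the sensors can reproduce from a common slot counter. The channel list, hop period, and seeding below are hypothetical values for illustration only; the actual protocol, channel map, and synchronization scheme used in this work are not reproduced here.

```python
# Hypothetical sketch of a deterministic 2.4 GHz channel-hopping sequence.
# The real protocol's channel map, hop period, and synchronization scheme are
# not reproduced here; all values below are illustrative assumptions.

HOP_CHANNELS = [3, 28, 53, 78]   # example RF channel numbers spread across the 2.4 GHz band
HOP_PERIOD_MS = 20               # assumed hop period matching a 50 Hz update rate

def channel_for_slot(slot_index, network_seed=0):
    """Return the RF channel for a given time slot.

    All nodes sharing the same seed and slot counter hop in lockstep, so the
    receiver can follow the sensors without per-hop negotiation.
    """
    return HOP_CHANNELS[(slot_index + network_seed) % len(HOP_CHANNELS)]

# Example: the first five hops of the shared sequence.
print([channel_for_slot(s) for s in range(5)])  # -> [3, 28, 53, 78, 3]
```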
Abstract: Human activity recognition and clinical biomechanics are challenging problems in physical telerehabilitation medicine. However, most publicly available datasets on human body movements cannot be used to study both problems in an out-of-the-lab movement acquisition setting. The objective of the VIDIMU dataset is to pave the way towards affordable patient tracking solutions for the remote recognition and kinematic analysis of daily life activities. The dataset includes 13 activities recorded using a commodity camera and five inertial sensors. The video recordings were acquired from 54 subjects, of whom 16 also had simultaneous recordings of the inertial sensors. The novelty of VIDIMU lies in: i) the clinical relevance of the chosen movements, ii) the combined utilization of affordable video and custom sensors, and iii) the implementation of state-of-the-art tools for multimodal data processing, covering 3D body pose tracking from video and motion reconstruction in a musculoskeletal model from inertial data. The validation confirms that a minimally disturbing acquisition protocol, performed according to real-life conditions, can provide a comprehensive picture of human joint angles during daily life activities.
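When the two modalities are analyzed together, the video-derived and IMU-derived joint-angle streams must first be brought to a common time base. The sketch below resamples one stream onto the other's timestamps with linear interpolation; the sampling rates, variable names, and synthetic signals are illustrative assumptions, not a description of the dataset's actual processing scripts.

```python
import numpy as np

def resample_to_reference(src_time_s, src_angles_deg, ref_time_s):
    """Linearly interpolate a joint-angle stream onto reference timestamps.

    Useful to align, e.g., a video-derived angle stream with an IMU-derived
    one when their sampling rates differ (rates below are assumptions).
    """
    return np.interp(ref_time_s, src_time_s, src_angles_deg)

# Example with synthetic streams.
t_video = np.arange(0, 5, 1 / 30)                         # assumed 30 fps video
t_imu = np.arange(0, 5, 1 / 50)                           # assumed 50 Hz IMUs
video_knee = 20 + 40 * np.sin(2 * np.pi * 0.5 * t_video)  # synthetic knee angle (degrees)
video_knee_on_imu_grid = resample_to_reference(t_video, video_knee, t_imu)
print(video_knee_on_imu_grid.shape)  # -> (250,)
```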