Abstract: We present a system for decoding hand movements from surface EMG signals. The interface provides real-time (25 Hz) reconstruction of finger joint angles across 20 degrees of freedom and is designed for upper limb amputees. Our offline analysis shows a 0.8 correlation between predicted and actual hand movements. The system functions as an integrated pipeline with three key components: (1) a VR-based data collection platform, (2) a transformer-based model for EMG-to-motion transformation, and (3) a real-time calibration and feedback module called ALVI Interface. Using eight sEMG sensors and a VR training environment, users can control their virtual hand with finger-joint precision, as demonstrated in our video: youtube link.
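The abstract gives no implementation details, so the following is only a minimal sketch, assuming a PyTorch implementation, of the kind of transformer-based EMG-to-joint-angle mapping it describes: eight sEMG channels in, 20 joint angles out per 25 Hz frame. The class name EMGToAngles, the window length, and all layer sizes are illustrative assumptions, not the authors' model.

```python
# Hypothetical sketch (not the authors' code) of a transformer-based
# EMG decoder: an 8-channel sEMG window -> 20 finger joint angles per frame.
import torch
import torch.nn as nn

class EMGToAngles(nn.Module):
    def __init__(self, n_channels=8, n_joints=20, d_model=64, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)       # per-timestep channel embedding
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_joints)          # joint-angle regression head

    def forward(self, emg_window):
        # emg_window: (batch, time, channels) sEMG samples for one output frame
        h = self.encoder(self.embed(emg_window))
        return self.head(h[:, -1, :])                     # angles for the latest frame

model = EMGToAngles()
angles = model(torch.randn(1, 50, 8))   # e.g. a 50-sample window of 8-channel sEMG
print(angles.shape)                      # torch.Size([1, 20])
```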
Abstract: Motor brain-computer interface (BCI) development relies critically on neural time series decoding algorithms. Recent advances in deep learning architectures allow for automatic feature selection to approximate higher-order dependencies in data. This article presents the FingerFlex model, a convolutional encoder-decoder architecture adapted for finger movement regression on electrocorticographic (ECoG) brain data. State-of-the-art performance was achieved on the publicly available BCI Competition IV dataset 4, with a correlation coefficient between true and predicted trajectories of up to 0.74. The presented method provides the opportunity for developing fully functional, high-precision cortical motor brain-computer interfaces.
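As a concrete illustration of the reported metric, the sketch below computes the Pearson correlation between true and predicted finger trajectories per finger and averages it. The array shapes and function name are assumptions for illustration, not the authors' evaluation code; BCI Competition IV dataset 4 provides five finger flexion traces per subject.

```python
# Hypothetical sketch of the trajectory-correlation metric quoted above.
import numpy as np

def mean_trajectory_correlation(y_true, y_pred):
    # y_true, y_pred: (time, n_fingers) finger flexion trajectories
    corrs = [np.corrcoef(y_true[:, i], y_pred[:, i])[0, 1]
             for i in range(y_true.shape[1])]
    return float(np.mean(corrs))

# Example: a lightly noised copy of the true trajectories scores close to 1.0
t = np.linspace(0, 10, 1000)
true = np.stack([np.sin(t + i) for i in range(5)], axis=1)
pred = true + 0.1 * np.random.randn(*true.shape)
print(mean_trajectory_correlation(true, pred))
```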