Abstract: This work presents a novel rehabilitation framework designed for a therapist, wearing an inertial measurement unit (IMU) suit, to virtually interact with a lower-limb exoskeleton worn by a patient with motor impairments. This framework aims to harmonize the skills and knowledge of the therapist with the capabilities of the exoskeleton. The therapist can guide the patient's movements by moving their own joints and making real-time adjustments to meet the patient's needs, while reducing the therapist's physical effort. This eliminates the need for a predefined trajectory for the patient to follow, as in conventional robotic gait training. As the virtual interaction medium between the therapist and patient, we propose an impedance profile that is stiff at low frequencies and less stiff at high frequencies, and that can be tailored to individual patient needs and different stages of rehabilitation. The desired interaction torque from this medium is commanded to a whole-exoskeleton closed-loop compensation controller. The proposed virtual interaction framework was evaluated with a pair of unimpaired individuals in different teacher-student gait training exercises. Results show that the proposed interaction control effectively transmits haptic cues, informing future applications in rehabilitation scenarios.
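To make the "stiff at low frequencies, less stiff at high frequencies" property concrete, one plausible first-order realization is sketched below. This is an illustrative form only, not the authors' exact profile; the gains K_l and K_h, the time constant tau, and the joint-angle symbols are assumed for illustration.

```latex
% Illustrative frequency-shaped virtual impedance (assumed form):
%   |Z(j\omega)| \to K_l as \omega \to 0 (stiff),  |Z(j\omega)| \to K_h as \omega \to \infty (compliant).
Z(s) = K_h + \frac{K_l - K_h}{\tau s + 1}, \qquad K_l > K_h > 0,
\qquad
\tau_{\mathrm{int}}(s) = Z(s)\,\bigl(q_{\mathrm{therapist}}(s) - q_{\mathrm{patient}}(s)\bigr)
```

Here tau_int denotes the interaction torque that would be commanded to the whole-exoskeleton closed-loop compensation controller, driven by the tracking error between the therapist's and the patient's joint angles.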
Abstract: In the control of lower-limb exoskeletons with feet, the phase of the gait cycle can be identified by monitoring the weight distribution at the feet. This phase information can be used in the exoskeleton's controller to compensate for the dynamics of the exoskeleton and to assign impedance parameters. Typically, the weight distribution is computed from sensors such as treadmill force plates or insole force sensors; however, these solutions increase both setup complexity and cost. For this reason, we propose a deep-learning approach that uses a short time window of joint kinematics to predict the weight distribution of an exoskeleton in real time. The model was trained on treadmill walking data from six users wearing a four-degree-of-freedom exoskeleton and tested in real time on three different users wearing the same device. This test set includes two users not present in the training set, demonstrating the model's ability to generalize across individuals. Results show that the proposed method fits the actual weight distribution with R² = 0.9 and is suitable for real-time control, with prediction times below 1 ms. Experiments in closed-loop exoskeleton control show that deep-learning-based weight distribution estimation can replace force sensors in overground and treadmill walking.
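As a minimal sketch of what such a kinematics-to-weight-distribution predictor could look like, the PyTorch snippet below maps a short window of joint angles and velocities to a bounded weight-distribution ratio. The window length, feature set, network architecture, and all identifiers are assumptions for illustration; the abstract does not specify the authors' actual model.

```python
# Illustrative sketch only: architecture, window length, and features are assumed,
# not taken from the paper.
import torch
import torch.nn as nn

WINDOW = 50                 # assumed: ~0.5 s of kinematics at 100 Hz
N_JOINTS = 4                # four-degree-of-freedom exoskeleton
N_FEATURES = 2 * N_JOINTS   # assumed: joint angles and joint velocities


class WeightDistributionNet(nn.Module):
    """Maps a window of joint kinematics to a weight-distribution ratio in [0, 1]."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        self.gru = nn.GRU(input_size=N_FEATURES, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, WINDOW, N_FEATURES)
        _, h = self.gru(x)                        # final hidden state summarizes the window
        return torch.sigmoid(self.head(h[-1])).squeeze(-1)  # bounded ratio per sample


model = WeightDistributionNet()
dummy_window = torch.randn(1, WINDOW, N_FEATURES)  # one buffered window of kinematics
print(model(dummy_window))                          # scalar weight-distribution estimate
```

In a real-time controller, the most recent kinematics window would be pushed through the network at every control tick; a small model of this size is consistent with sub-millisecond inference on typical control hardware.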