To this day, robotic rehabilitation has not met its promise: it has not yet revolutionized the rehabilitation of patients after stroke or spinal cord injury. One of the challenges hampering this goal is the control and communication interface between the human and the machine. Currently, commercial exoskeletons replay pre-defined gait patterns, while research exoskeletons replay optimized torque profiles or provide assistance proportional to electromyography (EMG) signals. In most cases, the dynamics of the human musculoskeletal system are ignored, simplified, or treated as a black box. The goal of this dissertation is to endow wearable robots with numerical representations of the human body, enabling robust and intuitive human control of wearable robots. To achieve this goal, this dissertation proposes a paradigm change in the control of wearable robots: moving away from pure robotic control, where the human is driven by the robotic device, toward a new paradigm where the human drives the robotic device.
This dissertation presents the development of a new human-machine interface (HMI) for the control of exoskeletons via a neuromusculoskeletal model driven in real time by EMG signals and joint positions recorded from the user to predict joint torques. These predicted joint torques are then used to assist the user via the exoskeleton.
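To make this data flow concrete, the following minimal sketch (in Python) shows how such a loop could be structured: EMG envelopes and joint angles drive a calibrated model that returns joint torque estimates, which are forwarded to the exoskeleton at a fixed control rate. All function names and model internals here are invented stand-ins, not the dissertation's implementation.

```python
# Minimal sketch of the real-time HMI loop. All I/O functions and the model
# stand-in below are hypothetical placeholders, not the actual implementation.
import time
import numpy as np

def read_emg() -> np.ndarray:
    """Placeholder for the EMG driver; returns normalized envelopes."""
    return np.random.rand(8)                     # 8 muscles, stub data

def read_joint_angles() -> np.ndarray:
    """Placeholder for encoder / motion-capture joint angles (rad)."""
    return np.zeros(3)                           # hip, knee, ankle, stub data

def send_torque_command(tau: np.ndarray) -> None:
    """Placeholder for the exoskeleton torque interface."""
    pass

def estimate_joint_torques(emg: np.ndarray, q: np.ndarray) -> np.ndarray:
    """Stand-in for the calibrated neuromusculoskeletal model:
    EMG -> activation -> muscle-tendon force -> joint torque."""
    moment_arms = np.full((8, 3), 0.03)          # stub geometry (m)
    forces = 1500.0 * emg                        # stub muscle forces (N)
    return moment_arms.T @ forces                # torque per joint (N*m)

LOOP_PERIOD = 0.005                              # 200 Hz, well inside the ~50 ms budget

for _ in range(1000):                            # fixed cycle count for the sketch
    t0 = time.perf_counter()
    tau_human = estimate_joint_torques(read_emg(), read_joint_angles())
    send_torque_command(tau_human)               # assistance shaping happens downstream
    time.sleep(max(0.0, LOOP_PERIOD - (time.perf_counter() - t0)))
```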
First, a real-time version of the HMI previously created by Sartori et al. was developed by incorporating a B-spline algorithm for real-time computation of muscle-tendon lengths and moment arms from joint positions. Furthermore, the computational efficiency of the HMI was increased so that the computation time was brought below the muscle electromechanical delay (i.e., < 50 ms). Further work created real-time inverse kinematics and inverse dynamics pipelines informed by experimentally recorded marker positions and ground reaction forces. The system was tested on five healthy subjects; results showed that the developed HMI could estimate muscle-tendon forces and joint torques online, with direct validation against inverse dynamics (the gold standard). Results also indicated that the HMI could extrapolate to new movements and new degrees of freedom that were not used to calibrate the model.
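As a simplified illustration of the B-spline step, the sketch below fits a one-dimensional cubic spline to invented muscle-tendon length samples (in practice, pre-computed from a scaled musculoskeletal model over a grid of joint angles) and recovers the moment arm as the negative derivative, following the tendon-excursion relation r(q) = -dLmt/dq; the dissertation's pipeline uses multidimensional B-splines across several degrees of freedom.

```python
# 1-D sketch of the B-spline evaluation of muscle-tendon (MTU) kinematics.
# Grid data are invented; in practice they would be sampled offline from a
# scaled musculoskeletal model.
import numpy as np
from scipy.interpolate import make_interp_spline

q_grid = np.linspace(0.0, 2.0, 21)               # knee flexion grid (rad)
lmt_grid = 0.42 - 0.04 * np.sin(q_grid)          # invented MTU lengths (m)

lmt_spline = make_interp_spline(q_grid, lmt_grid, k=3)  # cubic B-spline Lmt(q)
dlmt_dq = lmt_spline.derivative()                       # dLmt/dq

q = 0.8                                          # current joint angle (rad)
lmt = float(lmt_spline(q))                       # MTU length at q
r = -float(dlmt_dq(q))                           # moment arm (tendon excursion)
print(f"Lmt(q) = {lmt:.4f} m, r(q) = {r:.4f} m")
```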
Second, the developed HMI was employed to enable torque control of wearable exoskeletons. Tests were conducted with stroke and spinal cord injury patients performing seated rehabilitation motor tasks. Results demonstrated that the HMI translated human bioelectrical muscle activity into exoskeleton control commands, leading to reductions in both EMG amplitude and EMG variability across the patient group.
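A minimal sketch of a proportional assistance law of this kind follows; the gain and torque bound are illustrative assumptions, not the experimental values.

```python
# Sketch of a proportional assistance law: scale the model-predicted human
# joint torque and saturate it to a safe actuator bound. Values are invented.
import numpy as np

def assistance_command(tau_predicted: np.ndarray,
                       gain: float = 0.3,
                       tau_max: float = 15.0) -> np.ndarray:
    """Return the exoskeleton torque command for one control cycle."""
    return np.clip(gain * tau_predicted, -tau_max, tau_max)

tau_hat = np.array([12.0, -40.0, 5.0])   # predicted joint torques (N*m)
print(assistance_command(tau_hat))       # -> [ 3.6 -12.   1.5]
```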
Third, further tests were conducted on locomotion tasks of different modalities (speeds and/or elevations) with healthy users. This experiment demonstrated that positive assistance (i.e., reductions in EMG and biological joint torques) could be provided across different locomotion tasks as well as transitions between tasks. Results showed that the total (human + exoskeleton) joint torques remained similar between the assistive mode and the minimal-impedance (i.e., transparent) mode. This means that the provided assistance was fully integrated by the user: biological joint torques were lowered by the same amount as the torque received from the exoskeleton.
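The torque accounting behind this observation can be sketched as follows (numbers invented): in transparent mode the biological torque equals the total inverse-dynamics torque, whereas in assistive mode the biological share is the total minus the torque measured at the exoskeleton joint.

```python
# Illustrative torque accounting across the two exoskeleton modes.
import numpy as np

tau_total_transparent = np.array([45.0])   # ID torque, transparent mode (N*m)
tau_total_assist = np.array([44.0])        # ID torque, assistive mode (N*m)
tau_exo = np.array([12.0])                 # torque delivered by the exo (N*m)

tau_bio_transparent = tau_total_transparent    # exo contributes ~0 N*m
tau_bio_assist = tau_total_assist - tau_exo    # human share only

# Similar totals across modes combined with a lower biological torque under
# assistance indicate the user integrated the assistance rather than
# co-contracting against it.
print(tau_bio_transparent, tau_bio_assist)     # [45.] [32.]
```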
The development of this new HMI offers new possibilities for the control of robotic devices and opens new avenues in assistive wearable robotics.