Capturing hand function performance at home is essential to measuring the ultimate impact of novel interventions after stroke. Wearable technologies have been used to capture upper limb use ratios outside of clinical environments; however, wrist-worn devices measure arm use rather than hand use, and finger-worn devices may interfere with natural hand movement during activities of daily living. Therefore, an egocentric (first-person) camera was used to capture the natural hand movements of stroke survivors at home. Beyond hand use, the role of the affected hand in bimanual tasks, including stabilization and manipulation, provides distinct information about hand function performance and has not yet been investigated. Machine learning-based computer vision algorithms were used to detect hand-object interactions (hand use) and hand role after stroke from video. The three objectives of this thesis were to investigate the feasibility of, the validity of, and user perspectives on using egocentric video to capture the hand use and hand role of stroke survivors at home.
Two datasets of daily tasks recorded by 24 stroke survivors, at home and in a home simulation laboratory, were collected and used in the studies. Hand-object interaction detection at home was feasible using a pre-trained deep learning model, the Hand Object Detector. For hand role classification, other computer vision techniques that track finger movement may be helpful in the future. The hand use ratios measured from home-recorded videos showed correlations with the Motor Activity Log (MAL) approaching the strong range, demonstrating that egocentric video can capture hand function performance after stroke at home. Hand role ratios from the annotated bilateral tasks had very weak correlations with the MAL and negative correlations with the Fugl-Meyer Assessment – Upper Extremity and the Action Research Arm Test, suggesting that the hand role ratio may capture information about hand function that clinical upper limb assessments do not. Stroke survivors showed high acceptance of using the camera to record their daily routines at home and were willing to use it again for rehabilitation purposes. This thesis demonstrates the potential of using egocentric video with computer vision-based algorithms to measure hand function after stroke in the community.
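To illustrate the kind of outcome measure described above, the sketch below computes a hand use ratio from per-frame hand-object interaction predictions. It is not the thesis implementation: the frame-level data structure, the helper names, and the exact ratio definition (affected-hand interaction frames over total interaction frames) are assumptions introduced for illustration; in practice, the frame-level labels would come from a detector such as the Hand Object Detector applied to the egocentric video.

```python
# Minimal sketch (not the thesis implementation): a hand use ratio computed
# from hypothetical per-frame hand-object interaction predictions.

from dataclasses import dataclass
from typing import List


@dataclass
class FramePrediction:
    """Hypothetical per-frame output of a hand-object interaction detector."""
    affected_interacting: bool    # affected hand in contact with an object
    unaffected_interacting: bool  # unaffected hand in contact with an object


def hand_use_ratio(frames: List[FramePrediction]) -> float:
    """One plausible definition: affected-hand interaction frames divided by
    total interaction frames across both hands."""
    affected = sum(f.affected_interacting for f in frames)
    unaffected = sum(f.unaffected_interacting for f in frames)
    total = affected + unaffected
    return affected / total if total else 0.0


# Example: the affected hand interacts in 1 of 3 interaction frames overall.
frames = [
    FramePrediction(affected_interacting=True, unaffected_interacting=True),
    FramePrediction(affected_interacting=False, unaffected_interacting=True),
    FramePrediction(affected_interacting=False, unaffected_interacting=False),
]
print(f"Hand use ratio: {hand_use_ratio(frames):.2f}")
```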