This study evaluated the feasibility of using eye position information to control a rehabilitation robot. Many people with physical disabilities cannot use standard robotic interfaces, so a control method based on eye movements is a potential alternative.
A simple eye position interface was developed that uses electro-oculography to track the user's eye movements. The user's field of view was divided into zones, and commands were sent to the robot according to the zone in which the user's gaze rested. The interface incorporated safety features: a dwell period to prevent accidental activation of the robot, and automatic disengagement of the interface when the user was distracted.
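The zone-and-dwell scheme described above can be sketched in a few lines. This is an illustrative reconstruction, not the study's actual implementation: the zone layout, dwell threshold, zone names, and re-engagement rule are all assumptions.

```python
def zone_for(x, y):
    """Map a normalized gaze position (0..1, 0..1) to a command zone.
    The layout here is a hypothetical example."""
    if x < 0.33:
        return "LEFT"
    if x > 0.67:
        return "RIGHT"
    if y < 0.33:
        return "UP"
    if y > 0.67:
        return "DOWN"
    return "CENTER"


class DwellCommander:
    """Issue a command only after gaze dwells in one zone for several
    consecutive samples, and disengage when gaze leaves the display."""

    def __init__(self, dwell_samples=5):
        self.dwell_samples = dwell_samples
        self.current_zone = None
        self.count = 0
        self.engaged = True

    def update(self, zone):
        """Feed one gaze sample; returns a command string or None."""
        if zone is None:                 # gaze off-screen: auto-disengage
            self.engaged = False
            self.current_zone, self.count = None, 0
            return None
        if not self.engaged:             # require a deliberate re-engage
            if zone == "CENTER":         # (assumed re-engagement zone)
                self.engaged = True
            return None
        if zone != self.current_zone:    # gaze moved: restart dwell timer
            self.current_zone, self.count = zone, 1
            return None
        self.count += 1
        if self.count >= self.dwell_samples:
            self.count = 0               # fire once, then re-arm
            return zone
        return None
```

The dwell counter is what prevents a passing glance from moving the robot: a command fires only after the gaze has remained in one zone for the full dwell period, and looking away disengages the interface entirely.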
The interface was evaluated through a series of computer exercises and through exercises with a wheelchair-mounted robotic arm. Participants were able to control the movements of the MANUS manipulator using the eye position interface. The computer exercises revealed that the eye position interface required longer times to target command zones, and incurred more errors, than alternative hands-free or movement-free interfaces. However, more extensive practice with the interface may improve performance.
Results indicated that increasing the size of the command zones decreased the time required to target them, while the location of the zones within the field of view did not affect performance. Blink artifacts were found to increase error rates, and the filtering of blinks was identified as a necessary component of an eye position interface. The MANUS exercises also identified the need for some form of visual feedback.
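Blink rejection of the kind identified above is often done with a sliding median filter, since blinks appear in the vertical electro-oculogram as brief high-amplitude spikes while gaze shifts are slower and sustained. The sketch below assumes that approach; the window size and the synthetic signal are illustrative, not taken from the study.

```python
def median_filter(signal, window=5):
    """Suppress short spikes (e.g. blink artifacts) in a sampled signal
    by replacing each sample with the median of its neighborhood."""
    half = window // 2
    filtered = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        neighborhood = sorted(signal[lo:hi])
        filtered.append(neighborhood[len(neighborhood) // 2])
    return filtered
```

A one-sample spike shorter than half the window is removed entirely, while a sustained step (a real gaze shift) passes through with at most a small alignment shift, which is why a median filter is a common choice over a simple low-pass filter here.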
Use of the eye position interface was found to cause fatigue during computer exercises, but not while controlling the MANUS manipulator.
This study has demonstrated the feasibility of the approach, and the issues it identified lay the groundwork for future research into the effective use of eye position information to control rehabilitation robots.