Moving and Adapting with a Learning Exoskeleton
Hosea Siu, AA Graduate Student, MVL
This talk covers the implementation, operation, and consequences of intelligent exoskeletons – teammates that learn and adapt to the human operators to whom they are physically coupled. First, we showed the feasibility of using machine learning methods to recognize user intent in a pick-and-place task from surface electromyography (sEMG) signals collected over several sessions in which the sEMG sensors were repositioned. Second, we designed an exoskeleton system that used a learning-from-demonstration method to train individual sEMG intent mappings that are robust to non-specific sensor placement, subdermal muscle shifts, and user contact with both external objects and the exoskeleton itself. Third, we conducted a series of pick-and-place experiments in which subjects wore the exoskeleton and used three different controllers.
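To give a flavor of the intent-recognition step, here is a minimal sketch of classifying user intent from windowed sEMG features. The channel count, intent labels, RMS features, and nearest-centroid classifier are all illustrative assumptions, not the actual method used in this work; the point is that a lightweight per-user model can be retrained quickly when sensors shift between sessions.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS = 8                          # assumed electrode count (illustrative)
INTENTS = ["reach", "grasp", "release"]  # hypothetical intent labels

def rms_features(window):
    """Root-mean-square amplitude per channel, a common sEMG feature."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def simulate_window(intent_idx):
    """Stand-in for recorded sEMG: each intent biases one channel's activity."""
    base = rng.normal(0.0, 0.1, size=(200, N_CHANNELS))
    base[:, intent_idx] += rng.normal(1.0, 0.1, size=200)
    return base

# Build a small labeled training set of feature vectors.
X = np.array([rms_features(simulate_window(i % 3)) for i in range(60)])
y = np.array([i % 3 for i in range(60)])

# Nearest-centroid classifier: needs little data per session, so the
# per-user mapping can be re-learned after sensors are repositioned.
centroids = np.array([X[y == k].mean(axis=0) for k in range(3)])

def predict(features):
    return INTENTS[int(np.argmin(np.linalg.norm(centroids - features, axis=1)))]

print(predict(rms_features(simulate_window(1))))
```

A real pipeline would replace the simulated windows with streaming sensor data and a richer feature set, but the session-by-session retraining pattern is the same.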
We found that learning from demonstration allows the exoskeleton to anticipate user movement, that task performance with sEMG control is comparable to performance with force control, and that users adapt at the level of sEMG signaling when using an sEMG-based controller. We also found significant differences in users' perception of team fluency depending on the order in which they experienced the different controllers. These results have implications for future exoskeleton controller design and for procedures for training human-exoskeleton teams.