Intention Recognition Method for Sit-to-Stand and Stand-to-Sit from Electromyogram Signals for Overground Lower-Limb Rehabilitation Robots
- 정상훈; 이종민; 김승종; 황요하; 안진웅
- Issue Date
- IEEE/ASME International Conference on Advanced Intelligent Mechatronics, pp. 418-421
- This paper presents a framework for classifying sit-to-stand and stand-to-sit motions from just two channels of EMG signals taken from the left leg. The proposed framework uses linear discriminant analysis (LDA) as the classifier and a multi-window feature extraction approach termed Consecutive Time-Windowed Feature Extraction (CTFE). We present preliminary results from two healthy subjects as a proof of concept. For both tested subjects, predictive accuracies exceeded 90%. The results show promise for a framework capable of recognizing the user's intention to perform sit-to-stand and stand-to-sit transitions. Potential applications include rehabilitation robots for patients with hemiparesis and exoskeleton control.
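The abstract does not specify the features or window parameters used in CTFE, but the general pattern of multi-window EMG feature extraction followed by LDA can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the mean-absolute-value (MAV) feature, the window count and length, and the two-class Fisher LDA formulation are all assumptions chosen for clarity.

```python
import numpy as np

def ctfe_features(emg, n_windows=3, win_len=50):
    """Sketch of consecutive time-windowed feature extraction (CTFE):
    concatenate per-channel MAV features from consecutive windows.
    emg: array of shape (n_samples, n_channels); window sizes are
    illustrative, not from the paper."""
    feats = []
    for w in range(n_windows):
        seg = emg[w * win_len:(w + 1) * win_len]
        feats.append(np.mean(np.abs(seg), axis=0))  # MAV per channel
    return np.concatenate(feats)  # shape: (n_windows * n_channels,)

def fit_lda(X0, X1):
    """Two-class Fisher LDA: returns weight vector w and threshold b.
    X0, X1: feature matrices (n_trials, n_features) for each class."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter; small ridge term keeps the solve stable.
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m0)
    b = w @ (m0 + m1) / 2  # threshold at the midpoint of projected means
    return w, b

def predict(w, b, x):
    """Classify one feature vector: 1 (e.g., sit-to-stand) if the
    projection exceeds the threshold, else 0 (e.g., stand-to-sit)."""
    return int(w @ x > b)
```

In this sketch, a two-channel EMG segment of 150 samples yields a 6-dimensional feature vector (3 windows x 2 channels), which the LDA projects onto a single discriminant axis; the actual features, channels, and classifier training in the paper may differ.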
- Appears in Collections:
- KIST Publication > Conference Paper
- Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.