Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Jung, Dawoon | - |
dc.contributor.author | Nguyen, Mau Dung | - |
dc.contributor.author | Park, Mina | - |
dc.contributor.author | Kim, Jinwook | - |
dc.contributor.author | Mun, Kyung-Ryoul | - |
dc.date.accessioned | 2024-01-19T18:00:25Z | - |
dc.date.available | 2024-01-19T18:00:25Z | - |
dc.date.created | 2021-09-05 | - |
dc.date.issued | 2020-04 | - |
dc.identifier.issn | 1534-4320 | - |
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/118783 | - |
dc.description.abstract | Human gait has served as a useful barometer of health. Existing studies on the automatic categorization of gait have been limited to binary classification of pathological versus non-pathological gait and have provided low accuracy in multi-class classification. This study proposes a novel approach that can multi-classify gaits with no visually discernible differences in characteristics. Sixty-nine participants without gait disturbance were recruited: 29 were semi-professional athletes, 19 were ordinary people, and the remaining 21 had subtle foot deformities. Three-axis acceleration and three-axis angular velocity signals were measured during walking using inertial measurement units attached to the foot, shank, thigh, and posterior pelvis. Gait spectrograms were obtained by applying time-frequency analyses to the lower-body movement signals measured over one stride and were used to train deep convolutional neural network-based classifiers. Four-fold cross-validation was applied to 80% of the total samples, and the remaining 20% were used as test data. The foot, shank, and thigh spectrograms enabled complete classification of the three groups without requiring sophisticated feature engineering. This is the first study to employ a spectrographic approach to gait classification and to achieve reliable multi-class classification of gaits with no observable differences in characteristics using deep convolutional neural networks. | - |
dc.language | English | - |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | - |
dc.subject | DISORDERS | - |
dc.subject | PARAMETERS | - |
dc.subject | SYSTEM | - |
dc.title | Multiple Classification of Gait Using Time-Frequency Representations and Deep Convolutional Neural Networks | - |
dc.type | Article | - |
dc.identifier.doi | 10.1109/TNSRE.2020.2977049 | - |
dc.description.journalClass | 1 | - |
dc.identifier.bibliographicCitation | IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, v.28, no.4, pp.997 - 1005 | - |
dc.citation.title | IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING | - |
dc.citation.volume | 28 | - |
dc.citation.number | 4 | - |
dc.citation.startPage | 997 | - |
dc.citation.endPage | 1005 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.identifier.wosid | 000527793800025 | - |
dc.identifier.scopusid | 2-s2.0-85083249253 | - |
dc.relation.journalWebOfScienceCategory | Engineering, Biomedical | - |
dc.relation.journalWebOfScienceCategory | Rehabilitation | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Rehabilitation | - |
dc.type.docType | Article | - |
dc.subject.keywordPlus | DISORDERS | - |
dc.subject.keywordPlus | PARAMETERS | - |
dc.subject.keywordPlus | SYSTEM | - |
dc.subject.keywordAuthor | Foot | - |
dc.subject.keywordAuthor | Spectrogram | - |
dc.subject.keywordAuthor | Legged locomotion | - |
dc.subject.keywordAuthor | Continuous wavelet transforms | - |
dc.subject.keywordAuthor | Acceleration | - |
dc.subject.keywordAuthor | Angular velocity | - |
dc.subject.keywordAuthor | Gait classification | - |
dc.subject.keywordAuthor | deep convolutional neural network | - |
dc.subject.keywordAuthor | spectrogram | - |
dc.subject.keywordAuthor | short-time Fourier transform | - |
dc.subject.keywordAuthor | continuous wavelet transform | - |
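The abstract describes a pipeline that turns per-stride IMU signals into time-frequency images and feeds them to a deep convolutional network. Below is a minimal sketch of the spectrogram step, assuming synthetic input and illustrative parameters (sampling rate, STFT window, hop); the paper's actual recordings and settings are not given in this record.

```python
# Minimal sketch: per-stride gait spectrograms from 6-axis IMU data.
# FS, nperseg, and noverlap are illustrative assumptions, not the
# authors' settings; the input below is synthetic, not study data.
import numpy as np
from scipy.signal import stft

FS = 100  # assumed IMU sampling rate in Hz

def stride_spectrogram(stride: np.ndarray, fs: int = FS) -> np.ndarray:
    """Stack STFT magnitudes of each channel of one stride.

    stride: (n_samples, 6) array holding 3-axis acceleration and
            3-axis angular velocity from one sensor location.
    Returns a (6, n_freqs, n_frames) array, an image-like input
    suitable for a convolutional classifier.
    """
    specs = []
    for ch in range(stride.shape[1]):
        _, _, Z = stft(stride[:, ch], fs=fs, nperseg=32, noverlap=24)
        specs.append(np.abs(Z))
    return np.stack(specs)

# Example on a fake one-second stride (100 samples, 6 channels).
rng = np.random.default_rng(0)
fake_stride = rng.standard_normal((FS, 6))
spec = stride_spectrogram(fake_stride)
print(spec.shape)  # (6, 17, n_frames)
```

Per the author keywords, the paper also uses a continuous wavelet transform; in this sketch that would amount to swapping the STFT call for, e.g., `pywt.cwt` from PyWavelets. The resulting channel-stacked spectrograms would then train a deep CNN, evaluated as the abstract states: 4-fold cross-validation on 80% of the samples, with the remaining 20% held out as test data.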