Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ankhzaya, Jamsrandorj | - |
dc.contributor.author | Jung, Da Woon | - |
dc.contributor.author | Konki, Sravan Kumar | - |
dc.contributor.author | Arshad, Muhammad Zeeshan | - |
dc.contributor.author | Lim, Hwasup | - |
dc.contributor.author | Kim, Jinwook | - |
dc.contributor.author | Mun, Kyung Ryoul | - |
dc.date.accessioned | 2024-01-12T06:32:13Z | - |
dc.date.available | 2024-01-12T06:32:13Z | - |
dc.date.created | 2023-11-28 | - |
dc.date.issued | 2023-11 | - |
dc.identifier.issn | 1532-0464 | - |
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/79738 | - |
dc.description.abstract | Accurate gait detection is crucial for utilizing the ample health information embedded in gait. Vision-based approaches to gait detection have emerged as an alternative to the exacting sensor-based approaches, but their application has been rather limited by complicated feature-engineering processes and a heavy reliance on lateral views. This study therefore aimed to find a simple vision-based approach that is both view-independent and accurate. A total of 22 participants performed six different actions representing standard and peculiar gaits, and the videos acquired from these actions were used as the input to the deep learning networks. Four networks, including a 2D convolutional neural network and an attention-based deep learning network, were trained on standard gaits, and their detection performance for both standard and peculiar gaits was assessed using measures including F1-scores. While all networks achieved remarkable detection performance, the CNN-Transformer network performed best for both standard and peculiar gaits. Little deviation across action speeds or view angles was found. The study is expected to contribute to the wider application of vision-based approaches to gait detection and gait-based health monitoring both at home and in clinical settings. | - |
dc.language | English | - |
dc.publisher | Academic Press | - |
dc.title | View-independent gait events detection using CNN-transformer hybrid network | - |
dc.type | Article | - |
dc.identifier.doi | 10.1016/j.jbi.2023.104524 | - |
dc.description.journalClass | 1 | - |
dc.identifier.bibliographicCitation | Journal of Biomedical Informatics, v.147 | - |
dc.citation.title | Journal of Biomedical Informatics | - |
dc.citation.volume | 147 | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.identifier.wosid | 001101335600001 | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Interdisciplinary Applications | - |
dc.relation.journalWebOfScienceCategory | Medical Informatics | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Medical Informatics | - |
dc.type.docType | Article | - |
dc.subject.keywordAuthor | View-independent method | - |
dc.subject.keywordAuthor | Gait events detection | - |
dc.subject.keywordAuthor | Deep learning | - |
dc.subject.keywordAuthor | Convolutional neural network | - |
dc.subject.keywordAuthor | Attention-based network | - |
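
The abstract above describes a CNN-Transformer hybrid that detects gait events directly from video, independent of view angle. The PyTorch sketch below illustrates the general idea of such a hybrid: a 2D CNN extracts a feature vector from each frame, a Transformer encoder attends across the time dimension, and a frame-wise head classifies gait events. This is a minimal illustrative sketch only; the layer sizes, the raw-RGB input format, and the three event classes are assumptions and are not taken from the paper's implementation.

```python
# Minimal sketch of a CNN-Transformer hybrid for frame-wise gait event detection.
# All hyperparameters and the input representation are illustrative assumptions,
# not the authors' published architecture.
import torch
import torch.nn as nn

class CNNTransformerGait(nn.Module):
    """Per-frame 2D CNN features + temporal Transformer encoder + frame-wise classifier."""
    def __init__(self, num_events=3, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        # Small 2D CNN applied independently to every frame.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),          # -> (N, 32)
            nn.Linear(32, d_model),
        )
        # Transformer encoder attends over the sequence of frame features.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.temporal = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, num_events)

    def forward(self, video):                 # video: (B, T, 3, H, W)
        b, t = video.shape[:2]
        x = self.cnn(video.flatten(0, 1))     # (B*T, d_model) per-frame features
        x = x.view(b, t, -1)                  # (B, T, d_model)
        x = self.temporal(x)                  # self-attention across time
        return self.head(x)                   # (B, T, num_events) frame-wise logits

# Example: a 2-second clip at 30 fps with 64x64 frames.
logits = CNNTransformerGait()(torch.randn(1, 60, 3, 64, 64))
print(logits.shape)  # torch.Size([1, 60, 3])
```

The split of spatial (CNN) and temporal (attention) processing is what lets such a model label each frame with a gait event while still using context from the whole clip; the actual network in the article may differ in backbone, input features, and output labels.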