View-independent gait events detection using CNN-transformer hybrid network
- Authors
- Ankhzaya, Jamsrandorj; Jung, Da Woon; Konki Sravan Kumar; Arshad, Muhammad Zeeshan; Lim, Hwasup; Kim, Jinwook; Mun, Kyung Ryoul
- Issue Date
- 2023-11
- Publisher
- Academic Press
- Citation
- Journal of Biomedical Informatics, v.147
- Abstract
- Accurate gait detection is crucial for utilizing the ample health information embedded in gait. Vision-based approaches for gait detection have emerged as an alternative to the exacting sensor-based approaches, but their application has been rather limited due to complicated feature engineering processes and heavy reliance on lateral views. Thus, this study aimed to find a simple vision-based approach that is view-independent and accurate. A total of 22 participants performed six different actions representing standard and peculiar gaits, and the videos acquired from these actions were used as the input of the deep learning networks. Four networks, including a 2D convolutional neural network and an attention-based deep learning network, were trained with standard gaits, and their detection performance for both standard and peculiar gaits was assessed using measures including F1-scores. While all networks achieved remarkable detection performance, the CNN-Transformer network achieved the best performance for both standard and peculiar gaits. Little deviation by the speed of actions or view angles was found. The study is expected to contribute to the wider application of vision-based approaches in gait detection and gait-based health monitoring both at home and in clinical settings.
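The abstract reports detection performance via F1-scores. As a minimal sketch of how such a per-frame F1 might be computed, assuming (hypothetically) that gait events are encoded as binary labels per video frame — the paper's actual evaluation protocol is not detailed here:

```python
def f1_score(pred, true):
    """Per-frame F1 for binary event labels (1 = gait event, 0 = no event).

    Illustrative only; the labeling scheme is an assumption, not the
    paper's published protocol.
    """
    tp = sum(p == t == 1 for p, t in zip(pred, true))       # true positives
    fp = sum(p == 1 and t == 0 for p, t in zip(pred, true)) # false positives
    fn = sum(p == 0 and t == 1 for p, t in zip(pred, true)) # false negatives
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

For example, a predicted sequence that flags one spurious frame against a single true event frame yields precision 0.5 and recall 1.0, hence F1 of 2/3.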
- Keywords
- View-independent method; Gait events detection; Deep learning; Convolutional neural network; Attention-based network
- ISSN
- 1532-0464
- URI
- https://pubs.kist.re.kr/handle/201004/79738
- DOI
- 10.1016/j.jbi.2023.104524
- Appears in Collections:
- KIST Article > 2023