Full metadata record

DC Field | Value | Language
dc.contributor.author | Chao, Vanyi | -
dc.contributor.author | Park, Sungho | -
dc.contributor.author | Jamsrandorj, Ankhzaya | -
dc.contributor.author | Jung, Dawoon | -
dc.contributor.author | Kim, Jinwook | -
dc.contributor.author | Mun, Kyung-Ryoul | -
dc.date.accessioned | 2026-03-25T05:30:07Z | -
dc.date.available | 2026-03-25T05:30:07Z | -
dc.date.created | 2026-03-24 | -
dc.date.issued | 2026-04 | -
dc.identifier.issn | 0263-2241 | -
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/154450 | -
dc.description.abstract | Accurate measurement of lower-limb joint angles is essential for gait analysis, rehabilitation, and early detection of degenerative disease. Current marker-based and wearable-sensor approaches require not only complex setups but also intricate calibrations, hindering their widespread application in daily life. This study proposes vision-based estimation of hip, knee, and ankle angles directly from single-view RGB frames. An end-to-end model integrating a convolutional neural network for spatial feature extraction with a transformer module for capturing temporal gait dynamics was built, and a novel loss function that jointly penalizes angle and angular-velocity errors improved the overall consistency and accuracy of the estimation. The performance of the developed model was compared with that of a two-stage 3D pose method, and its robustness was examined across varying camera perspectives, walking speeds, and abnormal gait patterns. The proposed model achieved intraclass correlation coefficients above 0.92 across all viewing angles, walking speeds, and gait patterns, and it outperformed the two-stage 3D pose baseline. The model not only eliminates the need for complex setups and sensors but is also less prone to the inaccuracies of the two-stage pipeline. When applied in clinical and home-based environments, it can offer a cost-effective and reliable solution for gait assessment. | -
dc.language | English | -
dc.publisher | Elsevier BV | -
dc.title | Lower limb joint angle estimation using a single RGB camera | -
dc.type | Article | -
dc.identifier.doi | 10.1016/j.measurement.2026.120772 | -
dc.description.journalClass | 1 | -
dc.identifier.bibliographicCitation | Measurement: Journal of the International Measurement Confederation, v.270 | -
dc.citation.title | Measurement: Journal of the International Measurement Confederation | -
dc.citation.volume | 270 | -
dc.description.isOpenAccess | N | -
dc.description.journalRegisteredClass | scie | -
dc.description.journalRegisteredClass | scopus | -
dc.identifier.wosid | 001698959300001 | -
dc.identifier.scopusid | 2-s2.0-105030649805 | -
dc.relation.journalWebOfScienceCategory | Engineering, Multidisciplinary | -
dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | -
dc.relation.journalResearchArea | Engineering | -
dc.relation.journalResearchArea | Instruments & Instrumentation | -
dc.type.docType | Article | -
dc.subject.keywordAuthor | CNN-transformer architecture | -
dc.subject.keywordAuthor | Joint angle estimation | -
dc.subject.keywordAuthor | Lower limb kinematics | -
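The abstract mentions a loss function that jointly penalizes angle and angular-velocity errors. The paper's exact formulation is not given in this record; a minimal sketch of one such composite loss, assuming mean-squared errors, finite-difference velocities, a hypothetical 30 fps frame rate, and a hypothetical weighting factor `lam`, might look like:

```python
import numpy as np

def joint_angle_velocity_loss(pred, target, dt=1 / 30, lam=0.5):
    """Composite loss: MSE on joint angles plus a weighted MSE on their
    finite-difference angular velocities.

    pred, target: arrays of shape (T, J) -- T frames, J joint angles.
    dt: frame interval in seconds (hypothetical 30 fps camera).
    lam: weight on the velocity term (hypothetical value).
    """
    pred = np.asarray(pred, dtype=float)
    target = np.asarray(target, dtype=float)
    # Angle term: mean-squared error over all frames and joints.
    angle_err = np.mean((pred - target) ** 2)
    # Velocity term: first-order finite differences between frames.
    v_pred = np.diff(pred, axis=0) / dt
    v_tgt = np.diff(target, axis=0) / dt
    vel_err = np.mean((v_pred - v_tgt) ** 2)
    return angle_err + lam * vel_err
```

The velocity term discourages frame-to-frame jitter that a pure angle loss would tolerate, which is consistent with the abstract's claim of improved temporal consistency.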
Appears in Collections:
KIST Article > 2026
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
