Full metadata record

DC Field	Value	Language
dc.contributor.author	Shin, Hyesoo	-
dc.contributor.author	Kim, Sangdo	-
dc.contributor.author	Kim, Sunwoo	-
dc.contributor.author	Lee, Jongwon	-
dc.contributor.author	Kim, Jinkyu	-
dc.contributor.author	Kim, KangGeon	-
dc.date.accessioned	2025-09-17T02:02:00Z	-
dc.date.available	2025-09-17T02:02:00Z	-
dc.date.created	2025-09-16	-
dc.date.issued	2025-10	-
dc.identifier.uri	https://pubs.kist.re.kr/handle/201004/153166	-
dc.description.abstract	Lower limb exoskeletons assist users by supporting joint movements. Since joint motion patterns vary with how the user moves, accurately recognizing the type of movement (locomotion mode) is crucial for controlling the exoskeleton and ensuring user safety. Inspired by how humans use multiple types of sensory information to control movement, we developed a multi-modal locomotion mode recognition (LMR) system that uses both mechanical and visual sensor data to identify locomotion modes. Our approach uses two fusion methods: intermediate fusion, which combines the data at the feature level, and late fusion, which integrates the sensor data by averaging the recognition results from each sensor. Fusing these two modalities improved the prediction accuracy by an average of 11.7% on the test data. Through comparisons with uni-modal LMR systems that rely on a single type of sensor data for locomotion mode recognition, we found that the improved performance of the multi-modal LMR system is due to the visual information's ability to generalize different gait patterns across users and the mechanical sensor data's consistency within the same classes.	-
dc.language	English	-
dc.publisher	Institute of Electrical and Electronics Engineers Inc.	-
dc.title	Multi-Modal Locomotion Mode Recognition in the Real World for Robotic Hip Complex Exoskeletons	-
dc.type	Article	-
dc.identifier.doi	10.1109/LRA.2025.3597482	-
dc.description.journalClass	1	-
dc.identifier.bibliographicCitation	IEEE Robotics and Automation Letters, v.10, no.10, pp.9718 - 9725	-
dc.citation.title	IEEE Robotics and Automation Letters	-
dc.citation.volume	10	-
dc.citation.number	10	-
dc.citation.startPage	9718	-
dc.citation.endPage	9725	-
dc.description.isOpenAccess	N	-
dc.description.journalRegisteredClass	scie	-
dc.description.journalRegisteredClass	scopus	-
dc.identifier.wosid	001554444400023	-
dc.identifier.scopusid	2-s2.0-105013293270	-
dc.relation.journalWebOfScienceCategory	Robotics	-
dc.relation.journalResearchArea	Robotics	-
dc.type.docType	Article	-
dc.subject.keywordPlus	GAIT	-
dc.subject.keywordAuthor	Wearable robotics	-
dc.subject.keywordAuthor	sensor fusion	-
dc.subject.keywordAuthor	embedded systems for robotic and automation	-
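
The abstract above describes two fusion strategies: intermediate fusion, which combines the two modalities at the feature level, and late fusion, which averages the per-sensor recognition results. Below is a minimal NumPy sketch of the difference between the two; the feature sizes, number of locomotion modes, and stand-in linear classifiers are all hypothetical illustrations, not the networks or parameters used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality feature vectors for one time window
# (mechanical sensors, e.g. IMU/encoders, and a visual sensor).
mech_feat = rng.normal(size=32)  # assumed mechanical feature size
vis_feat = rng.normal(size=64)   # assumed visual feature size

N_MODES = 5  # assumed number of locomotion modes

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Stand-in linear classifiers; in practice these would be learned networks.
W_fused = rng.normal(size=(N_MODES, 32 + 64))
W_mech = rng.normal(size=(N_MODES, 32))
W_vis = rng.normal(size=(N_MODES, 64))

# Intermediate fusion: concatenate the modality features,
# then run a single classifier on the fused feature vector.
fused = np.concatenate([mech_feat, vis_feat])
p_intermediate = softmax(W_fused @ fused)

# Late fusion: classify each modality separately, then average
# the per-modality class probabilities.
p_late = (softmax(W_mech @ mech_feat) + softmax(W_vis @ vis_feat)) / 2.0

print("intermediate fusion mode:", int(np.argmax(p_intermediate)))
print("late fusion mode:", int(np.argmax(p_late)))
```

The key design difference: intermediate fusion lets a single classifier learn cross-modal interactions between features, while late fusion keeps each modality's recognizer independent and only merges their output probabilities.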
Appears in Collections:
KIST Article > Others
Files in This Item:
There are no files associated with this item.