Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chu, Jun-Uk | - |
dc.contributor.author | Song, Kang-Il | - |
dc.contributor.author | Han, Sungmin | - |
dc.contributor.author | Lee, Soo Hyun | - |
dc.contributor.author | Kang, Ji Yoon | - |
dc.contributor.author | Hwang, Dosik | - |
dc.contributor.author | Suh, Jun-Kyo Francis | - |
dc.contributor.author | Choi, Kuiwon | - |
dc.contributor.author | Youn, Inchan | - |
dc.date.accessioned | 2024-01-20T12:31:24Z | - |
dc.date.available | 2024-01-20T12:31:24Z | - |
dc.date.created | 2021-09-05 | - |
dc.date.issued | 2013-05 | - |
dc.identifier.issn | 0967-3334 | - |
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/128096 | - |
dc.description.abstract | Cutaneous afferent activities recorded by a nerve cuff electrode have been used to detect the stance phase in a functional electrical stimulation system for foot drop correction. However, the implantation procedure was difficult, as the cuff electrode had to be located on the distal branches of a multi-fascicular nerve to exclude muscle afferent and efferent activities. This paper proposes a new gait phase detection scheme that can be applied to a proximal nerve root containing cutaneous afferent fibers as well as muscle afferent and efferent fibers. To test the feasibility of this scheme, electroneurogram (ENG) signals were measured from the rat sciatic nerve during treadmill walking at several speeds, and the signal properties of the sciatic nerve were analyzed for comparison with kinematic data from the ankle joint. On the basis of these experiments, a wavelet packet transform was tested to define a feature vector from the sciatic ENG signals according to the gait phases. We also propose a Gaussian mixture model (GMM) classifier and investigate whether it can successfully discriminate the feature vectors into the stance and swing phases. Although the rectified bin-integrated values showed no significant differences between the stance and swing phases, the sciatic ENG signals could be reliably classified using the proposed wavelet packet transform and GMM classification methods. (An illustrative sketch of this wavelet packet and GMM pipeline follows the metadata table.) | - |
dc.language | English | - |
dc.publisher | IOP PUBLISHING LTD | - |
dc.subject | EVENT DETECTION | - |
dc.subject | CLASSIFICATION | - |
dc.subject | INFORMATION | - |
dc.subject | PATTERN | - |
dc.subject | SCHEME | - |
dc.title | Gait phase detection from sciatic nerve recordings in functional electrical stimulation systems for foot drop correction | - |
dc.type | Article | - |
dc.identifier.doi | 10.1088/0967-3334/34/5/541 | - |
dc.description.journalClass | 1 | - |
dc.identifier.bibliographicCitation | PHYSIOLOGICAL MEASUREMENT, v.34, no.5, pp.541 - 565 | - |
dc.citation.title | PHYSIOLOGICAL MEASUREMENT | - |
dc.citation.volume | 34 | - |
dc.citation.number | 5 | - |
dc.citation.startPage | 541 | - |
dc.citation.endPage | 565 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.identifier.wosid | 000317953800007 | - |
dc.identifier.scopusid | 2-s2.0-84876911855 | - |
dc.relation.journalWebOfScienceCategory | Biophysics | - |
dc.relation.journalWebOfScienceCategory | Engineering, Biomedical | - |
dc.relation.journalWebOfScienceCategory | Physiology | - |
dc.relation.journalResearchArea | Biophysics | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Physiology | - |
dc.type.docType | Article | - |
dc.subject.keywordPlus | EVENT DETECTION | - |
dc.subject.keywordPlus | CLASSIFICATION | - |
dc.subject.keywordPlus | INFORMATION | - |
dc.subject.keywordPlus | PATTERN | - |
dc.subject.keywordPlus | SCHEME | - |
dc.subject.keywordAuthor | gait phase detection | - |
dc.subject.keywordAuthor | nerve cuff electrode | - |
dc.subject.keywordAuthor | functional electrical stimulation | - |
dc.subject.keywordAuthor | foot drop correction | - |
dc.subject.keywordAuthor | Gaussian mixture model | - |
dc.subject.keywordAuthor | iterative local search algorithm | - |
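
The abstract above describes extracting features from sciatic ENG signals with a wavelet packet transform and classifying them into stance and swing phases with a Gaussian mixture model. The following is a minimal, hypothetical sketch of that general pipeline, not the authors' implementation: it assumes PyWavelets and scikit-learn, synthesizes toy windows in place of real ENG recordings, and the wavelet (`db4`), decomposition level, window length, and two-component GMM are illustrative choices only; the paper's GMM training procedure (including the iterative local search algorithm named in the author keywords) is not reproduced here.

```python
# Minimal sketch (not the paper's implementation): wavelet packet features from
# windowed ENG-like signals plus a two-component Gaussian mixture model classifier.
# Assumptions: PyWavelets and scikit-learn are available; the toy "stance"/"swing"
# windows below are synthetic stand-ins for real sciatic ENG recordings.
import numpy as np
import pywt
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
FS = 10_000          # hypothetical sampling rate (Hz)
WIN = 512            # hypothetical samples per analysis window

def toy_window(phase: str) -> np.ndarray:
    """Synthetic window: 'stance' windows get extra band-limited activity."""
    x = rng.normal(0.0, 1.0, WIN)
    if phase == "stance":
        t = np.arange(WIN) / FS
        x += 0.8 * np.sin(2 * np.pi * 1500 * t)
    return x

def wp_features(x: np.ndarray, wavelet: str = "db4", level: int = 3) -> np.ndarray:
    """Feature vector = log-energy of each terminal wavelet packet node."""
    wp = pywt.WaveletPacket(data=x, wavelet=wavelet, mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    return np.array([np.log(np.sum(n.data ** 2) + 1e-12) for n in nodes])

# Build a small labeled toy set of windows (stand-ins for real ENG segments).
phases = ["stance"] * 100 + ["swing"] * 100
X = np.vstack([wp_features(toy_window(p)) for p in phases])
y = np.array([1 if p == "stance" else 0 for p in phases])

# Fit an unsupervised two-component GMM on the feature vectors, then map each
# mixture component to the majority phase label among the windows it captures.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
comp = gmm.predict(X)
comp_to_phase = {c: int(round(y[comp == c].mean())) for c in np.unique(comp)}
pred = np.array([comp_to_phase[c] for c in comp])
print(f"training-set agreement with labels: {np.mean(pred == y):.2f}")
```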
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.