Gait Estimation from Anatomical Foot Parameters Measured by a Foot Feature Measurement System using a Deep Neural Network Model

Authors
Mun, Kyung-Ryoul; Song, Gyuwon; Chun, Sungkuk; Kim, Jinwook
Issue Date
2018-06-29
Publisher
Nature Publishing Group
Citation
SCIENTIFIC REPORTS, v.8
Abstract
An accurate and credible measurement of human gait is essential in multiple areas of medical science and rehabilitation, yet the methods currently available are not only arduous but also costly. Researchers who have investigated the relationship between foot and gait parameters have found the two to be closely interrelated and have suggested that measuring foot characteristics could be an alternative to the strenuous quantification currently in use. This study aims to verify the potential of foot characteristics for predicting actual gait temporo-spatial parameters and to develop a deep neural network (DNN) model that can estimate and quantify these parameters from foot characteristics. The foot features of 42 subjects, measured in sitting, standing, and one-leg-standing conditions, were used as the input data, and gait temporo-spatial parameters at fast, normal, and slow walking speeds were set as the output of the DNN regressor. With a prediction accuracy of 95% or higher, the feasibility of the developed model was verified. This study may be the first attempt at experimental verification of foot features serving as predictors of individual gait. Once certain limitations are overcome, the DNN regressor will help researchers enlarge the data pool with less labor and expense.
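
The abstract describes a DNN regressor that maps measured foot features to gait temporo-spatial parameters. Below is a minimal sketch of such a regressor in PyTorch; the feature and output counts, layer sizes, and training settings are illustrative assumptions, not the architecture or hyperparameters reported in the paper.

# Minimal sketch of a DNN regressor mapping foot features to gait
# temporo-spatial parameters. Network shape and sizes are assumed
# for illustration, not taken from the published model.
import torch
import torch.nn as nn

N_FOOT_FEATURES = 30  # assumed: foot features pooled over the sitting,
                      # standing, and one-leg-standing conditions
N_GAIT_PARAMS = 9     # assumed: temporo-spatial parameters at fast,
                      # normal, and slow walking speeds

model = nn.Sequential(
    nn.Linear(N_FOOT_FEATURES, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, N_GAIT_PARAMS),
)

# Synthetic stand-in data for the 42 subjects; real use would load the
# measured foot features X and gait parameters y instead.
X = torch.randn(42, N_FOOT_FEATURES)
y = torch.randn(42, N_GAIT_PARAMS)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(500):  # plain full-batch training loop
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(f"final training MSE: {loss.item():.4f}")

In practice the inputs would be standardized and the model evaluated on held-out subjects (e.g. leave-one-subject-out) before reading anything into the fit, since 42 subjects is a small sample for a multi-layer network.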
Keywords
GROUND REACTION FORCE; PREDICTION; KINEMATICS; REGRESSION; MOMENT; SENSOR
ISSN
2045-2322
URI
https://pubs.kist.re.kr/handle/201004/121234
DOI
10.1038/s41598-018-28222-2
Appears in Collections:
KIST Article > 2018
Files in This Item:
There are no files associated with this item.