The analysis of gait patterns using a color camera and computer vision
Keywords: gait analysis, Edinburgh Visual Gait Scale, computer vision, OpenPose.

Abstract
Objective of the study was to validate an approach for determining human walking metrics using neural networks and computer vision techniques.
Methods and structure of the study. The study involved the following steps:
- Capturing video footage with a single RGB camera.
- Computing the coordinates of key body points with the OpenPose neural network and assembling a dataset (a loading sketch is given after this list).
- Removing artifacts from the resulting dataset with the Hodrick-Prescott filter (also sketched below).
- Identifying the walking cycle.
- Calculating the walking parameters.
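The first two steps above can be illustrated in code. OpenPose run with the --write_json flag emits one JSON file per frame; the minimal Python sketch below stacks those files into an array and smooths every coordinate track with the Hodrick-Prescott filter from statsmodels. The file-naming pattern, the smoothing constant LAMB, and the zero-filling of frames with no detection are assumptions, not details reported in the study.

import glob
import json

import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

N_KEYPOINTS = 25   # the BODY_25 model tracks 25 body points
LAMB = 400         # HP smoothing parameter (assumed; tune to the frame rate)

def load_keypoints(json_dir):
    """Stack OpenPose --write_json output into (n_frames, 25, 3): x, y, confidence."""
    frames = []
    for path in sorted(glob.glob(f"{json_dir}/*_keypoints.json")):
        with open(path) as f:
            people = json.load(f)["people"]
        if people:  # take the first detected person
            kp = np.asarray(people[0]["pose_keypoints_2d"]).reshape(N_KEYPOINTS, 3)
        else:       # no detection in this frame: mark all points as missing
            kp = np.full((N_KEYPOINTS, 3), np.nan)
        frames.append(kp)
    return np.stack(frames)

def smooth_tracks(keypoints):
    """Replace each x/y track with its HP-filter trend to suppress jitter."""
    smoothed = keypoints.copy()
    for j in range(N_KEYPOINTS):
        for axis in (0, 1):  # x and y; confidence is left untouched
            series = np.nan_to_num(keypoints[:, j, axis])  # crude gap fill; interpolate in practice
            _, trend = hpfilter(series, lamb=LAMB)  # cycle = jitter, trend = motion
            smoothed[:, j, axis] = trend
    return smoothed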
The subject's walking parameters were recorded and evaluated with the Zebris Rehawalk system (a system configuration from h/p/cosmos) at the Scientific and Practical Center for Pediatric Neurology of the Moscow Department of Health. Video was captured with a Panasonic HC-VX1 camera. The Body_25 model was used to determine the spatial positions of the subject's body parts. The parameters were calculated for the sagittal projection according to the Edinburgh Visual Gait Scale.
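For the sagittal-projection angles scored by the Edinburgh Visual Gait Scale, joint angles can be read directly off the Body_25 keypoints. The sketch below computes knee flexion as the angle at the knee between the thigh and shank segments; the index constants are the standard BODY_25 layout, while the 180-degree offset (0 = straight leg) is an assumed convention, not the study's exact definition.

import numpy as np

R_HIP, R_KNEE, R_ANKLE = 9, 10, 11   # standard BODY_25 right-leg indices

def knee_flexion_deg(frame):
    """frame: (25, 3) array of x, y, confidence for one video frame."""
    hip, knee, ankle = frame[R_HIP, :2], frame[R_KNEE, :2], frame[R_ANKLE, :2]
    thigh = hip - knee                      # knee -> hip segment
    shank = ankle - knee                    # knee -> ankle segment
    cos_a = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    inner = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return 180.0 - inner                    # 0 = straight leg, larger = more flexion

The same three-point construction yields hip and ankle angles by swapping in the corresponding BODY_25 indices.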
Results and conclusions. According to the calculations, the following values were obtained: the duration of a walking cycle is 1.58 ± 0.92 seconds, the step time is 0.78 ± 0.03 seconds, the step length is 41.33 ± 1.92 centimeters, and the walking speed is 1.91 ± 0.09 kilometers per hour. The movement parameters of the subject's hips, knees, and feet were also calculated. Comparison of the obtained values with normative ones revealed slight deviations from the norm in the subject's walking. The accuracy of the calculations was 0.95. The results demonstrate that computer vision can accurately assess the biomechanics of human movement and can serve as an objective monitoring tool in various fields, including sports, medical diagnostics, and rehabilitation. The approach requires no specialized training, equipment, or facilities, making it easier to monitor human movement indicators in any environment.
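The reported cycle, step, and speed values follow from detected gait events. One way to reproduce such values is the coordinate-based method of Zeni et al. (2008, cited below), in which heel strike falls at the peaks of the heel's forward position relative to the pelvis; the stride, step, and speed statistics then follow from the event timestamps. In this sketch the frame rate FPS and the pixel-to-metre scale M_PER_PX are assumptions that would come from the camera settings and a calibration object, not values reported in the study.

import numpy as np
from scipy.signal import find_peaks

MID_HIP, L_HEEL, R_HEEL = 8, 21, 24   # standard BODY_25 indices
FPS = 50.0                            # assumed camera frame rate
M_PER_PX = 0.0025                     # assumed scale from a calibration object

def gait_parameters(kp):
    """kp: (n_frames, 25, 3) smoothed keypoints; returns mean gait metrics."""
    rel_r = kp[:, R_HEEL, 0] - kp[:, MID_HIP, 0]   # right heel vs pelvis, forward axis
    rel_l = kp[:, L_HEEL, 0] - kp[:, MID_HIP, 0]
    min_gap = int(0.5 * FPS)                        # at least 0.5 s between strikes
    hs_r, _ = find_peaks(rel_r, distance=min_gap)   # right heel strikes
    hs_l, _ = find_peaks(rel_l, distance=min_gap)   # left heel strikes

    cycle_s = np.diff(hs_r) / FPS                   # right stride (full cycle) durations
    events = np.sort(np.concatenate([hs_r, hs_l]))  # alternating L/R heel strikes
    step_s = np.diff(events) / FPS                  # step times
    # step length: forward heel-to-heel distance at each heel strike
    step_m = np.abs(kp[events[1:], R_HEEL, 0] - kp[events[1:], L_HEEL, 0]) * M_PER_PX
    return {
        "cycle_time_s": cycle_s.mean(),
        "step_time_s": step_s.mean(),
        "step_length_cm": step_m.mean() * 100.0,
        "speed_kmh": step_m.mean() / step_s.mean() * 3.6,   # m/s -> km/h
    }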
References
Wang H. et al. Markerless gait analysis through a single camera and computer vision. Journal of Biomechanics. 2024. Vol. 165. pp. 112027. https://doi.org/10.1016/j.jbiomech.2024.112027.
Hatamzadeh M. et al. A kinematic-geometric model based on ankles' depth trajectory in frontal plane for gait analysis using a single RGB-D camera. Journal of Biomechanics. 2022. Vol. 145. pp. 111358. https://doi.org/10.1016/j.jbiomech.2022.111358.
Leardini A. et al. Kinematic models of lower limb joints for musculo-skeletal modelling and optimization in gait analysis. Journal of Biomechanics. 2017. Vol. 62. pp. 77-86. https://doi.org/10.1016/j.jbiomech.2017.04.029.
Klopfer-Kramer I. et al. Gait analysis – Available platforms for outcome assessment. Injury. 2020. Vol. 51. Supplement 2. pp. S90-S96. https://doi.org/10.1016/j.injury.2019.11.011.
Kumar M. et al. Gait recognition based on vision systems: A systematic survey. Journal of Visual Communication and Image Representation. 2021. Vol. 75. pp. 103052. https://doi.org/10.1016/j.jvcir.2021.103052.
Ramesh S.H. et al. Automated Implementation of the Edinburgh Visual Gait Score (EVGS) Using OpenPose and Handheld Smartphone Video. Sensors. 2023. Vol. 23. pp. 4839. https://doi.org/10.3390/s23104839.
Zeni J.A. Jr. et al. Two simple methods for determining gait events during treadmill and overground walking using kinematic data. Gait & Posture. 2008. Vol. 27. No 4. pp. 710-714.

Copyright (c) 2025 Theory and Practice of Physical Culture

This work is licensed under a Creative Commons Attribution 4.0 International license.