AUTHOR=Viswakumar Aditya, Rajagopalan Venkateswaran, Ray Tathagata, Gottipati Pranitha, Parimi Chandu TITLE=Development of a Robust, Simple, and Affordable Human Gait Analysis System Using Bottom-Up Pose Estimation With a Smartphone Camera JOURNAL=Frontiers in Physiology VOLUME=12 YEAR=2022 URL=https://www.frontiersin.org/journals/physiology/articles/10.3389/fphys.2021.784865 DOI=10.3389/fphys.2021.784865 ISSN=1664-042X ABSTRACT=
Gait analysis is used in many fields, such as medical diagnostics, osteopathic medicine, and comparative and sports-related biomechanics. The most commonly used systems for capturing gait are camera-based passive-marker systems such as VICON. However, such systems are expensive, and placing reflective markers on subjects can be intrusive and time-consuming. Moreover, marker placement can be difficult for certain rehabilitation patients, such as those with stroke or spinal cord injuries. Recently, markerless systems have been introduced to overcome the challenges of marker-based systems. However, current markerless systems have low accuracy and struggle with subjects in long clothing that hides the gait kinematics. The present work aims to build an affordable, easy-to-use, and accurate gait analysis system that addresses all of these issues. The system in this study uses frames from video taken with a smartphone camera (800 × 600 pixels at an average rate of 30 frames per second). The system uses OpenPose, a real-time 2D multi-person keypoint detection technique. It learns to associate body parts with individuals in the image using Convolutional Neural Networks (CNNs). This bottom-up approach achieves high accuracy and real-time performance regardless of the number of people in the image. The proposed system is called the “OpenPose-based Markerless Gait Analysis System” (OMGait). Ankle, knee, and hip flexion/extension angles were measured using OMGait in 16 healthy volunteers under different lighting and clothing conditions. The measured kinematic values were compared with a standard video camera-based normative dataset and with data from a markerless MS Kinect system. Compared to the normative dataset, the mean absolute error of the joint angles from the proposed system was less than 9° across lighting conditions and less than 11° across clothing conditions. The proposed system is adequate for measuring the kinematics of the ankle, knee, and hip. It also outperforms markerless systems such as MS Kinect, which fail to measure ankle, knee, and hip kinematics under dark and bright lighting conditions and in subjects wearing long robes.
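The abstract does not give implementation details of how flexion/extension angles are derived from the detected keypoints, but the general idea with 2D pose estimators such as OpenPose is to compute the angle subtended at a joint by the two adjacent limb segments. The following is a minimal, illustrative Python sketch of that calculation, assuming hip, knee, and ankle keypoints are available as (x, y) pixel coordinates from OpenPose; the function name and the sample coordinates are hypothetical, not taken from the paper.

```python
import numpy as np

def joint_angle(proximal, joint, distal):
    """Angle in degrees at `joint` formed by the segments to `proximal` and `distal` (2D keypoints)."""
    v1 = np.asarray(proximal, dtype=float) - np.asarray(joint, dtype=float)
    v2 = np.asarray(distal, dtype=float) - np.asarray(joint, dtype=float)
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Illustrative (x, y) pixel coordinates for the right hip, knee, and ankle
# keypoints in a single video frame (not real data from the study).
hip, knee, ankle = (410, 300), (420, 450), (415, 600)

# Knee flexion/extension is commonly reported as the deviation from a straight (180 deg) leg.
knee_flexion = 180.0 - joint_angle(hip, knee, ankle)
print(f"Knee flexion: {knee_flexion:.1f} deg")
```

Repeating this per frame over a gait cycle would yield the joint-angle trajectories that the study compares against the normative dataset and the MS Kinect system.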