Cross-Subject and Cross-Modal Transfer for Generalized Abnormal Gait Pattern Recognition

IEEE Transactions on Neural Networks and Learning Systems (2021)

Abstract
For abnormal gait recognition, pattern-specific features indicating abnormalities are interleaved with the subject-specific differences representing biometric traits. Deep representations are, therefore, prone to overfitting, and the models derived cannot generalize well to new subjects. Furthermore, there is limited availability of abnormal gait data obtained from precise Motion Capture (Mocap) systems because of regulatory issues and the slow adoption of new technologies in health care. On the other hand, data captured from markerless vision sensors or wearable sensors can be obtained in home environments, but noise from such devices may prevent the effective extraction of relevant features. To address these challenges, we propose a cascade of deep architectures that encodes cross-modal and cross-subject transfer for abnormal gait recognition. Cross-modal transfer maps noisy data obtained from RGBD and wearable sensors to accurate 4-D representations of the lower limbs and joints obtained from the Mocap system. Subsequently, cross-subject transfer disentangles subject-specific from abnormal pattern-specific gait features based on a multiencoder autoencoder architecture. To validate the proposed methodology, we obtained multimodal gait data based on a multicamera motion capture system along with synchronized recordings of electromyography (EMG) data and 4-D skeleton data extracted from a single RGBD camera. Classification accuracy was improved significantly in both the Mocap and noisy modalities.
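The disentanglement idea behind the cross-subject stage can be illustrated with a minimal sketch: one encoder for subject-specific traits, one for pattern-specific features, and a shared decoder over the concatenated codes. This is not the authors' implementation; all dimensions, the linear layers, and the tanh nonlinearity are illustrative assumptions standing in for trained deep networks.

```python
import numpy as np

# Hedged sketch of a multi-encoder autoencoder (not the paper's code):
# one encoder captures subject-specific biometric traits, another the
# abnormal-pattern features; a shared decoder reconstructs the gait
# vector from the concatenated latent codes. All sizes are assumptions.

rng = np.random.default_rng(0)

D_IN = 60      # flattened lower-limb joint features per gait cycle (assumed)
D_SUBJ = 8     # subject-specific latent size (assumed)
D_PATT = 8     # pattern-specific latent size (assumed)

# Random linear maps stand in for trained encoder/decoder networks.
W_subj = rng.standard_normal((D_SUBJ, D_IN)) * 0.1
W_patt = rng.standard_normal((D_PATT, D_IN)) * 0.1
W_dec = rng.standard_normal((D_IN, D_SUBJ + D_PATT)) * 0.1

def encode(x):
    """Split a gait vector into subject and pattern latent codes."""
    z_subj = np.tanh(W_subj @ x)
    z_patt = np.tanh(W_patt @ x)
    return z_subj, z_patt

def decode(z_subj, z_patt):
    """Reconstruct the gait vector from the concatenated codes."""
    return W_dec @ np.concatenate([z_subj, z_patt])

x = rng.standard_normal(D_IN)
z_s, z_p = encode(x)
x_hat = decode(z_s, z_p)
print(x_hat.shape)  # (60,)
```

In training, a reconstruction loss on `x_hat` would be combined with objectives that push subject identity into `z_s` and abnormality cues into `z_p`, so that a classifier reading only `z_p` generalizes to unseen subjects.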
Keywords
Algorithms,Biomechanical Phenomena,Biometry,Computer Systems,Deep Learning,Electromyography,Gait,Gait Disorders, Neurologic,Home Environment,Humans,Imaging, Three-Dimensional,Joints,Lower Extremity,Neural Networks, Computer,Pattern Recognition, Automated,Reproducibility of Results,Wearable Electronic Devices