Boosting Inertial-Based Human Activity Recognition With Transformers

IEEE Access (2021)

Cited by 25
Abstract
Activity recognition problems, such as human activity recognition and smartphone location recognition, can improve the accuracy of navigation and healthcare tasks that rely solely on inertial sensors. Current learning-based approaches for activity recognition from inertial data employ convolutional neural networks or long short-term memory architectures. Recently, Transformers were shown to outperform these architectures on sequence analysis tasks. This work presents an activity recognition model based on Transformers, offering an improved and general framework for learning activity recognition tasks. For evaluation, several datasets with more than 27 hours of inertial data recordings collected by 91 users are employed; these datasets represent different user activity scenarios of varying difficulty. The proposed approach consistently achieves better accuracy and generalizes better across all examined datasets and scenarios. A codebase implementing the described framework is available at: https://github.com/yolish/har-with-imu-transformer.
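To illustrate the kind of model the abstract describes, the sketch below shows a Transformer encoder classifying windows of raw IMU data. It is a minimal, illustrative implementation only: the class name IMUTransformerClassifier, the 6-channel input (3-axis accelerometer + 3-axis gyroscope), the window length of 128 samples, the learned class token, and all layer sizes are assumptions, not details taken from the paper. The authors' actual architecture is available in the linked repository.

```python
# Minimal sketch of a Transformer encoder for inertial activity recognition.
# Assumptions (not from the paper): 6 IMU channels, window length 128,
# a learned class token for classification, illustrative layer sizes.
import torch
import torch.nn as nn


class IMUTransformerClassifier(nn.Module):
    def __init__(self, num_channels=6, d_model=64, nhead=8,
                 num_layers=4, num_classes=6, window_len=128):
        super().__init__()
        # Project raw IMU channels into the Transformer embedding space.
        self.input_proj = nn.Linear(num_channels, d_model)
        # Learned positional embeddings for each time step plus the class token.
        self.pos_embed = nn.Parameter(torch.zeros(1, window_len + 1, d_model))
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=128,
            dropout=0.1, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # Classification head applied to the class-token output.
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, x):
        # x: (batch, window_len, num_channels) window of raw IMU samples
        b = x.size(0)
        tokens = self.input_proj(x)                      # (b, T, d_model)
        cls = self.cls_token.expand(b, -1, -1)           # (b, 1, d_model)
        tokens = torch.cat([cls, tokens], dim=1) + self.pos_embed
        encoded = self.encoder(tokens)                   # (b, T+1, d_model)
        return self.head(encoded[:, 0])                  # activity logits


if __name__ == "__main__":
    model = IMUTransformerClassifier()
    dummy = torch.randn(2, 128, 6)   # two windows of 128 six-channel IMU samples
    print(model(dummy).shape)        # torch.Size([2, 6])
```

Self-attention over the whole window lets every time step attend to every other, which is the property the paper leverages over the fixed receptive fields of convolutional models and the sequential processing of LSTMs.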
Keywords
Legged locomotion, Task analysis, Activity recognition, Belts, Stairs, Accelerometers, Magnetic heads, Human activity recognition, smartphone location recognition, inertial sensors, pedestrian dead reckoning, convolutional neural networks, Transformers, sequence analysis