Enhancing Locomotion Recognition with Specialized Features and Map Information via XGBoost

Jiebi Deng, Jingqiu Xu, Zicheng Sun, Danning Li, Hongxuan Guo, Yuanyuan Zhang, Xiaoling Lu

UbiComp/ISWC '23 Adjunct: Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing (2023)

Abstract
The goal of the Sussex-Huawei Locomotion-Transportation (SHL) recognition challenge 2023 is to recognize 8 modes of locomotion and transportation (activities) in a user-independent manner based on motion and GPS sensor data. The main challenges of this competition are sensor diversity, timestamp asynchrony, and the unknown position of the sensors in the test set. Our team, "WinGPT", constructs specialized features such as velocity from the raw dataset and extracts various features from both the time domain and the frequency domain. Additionally, we compute the distance between the user and the nearest places or roads as a feature, using map information obtained from OpenStreetMap. We use a dataset with a total of 202 features to train classical machine learning models such as decision tree, random forest, LightGBM, and XGBoost; among these, the XGBoost model performs best, achieving a macro F1 score of 78.95% on the validation set. Moreover, based on our predictions, we determine that the sensor in the test set is positioned on the hand. After applying a post-processing procedure to the model output, we ultimately achieve a final macro F1 score of 90.86% on the hand-position validation set. In addition, we open-source the code for feature extraction and model training on GitHub: https://github.com/Therebe123/SHL2023.
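The sketch below is a minimal, hypothetical illustration of the pipeline the abstract describes: deriving a velocity feature from consecutive GPS fixes and training an XGBoost classifier scored with macro F1. Column names, hyperparameters, and helper functions are assumptions for illustration only; the authors' actual implementation is in the GitHub repository linked above.

```python
# Illustrative sketch only -- not the authors' released code.
import numpy as np
import pandas as pd
from sklearn.metrics import f1_score
from xgboost import XGBClassifier


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000.0
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dp, dl = np.radians(lat2 - lat1), np.radians(lon2 - lon1)
    a = np.sin(dp / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2) ** 2
    return 2 * r * np.arcsin(np.sqrt(a))


def add_velocity(gps: pd.DataFrame) -> pd.DataFrame:
    """Append a speed feature (m/s) from consecutive fixes.

    Assumes columns 'lat', 'lon', and 'timestamp' (milliseconds).
    """
    d = haversine_m(gps["lat"].shift(), gps["lon"].shift(), gps["lat"], gps["lon"])
    dt = gps["timestamp"].diff() / 1000.0
    gps["velocity"] = (d / dt).fillna(0.0)
    return gps


def train_and_score(X_train, y_train, X_val, y_val):
    """Fit an XGBoost classifier on the feature matrix and report macro F1.

    Hyperparameters are placeholders, not the values used in the paper.
    """
    model = XGBClassifier(n_estimators=500, max_depth=8, learning_rate=0.05)
    model.fit(X_train, y_train)
    pred = model.predict(X_val)
    return f1_score(y_val, pred, average="macro")
```

In practice the feature matrix would hold one row per analysis window with the 202 time-domain, frequency-domain, and map-derived features described in the abstract, and the labels would be the 8 activity classes.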