SmartAct: Energy Efficient and Real-Time Hand-to-Mouth Gesture Detection Using Wearable RGB-T

2022 IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks (BSN) (2022)

Abstract
Researchers have been leveraging wearable cameras to both visually confirm and automatically detect individuals’ eating habits. However, energy-intensive tasks such as continuously collecting and storing RGB images in memory, or running real-time algorithms to automate the detection of eating, greatly impact battery life. Since eating moments are spread sparsely throughout the day, battery drain can be mitigated by recording and processing data only when there is a high likelihood of eating. We present a framework comprising a golf-ball-sized wearable device with a low-power thermal sensor array and a real-time activation algorithm that activates high-energy tasks only when a hand-to-mouth gesture is confirmed by the thermal sensor array. The high-energy tasks tested are turning on the RGB camera (Trigger RGB mode) and running inference on an on-device machine learning model (Trigger ML mode). Our experimental setup involved the design of a wearable camera, 6 participants collecting 18 hours of data with and without eating, the on-device implementation of a feeding-gesture detection algorithm, and measurements of the power savings achieved by our activation method. Our activation algorithm demonstrates an average increase of at least 31.5% in battery lifetime, with a minimal drop in recall (5%) and without degrading eating-detection accuracy (a slight 4.1% increase in F1-score).
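The abstract describes a trigger-based duty-cycling architecture: a cheap, always-on thermal sensor gates the expensive RGB capture and on-device inference. The paper does not include code, so the following is only a minimal sketch of that idea under assumed parameters (sensor resolution, temperature threshold, blob size, cooldown time are all hypothetical, not the authors' values), using a simulated thermal read in place of real hardware I/O.

```python
# Hypothetical sketch of a trigger-based activation loop: poll a low-power
# thermal array continuously and wake the energy-hungry RGB camera / ML
# inference only when a hand-to-mouth gesture is suspected.
import time
import numpy as np

THERMAL_SHAPE = (8, 8)    # assumed low-resolution thermal array
HAND_TEMP_C = 30.0        # assumed threshold separating skin from background
MIN_WARM_PIXELS = 6       # assumed warm-blob size indicating a hand near the mouth
COOLDOWN_S = 2.0          # assumed time the high-energy pipeline stays active per trigger


def read_thermal_frame() -> np.ndarray:
    """Placeholder for one frame from the thermal sensor (simulated here)."""
    return np.random.uniform(22.0, 34.0, THERMAL_SHAPE)


def hand_to_mouth_suspected(frame: np.ndarray) -> bool:
    """Simple heuristic: enough warm pixels in the mouth-facing half of the frame."""
    upper_half = frame[: THERMAL_SHAPE[0] // 2]
    return int((upper_half > HAND_TEMP_C).sum()) >= MIN_WARM_PIXELS


def run_high_energy_task() -> None:
    """Stand-in for Trigger RGB / Trigger ML: capture RGB frames or run inference."""
    print("Trigger fired: RGB capture / on-device inference active")


def activation_loop(poll_hz: float = 4.0, max_iters: int = 100) -> None:
    """Poll the cheap sensor; activate the expensive pipeline only on suspected gestures."""
    for _ in range(max_iters):
        frame = read_thermal_frame()
        if hand_to_mouth_suspected(frame):
            run_high_energy_task()
            time.sleep(COOLDOWN_S)      # keep the pipeline up briefly, then re-arm
        else:
            time.sleep(1.0 / poll_hz)   # stay in the low-power polling state


if __name__ == "__main__":
    activation_loop()
```

The energy saving comes from the asymmetry between the two states: the polling branch touches only the thermal array, while the RGB/ML branch runs only for brief cooldown windows around suspected gestures, which is consistent with the battery-lifetime gains reported in the abstract.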
Keywords
on-device feeding gesture detection algorithm, real-time hand-to-mouth gesture detection, wearable RGB-T, wearable camera, energy-intensive tasks, RGB images, wearable device, low-power thermal sensor array, real-time activation algorithm, high-energy tasks, RGB camera, Trigger RGB mode, on-device machine learning model, Trigger ML mode, golf-ball-sized wearable device, 18 hours of data