Data-driven haptic perception for robot-assisted dressing

2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)

Abstract
Dressing is an important activity of daily living (ADL) with which many people require assistance due to impairments. Robots have the potential to provide dressing assistance, but physical interactions between clothing and the human body can be complex and difficult to visually observe. We provide evidence that data-driven haptic perception can be used to infer relationships between clothing and the human body during robot-assisted dressing. We conducted a carefully controlled experiment with 12 human participants during which a robot pulled a hospital gown along the length of each person's forearm 30 times. This representative task resulted in one of the following three outcomes: the hand missed the opening to the sleeve; the hand or forearm became caught on the sleeve; or the full forearm successfully entered the sleeve. We found that hidden Markov models (HMMs) using only forces measured at the robot's end effector classified these outcomes with high accuracy. The HMMs' performance generalized well to participants (98.61% accuracy) and velocities (98.61% accuracy) outside of the training data. They also performed well when we limited the force applied by the robot (95.8% accuracy with a 2 N threshold), and could predict the outcome early in the process. Despite the lightweight hospital gown, HMMs that used forces in the direction of gravity substantially outperformed those that did not. The best-performing HMMs used forces in the direction of motion and the direction of gravity.
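
The abstract describes a per-outcome classification scheme: train one HMM on the end-effector force sequences from each outcome class, then label a new trial by whichever model assigns the highest likelihood. Below is a minimal sketch of that scheme, not the authors' code: the library (hmmlearn), the hidden-state count, and the data shapes are assumptions, while the choice of force components (direction of motion and direction of gravity) follows the abstract's finding about the best-performing HMMs.

```python
# Sketch (assumptions labeled): one Gaussian HMM per dressing outcome,
# classification by maximum log-likelihood over end-effector force sequences.
import numpy as np
from hmmlearn import hmm  # assumed library choice, not from the paper

OUTCOMES = ["missed", "caught", "success"]  # the three outcomes in the abstract

def train_outcome_hmms(sequences_by_outcome, n_states=10):
    """sequences_by_outcome: dict mapping outcome -> list of (T_i, 2) arrays,
    each row holding the force along the motion axis and the gravity axis.
    n_states is a hypothetical hyperparameter, not reported in the abstract."""
    models = {}
    for outcome, seqs in sequences_by_outcome.items():
        X = np.concatenate(seqs)          # stack all trials for this outcome
        lengths = [len(s) for s in seqs]  # per-trial lengths for hmmlearn
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="full")
        m.fit(X, lengths)
        models[outcome] = m
    return models

def classify(models, seq):
    """Label a force sequence with the outcome whose HMM scores it highest."""
    return max(models, key=lambda outcome: models[outcome].score(seq))
```

Because `score()` accepts any prefix of a trial, the same trained models also support the early outcome prediction the abstract mentions: evaluate the partial force trace recorded so far under each model and report the currently most likely label.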
Keywords
data-driven haptic perception, robot-assisted dressing, activity of daily living, hidden Markov models, HMMs, end effector