Graph Neural Networks for Interpretable Tactile Sensing

2022 27th International Conference on Automation and Computing (ICAC), 2022

Cited by 2 | Views 19
Abstract
Fine-grained tactile perception of objects is important for robots exploring unstructured environments. Recent years have seen the success of Convolutional Neural Network (CNN)-based methods for tactile perception using high-resolution optical tactile sensors. However, CNN-based approaches may not be efficient for processing tactile image data and offer limited interpretability. To this end, we propose a Graph Neural Network (GNN)-based approach for tactile recognition using a soft biomimetic optical tactile sensor. The captured tactile images are transformed into graphs, and a GNN is used to analyse the implicit tactile information within these graphs. The experimental results indicate that with the proposed GNN-based method, the maximum tactile recognition accuracy reaches 99.53%. In addition, Gradient-weighted Class Activation Mapping (Grad-CAM) and Unsigned Grad-CAM (UGrad-CAM) are used to generate visual explanations of the models. Compared to traditional CNNs, we demonstrate that the features produced by the GNN-based model are more intuitive and interpretable.
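The abstract does not specify how tactile images are converted to graphs, but a common approach for optical tactile sensors is to treat marker positions as graph nodes connected to their nearest neighbours, followed by standard graph-convolution layers. The sketch below illustrates this idea with NumPy only; the function names, the k-nearest-neighbour construction, and the feature choices are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def image_to_graph(marker_xy, features, k=3):
    """Hypothetical graph construction: nodes are tactile markers,
    edges connect each marker to its k nearest neighbours."""
    n = marker_xy.shape[0]
    # pairwise Euclidean distances between marker positions
    d = np.linalg.norm(marker_xy[:, None, :] - marker_xy[None, :, :], axis=-1)
    adj = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d[i])[1:k + 1]  # skip index 0 (the node itself)
        adj[i, nbrs] = 1.0
        adj[nbrs, i] = 1.0                # symmetrise the adjacency matrix
    return adj, features

def gcn_layer(adj, x, w):
    """One graph-convolution layer (Kipf-style symmetric normalisation
    with self-loops), followed by a ReLU nonlinearity."""
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ a_hat @ d_inv_sqrt @ x @ w, 0.0)

rng = np.random.default_rng(0)
xy = rng.random((16, 2))        # 16 hypothetical marker positions
feats = rng.random((16, 4))     # e.g. marker displacement features
adj, x = image_to_graph(xy, feats, k=3)
w = rng.standard_normal((4, 8))
h = gcn_layer(adj, x, w)        # per-node embeddings for classification
```

In a full model, several such layers would be stacked and pooled into a graph-level embedding before the classification head; Grad-CAM-style explanations can then be computed over per-node activations rather than image pixels.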
Keywords
Tactile Sensor, Object Recognition, Graph Convolutional Network, Explainability