TactileSGNet: A Spiking Graph Neural Network for Event-based Tactile Object Recognition

IROS 2020

Abstract
Tactile perception is crucial for a variety of robot tasks, including grasping and in-hand manipulation. New advances in flexible, event-driven electronic skins may soon endow robots with touch perception capabilities similar to those of humans. These electronic skins respond asynchronously to changes (e.g., in pressure or temperature) and can be laid out irregularly on the robot's body or end-effector. However, these unique features may render current deep learning approaches, such as convolutional feature extractors, unsuitable for tactile learning. In this paper, we propose a novel spiking graph neural network for event-based tactile object recognition. To make use of the local connectivity of taxels, we present several methods for organizing the tactile data in a graph structure. Based on the constructed graphs, we develop a spiking graph convolutional network. The event-driven nature of spiking neural networks makes them arguably more suitable for processing event-based data. Experimental results on two tactile datasets show that the proposed method outperforms other state-of-the-art spiking methods, achieving high accuracies of approximately 90% when classifying a variety of household objects.
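The pipeline the abstract describes — irregularly placed taxels connected into a graph, then processed by a spiking graph convolution — can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the k-nearest-neighbour graph construction is only one of the several graph-organization methods the paper mentions, and the leaky integrate-and-fire (LIF) dynamics and mean-aggregation rule here are assumed simplifications.

```python
import numpy as np

def knn_adjacency(coords, k=3):
    """Connect each taxel to its k nearest neighbours (symmetric, no self-loops).
    coords: (n, 2) array of taxel positions on the skin."""
    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)           # exclude self from neighbour search
    A = np.zeros((n, n))
    for i in range(n):
        A[i, np.argsort(d[i])[:k]] = 1.0
    return np.maximum(A, A.T)             # symmetrize: undirected graph

def lif_graph_conv_step(spikes_in, v, A, W, tau=2.0, v_th=1.0):
    """One LIF time step with graph-based input aggregation (illustrative).
    spikes_in: (n, f) binary input spikes per taxel
    v:         (n, h) membrane potentials
    A:         (n, n) adjacency; W: (f, h) synaptic weights."""
    deg = A.sum(axis=1, keepdims=True) + 1.0
    agg = (A @ spikes_in + spikes_in) / deg   # mean over self + neighbours
    v = v / tau + agg @ W                     # leak, then integrate synaptic input
    spikes_out = (v >= v_th).astype(float)    # fire where threshold is crossed
    v = v * (1.0 - spikes_out)                # hard reset for fired neurons
    return spikes_out, v
```

In use, one would build the adjacency once from the taxel layout and then unroll `lif_graph_conv_step` over the event stream's time bins, feeding each layer's output spikes to the next.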
Keywords
event-based tactile object recognition, tactile perception, robot tasks, flexible event-driven electronic skins, touch perception capabilities, deep learning approaches, convolutional feature extractors, tactile learning, spiking graph neural network, tactile data, graph structure, spiking graph convolutional network, event-driven data, tactile datasets, state-of-the-art spiking methods