An End-to-End Spiking Neural Network Platform for Edge Robotics: From Event-Cameras to Central Pattern Generation

IEEE Transactions on Cognitive and Developmental Systems (2022)

Abstract
Learning to adapt one's gait to environmental changes plays an essential role in the locomotion of legged robots, yet it remains challenging under the constrained computing resources and energy budgets of edge robots. Recent advances in bio-inspired vision with dynamic vision sensors (DVSs) and the associated neuromorphic processing offer promising solutions for end-to-end sensing, cognition, and control. However, bio-mimetic closed-loop robotic systems that combine event-based visual sensing with actuation in the form of spiking neural networks (SNNs) have not been well explored. In this work, we program the weights of a bio-mimetic multigait central pattern generator (CPG) and couple it with DVS-based visual processing to demonstrate a spike-only closed-loop robotic system for a prey-tracking scenario. We first propose a supervised learning rule based on stochastic weight updates to train a multigait spiking CPG (SCPG) for hexapod robot locomotion. We then drive the SCPG to transition seamlessly between gaits for a nearest-prey tracking task by incorporating SNN-based visual processing of the event data generated by the DVS. This demonstrates, for the first time, the natural coupling of event-data flow from an event camera through an SNN to neuromorphic locomotion. We thus exploit the bio-mimetic dynamics and energy advantages of spike-based processing for autonomous edge robotics.
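The abstract names a supervised learning rule based on stochastic weight updates for the SCPG but does not spell it out. The sketch below is only an illustration of the general idea: random weight perturbations to a small recurrently coupled leaky integrate-and-fire network are kept when they reduce a supervised gait-phase error toward a tripod pattern. The network size, neuron model, loss, and hyperparameters (`N`, `TAU`, `V_TH`, `SIGMA`, `simulate_scpg`, `gait_error`) are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 6        # one oscillator neuron per hexapod leg (illustrative size)
T = 200      # simulation steps per trial
TAU = 0.9    # membrane leak factor (assumed)
V_TH = 1.0   # firing threshold (assumed)

def simulate_scpg(W, drive=0.3):
    """Run a recurrently coupled leaky integrate-and-fire network and
    return its spike raster of shape (T, N)."""
    v = np.zeros(N)
    s_prev = np.zeros(N)
    spikes = np.zeros((T, N))
    for t in range(T):
        v = TAU * v + W @ s_prev + drive   # leak + recurrent input + tonic drive
        s = (v >= V_TH).astype(float)
        v[s > 0] = 0.0                     # reset membrane after a spike
        spikes[t] = s
        s_prev = s
    return spikes

def gait_error(spikes, target_phase):
    """Squared error between observed and target per-leg firing phases
    (a crude stand-in for a gait-matching loss)."""
    t_idx = np.arange(T)[:, None]
    counts = spikes.sum(axis=0) + 1e-9
    mean_t = (spikes * t_idx).sum(axis=0) / counts   # mean firing time per leg
    phase = ((mean_t - mean_t[0]) % T) / T           # phase relative to leg 0
    return float(np.mean((phase - target_phase) ** 2))

# tripod gait target: two alternating leg groups half a cycle apart
target_phase = np.array([0.0, 0.5, 0.0, 0.5, 0.0, 0.5])

W = rng.normal(0.0, 0.1, size=(N, N))
np.fill_diagonal(W, 0.0)                   # no self-connections
best_err = gait_error(simulate_scpg(W), target_phase)

SIGMA = 0.05                               # perturbation scale (assumed)
for trial in range(500):
    # stochastic weight update: sample a random perturbation and keep it
    # only if it lowers the supervised gait error
    dW = rng.normal(0.0, SIGMA, size=(N, N))
    np.fill_diagonal(dW, 0.0)
    err = gait_error(simulate_scpg(W + dW), target_phase)
    if err < best_err:
        W, best_err = W + dW, err

print(f"gait-phase error after training: {best_err:.4f}")
```

In the full system described by the abstract, such an SCPG would be driven by DVS event streams processed through an SNN so that the robot switches among the learned gaits while tracking the nearest prey.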
Keywords
Central pattern generation, dynamic vision sensor (DVS) cameras, edge intelligence, hexapods, spiking neural networks (SNNs)