Robot Rotation Estimation Using Spherical Moments in Neural Networks

2022 IEEE Applied Imagery Pattern Recognition Workshop (AIPR), 2022

Abstract
This work proposes a method to estimate mobile robot rotation using spherical moments and machine learning. A polynomial fisheye camera model is used to generate each image, simulating a flying camera that follows a pre-planned trajectory. A dataset of 2400 images is created and labeled using Blender. Before the images are fed into the neural network, they are processed with 20 masks to reduce errors caused by non-common parts of the image edges. Afterwards, the spherical moments are computed, along with the triplets and the basis. The basis is then converted into a quaternion that serves as the network input, since a smaller number of inputs leads to faster training and higher accuracy. For the training phase, a data augmentation strategy is applied: the dataset is divided into fragments (folders) of adjustable size, and within each folder a sliding window expands the set of image-pair combinations from which the relative rotation between two images is computed. Finally, the model is tested on synthesized fisheye camera data from Blender. The simulated environment contains varying light and weather conditions, such as fog and poor illumination. The proposed method is compared with feature-based methods such as FAST, ORB, and SURF, which fail to extract feature points reliably under these conditions, to demonstrate the performance of this novel approach. The results show that the proposed method can still accurately estimate the rotation.
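The abstract mentions two concrete steps that can be illustrated: converting the estimated basis (an orthonormal 3x3 rotation matrix) into a quaternion used as the network input, and enumerating image pairs inside each folder with a sliding window for data augmentation. The sketch below is not the authors' code; it is a minimal illustration under the assumption that the basis is available as a rotation matrix and that the window width and folder size are free parameters. Function names (`basis_to_quaternion`, `sliding_window_pairs`) are hypothetical.

```python
import numpy as np


def basis_to_quaternion(R):
    """Convert a 3x3 orthonormal basis (rotation matrix) into a unit
    quaternion (w, x, y, z), reducing 9 matrix entries to 4 inputs."""
    tr = np.trace(R)
    if tr > 0.0:
        s = 2.0 * np.sqrt(tr + 1.0)
        w = 0.25 * s
        x = (R[2, 1] - R[1, 2]) / s
        y = (R[0, 2] - R[2, 0]) / s
        z = (R[1, 0] - R[0, 1]) / s
    elif R[0, 0] > R[1, 1] and R[0, 0] > R[2, 2]:
        s = 2.0 * np.sqrt(1.0 + R[0, 0] - R[1, 1] - R[2, 2])
        w = (R[2, 1] - R[1, 2]) / s
        x = 0.25 * s
        y = (R[0, 1] + R[1, 0]) / s
        z = (R[0, 2] + R[2, 0]) / s
    elif R[1, 1] > R[2, 2]:
        s = 2.0 * np.sqrt(1.0 + R[1, 1] - R[0, 0] - R[2, 2])
        w = (R[0, 2] - R[2, 0]) / s
        x = (R[0, 1] + R[1, 0]) / s
        y = 0.25 * s
        z = (R[1, 2] + R[2, 1]) / s
    else:
        s = 2.0 * np.sqrt(1.0 + R[2, 2] - R[0, 0] - R[1, 1])
        w = (R[1, 0] - R[0, 1]) / s
        x = (R[0, 2] + R[2, 0]) / s
        y = (R[1, 2] + R[2, 1]) / s
        z = 0.25 * s
    q = np.array([w, x, y, z])
    return q / np.linalg.norm(q)


def sliding_window_pairs(n_images, window):
    """Enumerate (i, j) index pairs inside one folder using a sliding
    window of the given width; each pair yields one relative-rotation
    training sample."""
    pairs = []
    for i in range(n_images):
        for j in range(i + 1, min(i + 1 + window, n_images)):
            pairs.append((i, j))
    return pairs


if __name__ == "__main__":
    # The identity basis maps to the identity quaternion (1, 0, 0, 0).
    print(basis_to_quaternion(np.eye(3)))
    # 6 images per folder with a window of 3 -> (0,1), (0,2), (0,3), (1,2), ...
    print(sliding_window_pairs(6, 3))
```

A wider window produces more training pairs per folder but also larger relative rotations between the paired images; the paper's adjustable folder size suggests this trade-off is tuned empirically.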