A Novel Small Target Detection Strategy: Location Feature Extraction in the Case of Self-Knowledge Distillation

APPLIED SCIENCES-BASEL (2023)

Abstract
Small target detection has long been a challenging problem in the field of target detection. Existing detection networks perform well on conventional targets but poorly on small targets. The main difficulty is that small targets occupy few pixels and are widely scattered across the image, so effective features are hard to extract, especially in the deeper layers of a neural network. This work proposes a novel plug-in that extracts location features of small targets in the deep network. Because the deep layers have larger receptive fields and richer global information, it is easier to establish a global spatial context mapping there. The plug-in, named location feature extraction, builds this spatial context mapping in the deep network to capture the global information of scattered small targets in the deep feature map. In addition, an attention mechanism is used to strengthen attention to spatial information. The combination of these two components realizes location feature extraction in the deep network. To improve the generalization of the network, a new self-distillation algorithm was designed for pre-training that can work under self-supervision. Experiments were conducted on public datasets (Pascal VOC and the Printed Circuit Board Defect dataset) and on a self-made dedicated small target detection dataset. According to the diagnosis of the false-positive error distribution, the location error was significantly reduced, which demonstrates the effectiveness of the proposed location feature extraction plug-in. The mAP results show that networks applying the location feature extraction strategy perform much better than the original networks.
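The abstract does not include implementation details, so as a rough illustration only, the following is a minimal PyTorch-style sketch of a spatial-attention plug-in applied to a deep feature map, in the spirit of the described location feature extraction. The module name, the CBAM-style channel-wise mean/max pooling, the 7x7 convolution, and all shapes are assumptions made for illustration, not the authors' published architecture.

```python
# Minimal sketch of a spatial-attention style "location feature extraction"
# plug-in for deep feature maps (assumed design, not the paper's exact module).
import torch
import torch.nn as nn


class LocationFeatureExtraction(nn.Module):
    """Re-weights a deep feature map with a global spatial attention mask."""

    def __init__(self, kernel_size: int = 7):
        super().__init__()
        # Two input channels: channel-wise average map and channel-wise max map.
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) deep feature map from the backbone.
        avg_map = torch.mean(x, dim=1, keepdim=True)    # (N, 1, H, W)
        max_map, _ = torch.max(x, dim=1, keepdim=True)  # (N, 1, H, W)
        spatial = torch.cat([avg_map, max_map], dim=1)  # (N, 2, H, W)
        mask = self.sigmoid(self.conv(spatial))         # spatial attention mask
        # Emphasize locations likely to contain scattered small targets.
        return x * mask


if __name__ == "__main__":
    plug_in = LocationFeatureExtraction()
    deep_features = torch.randn(2, 256, 20, 20)  # e.g. a stride-32 feature map
    refined = plug_in(deep_features)
    print(refined.shape)                          # torch.Size([2, 256, 20, 20])
```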
Keywords
location feature extraction, small target detection, self-knowledge distillation, attention mechanism