Equalization Loss v2: A New Gradient Balance Approach for Long-tailed Object Detection

2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021)

Cited 125 | Viewed 76
Abstract
Recently proposed decoupled training methods have emerged as a dominant paradigm for long-tailed object detection. However, they require an extra fine-tuning stage, and the disjointed optimization of representation and classifier might lead to suboptimal results. Meanwhile, end-to-end training methods, like equalization loss (EQL), still perform worse than decoupled training methods. In this paper, we reveal that the main issue in long-tailed object detection is the unbalanced gradients between positives and negatives, and find that EQL does not solve it well. To address the problem of unbalanced gradients, we introduce a new version of equalization loss, called equalization loss v2 (EQL v2), a novel gradient-guided reweighing mechanism that rebalances the training process for each category independently and equally. Extensive experiments are performed on the challenging LVIS benchmark. EQL v2 outperforms the original EQL by about 4 points of overall AP, with 14-18 points of improvement on the rare categories. More importantly, it also surpasses decoupled training methods. Without further tuning for the Open Images dataset, EQL v2 improves EQL by 7.3 points AP, showing strong generalization ability. Code has been released at https://github.com/tztztztztz/eqlv2
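The abstract describes the mechanism only at a high level, so the following is a minimal PyTorch-style sketch of what a per-category, gradient-guided reweighing scheme of this kind can look like. The class name `GradientGuidedReweighter` and the hyper-parameters `gamma`, `mu`, and `alpha` are illustrative assumptions rather than the paper's definitions; the authors' actual implementation is in the repository linked above.

```python
import torch

class GradientGuidedReweighter:
    """Sketch of a per-category gradient-balancing reweighter.

    For each category it accumulates the magnitudes of positive and
    negative gradients flowing through the classification logits and
    uses their ratio to scale subsequent loss terms: categories whose
    positive gradients are overwhelmed by negatives (typically rare
    classes) get positives up-weighted and negatives down-weighted.
    """

    def __init__(self, num_classes, gamma=12.0, mu=0.8, alpha=4.0):
        # gamma, mu, alpha parameterize an assumed sigmoid-shaped
        # mapping from gradient ratio to weight (illustrative values).
        self.gamma, self.mu, self.alpha = gamma, mu, alpha
        self.pos_grad = torch.zeros(num_classes)
        self.neg_grad = torch.zeros(num_classes)

    def weights(self):
        # Accumulated positive-to-negative gradient ratio per category.
        ratio = self.pos_grad / self.neg_grad.clamp(min=1e-12)
        # Map the ratio to (0, 1); well-balanced categories map near 1.
        f = torch.sigmoid(self.gamma * (ratio - self.mu))
        w_neg = f                              # suppress negatives for starved classes
        w_pos = 1.0 + self.alpha * (1.0 - f)   # boost positives for starved classes
        return w_pos, w_neg

    def update(self, logits, targets):
        # logits:  (N, C) pre-sigmoid classification scores
        # targets: (N, C) binary one-vs-rest labels as floats
        probs = torch.sigmoid(logits).detach()
        grad = torch.abs(probs - targets)      # |d BCE / d logit|
        self.pos_grad += (grad * targets).sum(dim=0)
        self.neg_grad += (grad * (1.0 - targets)).sum(dim=0)
```

In a detector's classification head, one would call `update()` on each batch's logits and one-vs-rest targets, then multiply the per-category positive and negative binary cross-entropy terms by the weights returned from `weights()` so that each category's training signal is rebalanced independently.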
Keywords
object detection, new gradient balance approach, long-tailed