Self-Paced Relational Contrastive Hashing for Large-Scale Image Retrieval

IEEE TRANSACTIONS ON MULTIMEDIA (2024)

Abstract
Supervised deep hashing aims to learn hash functions using label information. Existing methods learn hash functions by employing either a pairwise/triplet loss to explore the point-to-point relation or a center loss to explore the point-to-class relation. However, these methods overlook both the collaboration between the two kinds of relations and the hardness of pairs. In this work, we propose a novel Self-Paced Relational Contrastive Hashing (SPRCH) method with a single learning objective that captures valuable discriminative information from hard pairs using both the point-to-point and point-to-class relations. To exploit these two kinds of relations, we propose the Relational Contrastive Hash (RCH) loss, which ensures that, in the Hamming space, each anchor is closer to all similar data points and the corresponding class centers than to dissimilar ones. Moreover, the RCH loss reduces the drastic imbalance between point-to-point pairs and point-to-class pairs by rebalancing their weights. To prioritize hard pairs, a self-paced learning schedule is proposed that assigns higher weights to these pairs in the RCH loss; the schedule weights pairs dynamically according to their similarities and the training progress. In this way, the deep hashing model can initially learn universal patterns from the entire set of pairs and then gradually acquire more valuable discriminative information from hard pairs. Experimental results on four widely used image retrieval datasets demonstrate that the proposed SPRCH method significantly outperforms state-of-the-art supervised deep hashing methods.
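
The abstract's description of the RCH loss and the self-paced schedule can be sketched in code. The snippet below is a minimal illustration assuming a PyTorch setup; the helper names (rch_loss, self_paced_weight), the specific weighting formulas, the temperature tau, and the training-progress variable age are assumptions for exposition only and do not reproduce the paper's actual implementation.

# Illustrative sketch only: the helper names, weighting formulas, and
# hyper-parameters below are assumptions for exposition, not the authors' code.
import torch
import torch.nn.functional as F


def self_paced_weight(sim, age, is_positive):
    # Dynamic pair weight from similarity and training progress `age` in [0, 1]:
    # near-uniform early on, increasingly emphasizing hard pairs later
    # (low-similarity positives, high-similarity negatives).
    hardness = (1.0 - sim) if is_positive else (1.0 + sim)   # in [0, 2]
    return 1.0 + age * hardness


def rch_loss(codes, labels, centers, age, tau=0.3):
    # codes:   (B, K) relaxed hash codes (e.g. tanh outputs of the hash network)
    # labels:  (B,)   integer class labels
    # centers: (C, K) learnable class-center codes
    # age:     training progress in [0, 1] for the self-paced schedule
    codes = F.normalize(codes, dim=1)
    centers = F.normalize(centers, dim=1)

    sim_pp = codes @ codes.t()       # point-to-point similarities (B, B)
    sim_pc = codes @ centers.t()     # point-to-class similarities (B, C)

    pos_pp = labels.unsqueeze(0) == labels.unsqueeze(1)   # same-class pairs
    pos_pp.fill_diagonal_(False)                          # drop self-pairs
    pos_pc = F.one_hot(labels, centers.size(0)).bool()    # own class center

    # Rebalance: each anchor sees many point-to-point pairs but only C centers,
    # so the point-to-class term is up-weighted by the positive-pair count.
    rebalance = pos_pp.sum(1).clamp(min=1).float()

    loss = 0.0
    for sim, pos, w_rel in ((sim_pp, pos_pp, 1.0), (sim_pc, pos_pc, rebalance)):
        w_pos = self_paced_weight(sim, age, True)
        w_neg = self_paced_weight(sim, age, False)
        logits = sim / tau
        # Hard negatives (high similarity) are up-weighted in the denominator.
        neg_w = torch.where(pos, torch.ones_like(w_neg), w_neg)
        denom = (torch.exp(logits) * neg_w).sum(1, keepdim=True)
        log_prob = logits - torch.log(denom + 1e-12)
        # Hard positives (low similarity) are up-weighted in the numerator.
        pos_term = (w_pos * log_prob * pos).sum(1) / pos.sum(1).clamp(min=1)
        loss = loss - (w_rel * pos_term).mean()
    return loss / 2

In such a sketch, age could simply be scheduled from 0 to 1 over training (e.g. epoch / num_epochs), so early epochs weight all pairs almost uniformly while later epochs concentrate the gradient on hard pairs, mirroring the self-paced schedule described in the abstract.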
Keywords
Contrastive learning, deep hashing, image retrieval, self-paced learning