Power Normalizing Second-order Similarity Network for Few-shot Learning

2019 IEEE Winter Conference on Applications of Computer Vision (WACV)

Abstract
Second- and higher-order statistics of data points have played an important role in advancing the state of the art on several computer vision problems, such as fine-grained image and scene recognition. However, these statistics need to be passed through an appropriate pooling scheme to obtain the best performance. Power Normalizations are non-linear activation units which enjoy probability-inspired derivations and can be applied in CNNs. In this paper, we propose a similarity learning network leveraging second-order information and Power Normalizations. To this end, we propose several formulations capturing second-order statistics and derive a sigmoid-like Power Normalizing function to demonstrate its interpretability. Our model is trained end-to-end to learn the similarity between the support set and query images for the problem of one- and few-shot learning. Evaluations on the Omniglot, miniImagenet and Open MIC datasets demonstrate that our network obtains state-of-the-art results on several few-shot learning protocols.
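As a rough illustration of the two ingredients named in the abstract (not the authors' exact formulation), the sketch below builds a second-order (autocorrelation) descriptor from a set of local CNN features and squashes it with a sigmoid-like element-wise Power Normalization. The spatial sizes, the hyper-parameter eta, and the final inner-product comparison are illustrative assumptions; in the paper the similarity between support and query representations is learned end-to-end by a relation network.

```python
import numpy as np

def second_order_pool(features):
    """Second-order (autocorrelation) pooling of local descriptors.

    features: array of shape (N, d), i.e. N local feature vectors of
    dimension d (e.g. a flattened conv feature map).
    Returns a d x d matrix of second-order statistics.
    """
    return features.T @ features / features.shape[0]

def sigmoid_power_norm(M, eta=10.0):
    """Sigmoid-like Power Normalization: a zero-centered sigmoid applied
    element-wise, squashing large co-occurrence values toward +/-1.
    (eta is an illustrative steepness parameter, not the paper's value.)"""
    return 2.0 / (1.0 + np.exp(-eta * M)) - 1.0

# Example: compare a support-set representation with a query image.
rng = np.random.default_rng(0)
support_feats = rng.standard_normal((49, 64))  # e.g. 7x7 conv map, 64 channels
query_feats = rng.standard_normal((49, 64))

S = sigmoid_power_norm(second_order_pool(support_feats))
Q = sigmoid_power_norm(second_order_pool(query_feats))

# The pair (S, Q) would be fed to a learned relation/similarity network;
# a simple inner product is used here only as a placeholder score.
print(float(np.sum(S * Q)))
```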
Keywords
few-shot learning protocols,Power Normalizing second-order similarity network,computer vision problems,fine-grained image,scene recognition,Power Normalizations,nonlinear activation units,probability-inspired derivations,query images,pooling scheme,sigmoid-like power normalizing function,Omniglot datasets,miniImagenet datasets,Open MIC datasets