Unsupervised Feature Learning via Non-Parametric Instance Discrimination

IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2018)

Abstract
Neural net classifiers trained on data with annotated class labels can also capture apparent visual similarity among categories without being directed to do so. We study whether this observation can be extended beyond the conventional domain of supervised learning: Can we learn a good feature representation that captures apparent similarity among instances, instead of classes, by merely asking the feature to be discriminative of individual instances? We formulate this intuition as a non-parametric classification problem at the instance level, and use noise contrastive estimation to tackle the computational challenges imposed by the large number of instance classes. Our experimental results demonstrate that, under unsupervised learning settings, our method surpasses the state-of-the-art on ImageNet classification by a large margin. Our method is also remarkable for consistently improving test performance with more training data and better network architectures. By fine-tuning the learned feature, we further obtain competitive results for semi-supervised learning and object detection tasks. Our non-parametric model is highly compact: with 128 features per image, our method requires only 600MB storage for a million images, enabling fast nearest neighbour retrieval at run time.
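To make the instance-level formulation concrete, below is a minimal NumPy sketch of the two ingredients named in the abstract: a non-parametric softmax over per-instance embeddings stored in a memory bank, and cosine-similarity nearest-neighbour retrieval against that bank. All names, sizes, and the temperature value are illustrative assumptions, not the paper's exact implementation, and the full NCE-based training loop is omitted.

```python
import numpy as np

# Hypothetical sizes; the paper stores one 128-d embedding per training image.
NUM_IMAGES, DIM, TEMPERATURE = 10_000, 128, 0.07

rng = np.random.default_rng(0)

def l2_normalize(x, axis=-1, eps=1e-12):
    """Project features onto the unit sphere."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

# Memory bank: one unit-norm 128-d vector per training instance
# (randomly initialised here in place of learned features).
memory_bank = l2_normalize(rng.normal(size=(NUM_IMAGES, DIM)).astype(np.float32))

def instance_probs(feature, bank, tau=TEMPERATURE):
    """Non-parametric softmax over all instances:
    P(i | v) = exp(v_i^T v / tau) / sum_j exp(v_j^T v / tau)."""
    logits = bank @ l2_normalize(feature) / tau
    logits -= logits.max()            # numerical stability
    p = np.exp(logits)
    return p / p.sum()

def nearest_neighbours(query, bank, k=5):
    """Cosine-similarity retrieval against the compact memory bank."""
    sims = bank @ l2_normalize(query)
    return np.argsort(-sims)[:k]

# Toy usage: score a (random) query against every instance and retrieve neighbours.
query = rng.normal(size=DIM).astype(np.float32)
probs = instance_probs(query, memory_bank)
print("most likely instance:", probs.argmax(),
      "nearest neighbours:", nearest_neighbours(query, memory_bank))
```

As a rough storage check, 128 float32 features take 512 bytes per image, so a memory bank for a million images occupies on the order of 500 MB, consistent with the ~600MB figure quoted in the abstract.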
Keywords
unsupervised feature learning, neural net classifiers, supervised learning, non-parametric classification problem, noise-contrastive estimation, ImageNet classification, semi-supervised learning, feature representation, object detection, nearest neighbour retrieval method