Self-Adaptive Embedding For Few-Shot Classification By Hierarchical Attention
2020 IEEE International Conference on Multimedia and Expo (ICME)
Abstract
Few-shot classification aims to learn a model that generalizes well to new classes, unseen in the training phase, from only a small number of labeled instances. Many existing approaches learn a shared embedding function across tasks to measure the similarities between support (train) and query (test) samples. However, the embeddings generated by these approaches fail to account for the feature importance of individual instances and the feature correlation between support and query samples within each task. To tackle this problem, we propose a novel Self-Adaptive Embedding approach (SAE) that introduces a hierarchical attention scheme. The novelty of SAE is twofold. First, SAE effectively captures the most discriminative features at the instance level, which significantly improves performance on downstream classification tasks. Second, SAE adaptively adjusts the representations of support and query samples by considering the feature structures they share at the task level. Experiments demonstrate that SAE significantly outperforms existing state-of-the-art methods.
Keywords
Few-shot learning, hierarchical attention, self-adaptive embedding
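The abstract gives no implementation details, so the following is only a rough NumPy sketch of the two attention levels it describes: a per-instance feature-wise attention that reweights each sample's own feature dimensions, and a task-level attention that derives shared feature weights from the pooled support and query set. The function names, the learned matrix `W`, and the weighting scheme are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def instance_attention(feats, W):
    """Instance-level attention (illustrative): each sample attends over
    its own feature dimensions to emphasize discriminative features.
    feats: (n, d) features; W: (d, d) assumed learned projection."""
    alpha = softmax(feats @ W, axis=1)       # (n, d) per-instance weights
    return feats * alpha * feats.shape[1]    # rescale to keep magnitudes comparable

def task_attention(support, query):
    """Task-level attention (illustrative): feature weights are derived
    from the pooled task context shared by support and query samples,
    then applied to both sets."""
    context = np.concatenate([support, query], axis=0).mean(axis=0)  # (d,)
    beta = softmax(context)                   # shared feature weights
    return support * beta, query * beta
```

In this sketch the instance level acts before the task level, so each sample's features are first sharpened individually and then jointly rescaled by the task context; whether SAE composes the two stages this way is an assumption.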