
CSTS: Exploring Class-Specific and Task-Shared Embedding Representation for Few-Shot Learning

IEEE Transactions on Neural Networks and Learning Systems (2024)

Abstract
Few-shot learning (FSL) is a challenging yet promising technique that aims to discriminate objects based on a few labeled examples. Learning a high-quality feature representation is key with few-shot data, and many existing models attempt to extract general information at the sample or task level. However, the common sample-level feature representation limits a model's generalizability across tasks, while task-level representation may lose class characteristics through excessive information aggregation. In this article, we synchronize class-specific and task-shared information from the class and task levels to obtain a better representation. Structure-guided contrastive learning is introduced to obtain class-specific representations by increasing the interclass distance. A hierarchical class structure is constructed by clustering semantically similar classes using the idea of granular computing. When guided by this class structure, samples from different classes with similar characteristics are harder to distinguish than those with large interclass differences; structure-guided contrastive learning is therefore introduced to capture class-specific information. A hierarchical graph neural network is established to transfer task-shared information from coarse to fine: it hierarchically infers the target sample based on all samples in the task and yields a more general representation for FSL classification. Experiments on four benchmark datasets demonstrate the advantages of our model over several state-of-the-art models.
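To illustrate the structure-guided idea described above, the following minimal sketch (not the authors' implementation; the weighting scheme, function name, and `coarse_of` mapping are assumptions) computes an InfoNCE-style contrastive loss in which negatives whose classes fall in the same coarse cluster of the hierarchical class structure are up-weighted, pushing semantically similar classes further apart:

```python
import numpy as np

def structure_guided_contrastive(anchor, feats, labels, coarse_of,
                                 anchor_label, tau=0.1, hard_w=2.0):
    """InfoNCE-style loss for one anchor embedding (illustrative sketch).

    anchor: (d,) embedding; feats: (n, d) embeddings; labels: (n,) class ids;
    coarse_of: dict mapping a fine class id to its coarse cluster id.
    Negatives whose class shares the anchor's coarse cluster are up-weighted
    by `hard_w`, enlarging the distance between semantically similar classes.
    """
    labels = np.asarray(labels)
    a = anchor / np.linalg.norm(anchor)
    f = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    sims = f @ a / tau                      # temperature-scaled cosine sims

    pos = labels == anchor_label            # positives: same fine class
    # weight negatives in the anchor's coarse cluster more heavily
    w = np.array([hard_w if (coarse_of[l] == coarse_of[anchor_label]
                             and l != anchor_label) else 1.0
                  for l in labels])
    exp = np.exp(sims - sims.max())         # numerically stable softmax
    denom = (w * exp).sum()
    return float(-np.log(exp[pos] / denom).mean())
```

Increasing `hard_w` raises the loss contribution of same-coarse-cluster negatives, so minimizing it concentrates the model's capacity on separating exactly the hard-to-distinguish classes the hierarchical structure identifies.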
Keywords
Few-shot learning (FSL), granular computing, hierarchical graph (HG) neural network, structure-guided contrastive learning (SCL)