MASP: Model-Agnostic Sample Propagation for Few-shot Learning


Few-shot learning aims to train a classifier given only a few samples per class, which are highly insufficient to describe the whole data distribution. These few-shot samples not only introduce high variance into training but may also include outliers near the class boundaries. Directly feeding such samples to training algorithms can lead to unstable optimization and even incorrect gradient directions. In this paper, we improve robustness to "outliers" by learning to propagate and refine the representations of few-shot samples so that they form a more compact data distribution before being used to train a classifier. We develop a mutual calibration among the few-shot samples' representations via graph propagation, learning an attention mechanism to build the graph and determine the propagation weights. On both clean datasets and datasets containing noisy labels, we show that our sample propagation generally improves several types of existing few-shot learning methods across multiple few-shot learning settings.
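The core mechanism, attention-driven graph propagation over sample representations, can be illustrated with a minimal sketch. All names here (`propagate`, `temperature`, the choice of scaled dot-product attention) are illustrative assumptions, not the paper's actual implementation: the idea is that each embedding is replaced by an attention-weighted average of all embeddings, pulling boundary outliers toward the interior of their class.

```python
import numpy as np

def propagate(embeddings, n_steps=1, temperature=1.0):
    """Refine sample embeddings by attention-weighted graph propagation.

    Hypothetical sketch: propagation weights come from a row-wise softmax
    over scaled dot-product similarities between samples; each step mixes
    every embedding with its neighbors, compacting the distribution.
    """
    x = np.asarray(embeddings, dtype=float)
    for _ in range(n_steps):
        # Pairwise similarity scores (scaled dot-product attention).
        scores = x @ x.T / (temperature * np.sqrt(x.shape[1]))
        # Row-wise softmax -> propagation weights (each row sums to 1).
        scores -= scores.max(axis=1, keepdims=True)
        w = np.exp(scores)
        w /= w.sum(axis=1, keepdims=True)
        # Each sample becomes a convex combination of all samples.
        x = w @ x
    return x
```

Because each refined embedding is a convex combination of the originals, propagated samples stay within the coordinate-wise range of the input set, and isolated outliers are drawn toward their nearest cluster.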
Keywords: Gradient descent, Outlier, Robustness, Classifier, Pattern recognition, Computer science, Calibration, Artificial intelligence, Graph, Learning methods, Outlier removal