A Novel Knowledge Distillation Method for Self-Supervised Hyperspectral Image Classification

Remote Sensing (2022)

Abstract
Using deep learning to classify hyperspectral images (HSI) when only a few labeled samples are available is challenging. Recently, knowledge distillation methods based on soft label generation have been used to solve classification problems with a limited number of samples. Unlike hard labels, soft labels express the probability of a sample belonging to each category and are therefore more informative for classification. Existing soft label generation methods for HSI classification cannot fully exploit the information contained in unlabeled samples. To address this problem, we propose a novel self-supervised learning method with knowledge distillation for HSI classification, termed SSKD. The main motivation is to exploit more valuable information for classification by adaptively generating soft labels for unlabeled samples. First, similarity discrimination is performed between all unlabeled and labeled samples by considering both spatial distance and spectral distance. Then, an adaptive nearest neighbor matching strategy is applied to the resulting similarities. Finally, a probabilistic judgment of the category is made to generate the soft labels. Compared to the state-of-the-art method, our method improves the classification accuracy by 4.88%, 7.09% and 4.96% on three publicly available datasets, respectively.
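The abstract outlines a three-step pipeline: combined spatial-spectral similarity, nearest neighbor matching, and probabilistic category judgment. The sketch below illustrates one plausible reading of that pipeline; the function name, the spectral/spatial weighting `alpha`, and the neighbor count `k` are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def generate_soft_labels(unlabeled_feat, unlabeled_xy,
                         labeled_feat, labeled_xy, labeled_y,
                         num_classes, k=5, alpha=0.5):
    """Return an (M, num_classes) matrix of soft labels for unlabeled samples.

    unlabeled_feat : (M, B) spectral vectors of unlabeled pixels
    unlabeled_xy   : (M, 2) image coordinates of unlabeled pixels
    labeled_feat   : (N, B) spectral vectors of labeled pixels
    labeled_xy     : (N, 2) image coordinates of labeled pixels
    labeled_y      : (N,)   hard labels in [0, num_classes)
    """
    soft = np.zeros((len(unlabeled_feat), num_classes))
    for i in range(len(unlabeled_feat)):
        # Spectral distance: Euclidean distance between spectra.
        d_spec = np.linalg.norm(labeled_feat - unlabeled_feat[i], axis=1)
        # Spatial distance: Euclidean distance between pixel coordinates.
        d_spat = np.linalg.norm(labeled_xy - unlabeled_xy[i], axis=1)
        # Weighted combination of normalised spectral and spatial distances.
        d = (alpha * d_spec / (d_spec.max() + 1e-12)
             + (1 - alpha) * d_spat / (d_spat.max() + 1e-12))
        # The k nearest labeled neighbours vote with inverse-distance
        # weights, yielding a class probability vector (the soft label).
        nn = np.argsort(d)[:k]
        w = 1.0 / (d[nn] + 1e-12)
        np.add.at(soft[i], labeled_y[nn], w)
        soft[i] /= soft[i].sum()
    return soft
```

Soft labels produced this way could then supervise a student network in the usual knowledge distillation fashion (e.g. cross-entropy against the probability vectors); the adaptive choice of `k` described in the abstract is not modeled here.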
Keywords
soft labeling, deep learning, knowledge distillation, self-supervised learning, hyperspectral image classification