Joint Dual-Structural Constrained and Non-negative Analysis Representation Learning for Pattern Classification

APPLIED ARTIFICIAL INTELLIGENCE (2023)

Abstract
In recent years, the analysis dictionary learning (ADL) model has attracted much attention from researchers owing to its scalability and efficiency in representation-based classification. However, even with supervised label information embedded, the classification performance of analysis representations suffers from redundant and noisy samples in real-world datasets. In this paper, we propose a joint Dual-Structural constrained and Non-negative Analysis Representation (DSNAR) learning model. First, a supervised latent structural transformation term is considered implicitly to generate a roughly block-diagonal representation for intra-class samples. This discriminative structure, however, is fragile and weak in the presence of noisy and redundant samples. To highlight both intra-class similarity and inter-class separation in the class-oriented representation, we then explicitly incorporate an off-block suppressing term into the ADL model, together with a non-negative representation constraint, achieving a well-structured and meaningful interpretation of the contributions of all class-oriented atoms. Moreover, a robust classification scheme in the latent space is proposed to avoid accidental incorrect predictions caused by noisy information. Finally, the DSNAR model is solved efficiently by alternating among the K-SVD method, an iteratively re-weighted method, and a gradient method. Extensive classification results on five benchmark datasets validate the performance superiority of our DSNAR model compared with other state-of-the-art DL models.
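To make the ingredients of the abstract concrete, the following is a minimal toy sketch of an analysis-representation update with an off-block suppressing mask and a non-negativity constraint, solved by simple alternating minimization. The objective, the function name `dsnar_sketch`, and all parameter choices are illustrative assumptions for exposition only, not the paper's actual DSNAR formulation or solver (the paper uses K-SVD, an iteratively re-weighted method, and a gradient method).

```python
import numpy as np

def dsnar_sketch(X, labels, n_atoms_per_class, alpha=1.0, beta=1e-3, n_iter=30, seed=0):
    """Toy alternating minimization in the spirit of the DSNAR model:

        min_{Omega, A >= 0}  ||A - Omega X||_F^2
                             + alpha * ||M (elementwise*) A||_F^2   # off-block suppression
                             + beta  * ||Omega||_F^2                # ridge regularization

    where M masks "off-block" entries (atoms assigned to a different class
    than the sample). This objective is an assumed simplification.
    """
    rng = np.random.default_rng(seed)
    d, n = X.shape
    classes = np.unique(labels)
    k = n_atoms_per_class * len(classes)

    # Off-block mask: M[i, j] = 1 when atom i belongs to a different class than sample j,
    # so alpha penalizes (suppresses) those cross-class activations.
    atom_class = np.repeat(classes, n_atoms_per_class)
    M = (atom_class[:, None] != np.asarray(labels)[None, :]).astype(float)

    Omega = rng.standard_normal((k, d)) * 0.1  # analysis dictionary
    for _ in range(n_iter):
        # A-step: the objective is separable per entry, giving the closed form
        # a = p / (1 + alpha * m) with p = (Omega X)_{ij}; then project onto A >= 0.
        P = Omega @ X
        A = np.maximum(0.0, P / (1.0 + alpha * M))
        # Omega-step: ridge-regularized least-squares fit of Omega X to A.
        Omega = A @ X.T @ np.linalg.inv(X @ X.T + beta * np.eye(d))
    return Omega, A
```

On synthetic data, the returned representation `A` is non-negative by construction, and its off-block entries are shrunk by the factor `1 + alpha`, which is the mechanism the abstract describes for combining intra-class similarity with inter-class separation.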