Low-Rank Graph Regularized Sparse Coding

PRICAI 2018: TRENDS IN ARTIFICIAL INTELLIGENCE, PT I (2018)

Cited 7 | Views 26
Abstract
In this paper, we propose a solution to the instability problem of sparse coding using low-rank representation (LRR), a promising method for discovering the subspace structures of data. Graph regularized sparse coding has been extensively studied for preserving the locality of high-dimensional observations. In practice, however, data are often corrupted by noise, so samples from the same class may not lie in neighboring regions. To this end, we present a novel method for robust sparse representation, dubbed low-rank graph regularized sparse coding (LogSC). LogSC uses LRR to capture the multiple-subspace structure of the data and aims to preserve this structure in the resulting sparse codes. Unlike traditional methods, our approach learns the sparse codes and the LRR jointly rather than separately, and it preserves the global structure of the data instead of only the local structure. The resulting sparse codes are therefore not only robust to corrupted samples, thanks to the LRR, but also discriminative, owing to the preservation of the multiple subspaces. The optimization problem of LogSC can be effectively solved by the linearized alternating direction method with adaptive penalty (LADMAP). To evaluate our approach, we apply LogSC to image clustering and classification, and also test it in noisy settings. Encouraging experimental results on public image data sets demonstrate the discriminative power, robustness, and usability of the proposed LogSC.
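As a rough illustration of how such a joint formulation might look, a minimal LaTeX sketch is given below. It simply combines the standard LRR objective with a graph regularized sparse coding objective; the symbols (dictionary $D$, sparse codes $S$, LRR coefficients $Z$, noise term $E$, and weights $\lambda$, $\alpha$, $\beta$, $\gamma$) and the exact coupling are assumptions for illustration, not the formulation published in the paper.

\begin{aligned}
\min_{D,\,S,\,Z,\,E}\;& \|X - D S\|_F^2 \;+\; \lambda \|S\|_1
  \;+\; \alpha\,\mathrm{tr}\!\left(S L_Z S^{\top}\right)
  \;+\; \beta \|Z\|_* \;+\; \gamma \|E\|_{2,1} \\
\text{s.t.}\;& X = X Z + E,
\end{aligned}

where $L_Z$ denotes a graph Laplacian built from the affinity $(|Z| + |Z^{\top}|)/2$. Under this kind of coupling, the low-rank coefficients $Z$ and the sparse codes $S$ are learned jointly, so the subspace structure captured by $Z$ is transferred into $S$, which is the behavior the abstract describes.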
Keywords
Laplacian sparse coding, Multiple subspaces preserving, Low-rank representation, Image clustering and classification