
Low-rank and sparse embedding for dimensionality reduction

Neural Networks: The Official Journal of the International Neural Network Society (2018)

Cited by 24
Abstract
In this paper, we propose a robust subspace learning (SL) framework for dimensionality reduction that extends existing SL methods to a low-rank and sparse embedding (LRSE) framework in three respects: overall optimality, robustness, and generalization. Owing to the use of low-rank and sparse constraints, both the global subspaces and the local geometric structures of the data are captured by the reconstruction coefficient matrix, and the low-dimensional embedding of the data is simultaneously enforced to respect the low-rankness and sparsity. In this way, the reconstruction coefficient matrix learning and SL are performed jointly, which guarantees an overall optimum. Moreover, we adopt a sparse matrix to model the noise, which makes LRSE robust to different types of noise. The combination of global subspaces and local geometric structures gives LRSE better generalization than related methods: LRSE outperforms conventional SL methods in both unsupervised and supervised scenarios, and in the unsupervised scenario the improvement in classification accuracy is considerable. Seven specific SL methods, both unsupervised and supervised, can be derived from the proposed framework, and experiments on different data sets (including corrupted data) demonstrate the superiority of these methods over existing, well-established SL methods. Further, we conduct experiments that provide new insights into SL.
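The abstract does not state the optimization problem explicitly; a representative objective of the kind it describes, written here only as a hedged sketch and not as the paper's exact formulation, combines a nuclear norm and an $\ell_1$ norm on the reconstruction coefficients with a sparse error term. Here $X$ denotes the data matrix, $Z$ the reconstruction coefficient matrix, $E$ the sparse noise term, and $P$ the projection learned jointly with $Z$ (all symbols are assumptions introduced for illustration):

$$
\min_{P,\, Z,\, E} \; \|Z\|_{*} + \lambda \|Z\|_{1} + \gamma \|E\|_{1}
\quad \text{s.t.} \quad P^{\top} X = P^{\top} X Z + E, \;\; P^{\top} P = I.
$$

In such a formulation, the nuclear norm $\|Z\|_{*}$ captures the global low-rank subspace structure, the $\ell_1$ norm $\|Z\|_{1}$ encourages a sparse, locality-preserving reconstruction, and the sparse error $E$ absorbs noise, so the projection $P$ and the coefficients $Z$ are optimized jointly rather than in two separate stages, which is what allows an overall optimum rather than stage-wise suboptimal solutions.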
Keywords
Dimensionality reduction, Subspace learning, Robustness, Overall optimum