Subspace learning via Locally Constrained A-optimal nonnegative projection.

Neurocomputing (2013)

Citations: 17 | Views: 138
Abstract
For decades, subspace learning has received considerable interest in the pattern recognition and computer vision communities, and many promising methods have emerged to capture a better subspace from different perspectives. As a popular learning paradigm, matrix factorization is actively used to learn a new subspace from a high-dimensional data space. Very recently, some works have considered the decomposed matrix from a statistical point of view, modeling the data points via ridge regression and minimizing the variance of the parameter estimates. However, these methods neglect the structured information embedded in the local neighborhood of each data point and fail to exploit prior knowledge. To address these problems, we present a novel subspace learning approach named Locally Constrained A-optimal nonnegative projection, termed LCA for short. The method preserves the local geometric structure of the learned subspace via neighborhood patches while projecting high-dimensional nonnegative data points onto a low-dimensional subspace. In addition, we incorporate supervised information, such as label constraints, to guide subspace learning, so that the discriminative power of the new subspace is further strengthened. The column vectors derived from the nonnegative projection therefore span a new subspace characterized by local consistency and better discriminative ability. Favorable experimental results verify the effectiveness of the proposed approach against several competitive methods.
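To make the described criterion concrete, the following is a minimal, hypothetical Python sketch, not the authors' implementation. It assumes an A-optimality term of the form trace((WᵀXXᵀW + λI)⁻¹), which corresponds to shrinking the variance of ridge-regression parameter estimates in the projected space, plus a locality-preserving penalty trace(WᵀXLXᵀW) with L a k-nearest-neighbor graph Laplacian, and enforces nonnegativity of the projection W by projected gradient descent. The label-constraint component is omitted for brevity, and the names knn_laplacian and lca_sketch as well as all parameter values are illustrative.

```python
# Hypothetical sketch of an LCA-style objective (not the paper's algorithm):
#   minimize  trace((W^T X X^T W + lam*I)^{-1}) + beta * trace(W^T X L X^T W)
#   subject to W >= 0,
# solved by projected gradient descent.
import numpy as np

def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian of a symmetrized k-NN graph.
    X: (d, n) data matrix, one sample per column."""
    n = X.shape[1]
    # Pairwise squared Euclidean distances between columns, shape (n, n).
    d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
    A = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]  # skip the point itself
        A[i, nbrs] = 1.0
    A = np.maximum(A, A.T)                 # symmetrize the adjacency
    return np.diag(A.sum(axis=1)) - A

def lca_sketch(X, k_dim=10, lam=1e-2, beta=1.0, lr=1e-3, iters=500, knn=5):
    """Projected gradient descent on the assumed LCA-style objective.
    Gradient of trace(M^{-1}) with M = W^T S W + lam*I is -2 S W M^{-2}."""
    d, n = X.shape
    S = X @ X.T                            # (d, d) scatter matrix
    L = knn_laplacian(X, k=knn)
    G = X @ L @ X.T                        # (d, d) locality penalty matrix
    rng = np.random.default_rng(0)
    W = np.abs(rng.standard_normal((d, k_dim)))  # nonnegative init
    for _ in range(iters):
        M = W.T @ S @ W + lam * np.eye(k_dim)
        Minv = np.linalg.inv(M)
        grad = -2.0 * S @ W @ (Minv @ Minv) + 2.0 * beta * G @ W
        W = np.maximum(W - lr * grad, 0.0)  # project back onto W >= 0
    return W

# Usage: project 50 nonnegative 20-dim points onto a 5-dim subspace.
X = np.abs(np.random.default_rng(1).standard_normal((20, 50)))
W = lca_sketch(X, k_dim=5)
Z = W.T @ X                                # low-dimensional representation
```

Under these assumptions, the A-optimality term keeps the projected scatter well conditioned while the Laplacian term penalizes embeddings that separate neighboring points, which mirrors the locality preservation the abstract describes.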
Keywords
Subspace clustering, Nonnegative projection, Semi-supervised learning, Structured information, Label constraints