Learning an Invariant Hilbert Space for Domain Adaptation

2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017

Cited by 129
Abstract
This paper introduces a learning scheme to construct a Hilbert space (i.e., a vector space along with its inner product) to address both unsupervised and semi-supervised domain adaptation problems. This is achieved by learning projections from each domain to a latent space, along with the Mahalanobis metric of the latent space, so as to simultaneously minimize a notion of domain variance while maximizing a measure of discriminatory power. In particular, we make use of Riemannian optimization techniques to match statistical properties (e.g., first- and second-order statistics) between samples projected into the latent space from different domains. When class labels are available, we further encourage samples sharing the same label to form more compact clusters while pulling away samples coming from different classes. We extensively evaluate and contrast our proposal against state-of-the-art methods for the task of visual domain adaptation using both handcrafted and deep-net features. Our experiments show that even with a simple nearest-neighbor classifier, the proposed method can outperform several state-of-the-art methods that benefit from more involved classification schemes.
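
The abstract describes matching first- and second-order statistics of samples projected from each domain into a shared latent space. Below is a minimal NumPy sketch (not the authors' code) of such a domain-variance term; the names domain_variance_loss, W_s, and W_t are hypothetical, and the projections are held fixed here, whereas the paper optimizes them, together with the latent Mahalanobis metric, using Riemannian optimization techniques.

    # Minimal sketch of a domain-variance term: project source/target samples
    # into a shared latent space and compare their first- and second-order
    # statistics. W_s, W_t are hypothetical fixed projections; the paper learns
    # them jointly with a latent metric via Riemannian optimization.
    import numpy as np

    def domain_variance_loss(Xs, Xt, W_s, W_t):
        """Discrepancy between projected source and target statistics.

        Xs: (n_s, d_s) source features, Xt: (n_t, d_t) target features.
        W_s: (d_s, p), W_t: (d_t, p) projections to a p-dimensional latent space.
        """
        Zs, Zt = Xs @ W_s, Xt @ W_t                               # latent representations
        mean_gap = np.linalg.norm(Zs.mean(0) - Zt.mean(0)) ** 2   # first-order statistics
        cov_gap = np.linalg.norm(np.cov(Zs, rowvar=False)
                                 - np.cov(Zt, rowvar=False), 'fro') ** 2  # second-order statistics
        return mean_gap + cov_gap

    # Usage example with random data and random orthonormal projections.
    rng = np.random.default_rng(0)
    Xs, Xt = rng.normal(size=(100, 50)), rng.normal(size=(80, 60))
    W_s, _ = np.linalg.qr(rng.normal(size=(50, 10)))
    W_t, _ = np.linalg.qr(rng.normal(size=(60, 10)))
    print(domain_variance_loss(Xs, Xt, W_s, W_t))

A discriminative term of the kind mentioned in the abstract (pulling same-class samples together and pushing different-class samples apart in the latent space) would be added on top of this loss when labels are available.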
Keywords
invariant Hilbert space,learning scheme,vector space,semisupervised domain adaptation problems,latent space,domain variance,visual domain adaptation,unsupervised domain adaptation problems,Riemannian optimization techniques