Gaussian universal features, canonical correlations, and common information

2018 IEEE Information Theory Workshop (ITW), 2018

Abstract
We address the problem of optimal feature selection for a Gaussian vector pair in the weak dependence regime, when the inference task is not known in advance. In particular, we show that multiple formulations all yield the same solution, and correspond to the singular value decomposition (SVD) of the canonical correlation matrix. Our results reveal key connections between canonical correlation analysis (CCA), principal component analysis (PCA), the Gaussian information bottleneck, Wyner's common information, and the Ky Fan (nuclear) norms.
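The SVD mentioned in the abstract is taken of the canonical correlation matrix, i.e., the whitened cross-covariance Σx^{-1/2} Σxy Σy^{-1/2}, whose singular values are the canonical correlations. The following is a minimal illustrative sketch of this computation from samples (not the paper's code; the sample-covariance estimator, the `canonical_correlations` helper, and the synthetic data are assumptions for demonstration):

```python
# Sketch: canonical correlations via the SVD of the canonical
# correlation matrix  Sigma_x^{-1/2} Sigma_xy Sigma_y^{-1/2}.
# Illustrative only -- assumes full-rank sample covariances.
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations of samples X (n x dx) and Y (n x dy)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = Xc.T @ Xc / n          # sample covariance of X
    Syy = Yc.T @ Yc / n          # sample covariance of Y
    Sxy = Xc.T @ Yc / n          # sample cross-covariance

    def inv_sqrt(S):
        # Inverse matrix square root via eigendecomposition.
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    # Canonical correlation matrix; its singular values are the
    # canonical correlations (all in [0, 1]).
    B = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    s = np.linalg.svd(B, compute_uv=False)
    return np.clip(s, 0.0, 1.0)

# Synthetic Gaussian pair sharing a 2-dimensional latent source.
rng = np.random.default_rng(0)
Z = rng.standard_normal((5000, 2))
X = Z @ rng.standard_normal((2, 4)) + 0.5 * rng.standard_normal((5000, 4))
Y = Z @ rng.standard_normal((2, 3)) + 0.5 * rng.standard_normal((5000, 3))
rho = canonical_correlations(X, Y)
print(rho)  # two large correlations from the shared source, one near zero
```

With a rank-2 shared latent source, the top two canonical correlations are large and the remaining one is close to zero, matching the weak-dependence picture in which only a few directions carry the common structure.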
Keywords
Gaussian universal features, canonical correlations, optimal feature selection, Gaussian vector pair, singular value decomposition, canonical correlation matrix, principal component analysis, Gaussian information bottleneck, Wyner common information