On model misspecification and KL separation for Gaussian graphical models

2015 IEEE International Symposium on Information Theory (ISIT)

Abstract
We establish bounds on the KL divergence between two multivariate Gaussian distributions in terms of the Hamming distance between the edge sets of the corresponding graphical models. We show that the KL divergence is bounded below by a constant whenever the graphs differ by at least one edge; this is essentially the tightest possible bound, since there exist classes of graphs for which the edge discrepancy grows while the KL divergence remains bounded above by a constant. As a natural corollary to our KL lower bound, we also establish a sample size requirement for correct model selection via maximum likelihood estimation. Our results make rigorous the notion that accurately estimating the edge structure of a Gaussian graphical model is essential for approximating the true distribution to high precision.
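To make the quantities in the abstract concrete, here is a minimal sketch in Python, assuming zero-mean Gaussians parameterized by precision matrices whose off-diagonal support defines the graph's edge set. The helper names `gaussian_kl` and `edge_hamming_distance` and the example matrices are illustrative and not from the paper; the KL formula is the standard closed form for zero-mean Gaussians.

```python
import numpy as np

def gaussian_kl(theta_p, theta_q):
    """KL(P || Q) between zero-mean Gaussians given precision matrices.

    Uses the closed form
        KL = 0.5 * [tr(Theta_q @ Sigma_p) - d + log det Theta_p - log det Theta_q],
    where Sigma_p = inv(Theta_p) and d is the dimension.
    """
    d = theta_p.shape[0]
    sigma_p = np.linalg.inv(theta_p)
    sign_p, logdet_p = np.linalg.slogdet(theta_p)
    sign_q, logdet_q = np.linalg.slogdet(theta_q)
    assert sign_p > 0 and sign_q > 0, "precision matrices must be positive definite"
    return 0.5 * (np.trace(theta_q @ sigma_p) - d + logdet_p - logdet_q)

def edge_hamming_distance(theta_p, theta_q, tol=1e-10):
    """Hamming distance between edge sets: count mismatches in the
    off-diagonal support of the two precision matrices."""
    support_p = np.abs(np.triu(theta_p, k=1)) > tol
    support_q = np.abs(np.triu(theta_q, k=1)) > tol
    return int(np.sum(support_p != support_q))

# Hypothetical 3-node models differing in a single edge (nodes 1 and 2).
theta_p = np.array([[1.0, 0.3, 0.0],
                    [0.3, 1.0, 0.0],
                    [0.0, 0.0, 1.0]])
theta_q = np.array([[1.0, 0.3, 0.0],
                    [0.3, 1.0, 0.3],
                    [0.0, 0.3, 1.0]])

print("Hamming distance:", edge_hamming_distance(theta_p, theta_q))  # 1
print("KL(P || Q):", gaussian_kl(theta_p, theta_q))                  # strictly positive
```

In this toy setup, a single-edge discrepancy already yields a strictly positive KL divergence, which is the phenomenon the paper's lower bound quantifies.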
Keywords
model misspecification, KL separation, Gaussian graphical models, KL divergence, multivariate Gaussian distributions, Hamming distance, edge discrepancy, maximum likelihood estimation, edge structure