The neighborhood lattice for encoding partial correlations in a Hilbert space.

arXiv: Statistics Theory (2019)

Abstract
Neighborhood regression has been a successful approach in graphical and structural equation modeling, with applications to learning undirected and directed graphical models. We extend these ideas by defining and studying an algebraic structure called the neighborhood lattice, based on a generalized notion of neighborhood regression. We show that this algebraic structure has the potential to provide an economical encoding of all conditional independence statements in a Gaussian distribution (or conditional uncorrelatedness in general), even in cases where no graphical model exists that could perfectly encode all such statements. We study the computational complexity of computing these structures and show that, under a sparsity assumption, they can be computed in polynomial time, even in the absence of the assumption of perfectness with respect to a graph. On the other hand, assuming perfectness, we show how these neighborhood lattices may be computed graphically using the separation properties of the so-called partial correlation graph. We also draw connections with directed acyclic graphical models and Bayesian networks. We derive these results using an abstract generalization of partial uncorrelatedness, called partial orthogonality, which allows us to use algebraic properties of projection operators on Hilbert spaces to significantly simplify and extend existing ideas and arguments. Consequently, our results apply to a wide range of random objects and data structures, such as random vectors, data matrices, and functions.
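To make the neighborhood-regression criterion underlying the abstract concrete, the following is a minimal, hypothetical NumPy sketch (not the paper's lattice construction): for a zero-mean Gaussian vector with covariance Sigma, X_i and X_j are conditionally uncorrelated given X_S exactly when the coefficient of X_j vanishes in the linear projection of X_i onto the span of {X_k : k in S ∪ {j}}. The function names and the example covariance matrix are illustrative assumptions, not from the paper.

```python
import numpy as np

def regression_coef_on_j(Sigma, i, j, S):
    """Coefficient of X_j when projecting X_i onto span{X_k : k in S ∪ {j}}."""
    preds = list(S) + [j]
    A = Sigma[np.ix_(preds, preds)]   # covariance (Gram) matrix of the predictors
    b = Sigma[preds, i]               # covariances between the predictors and X_i
    beta = np.linalg.solve(A, b)      # normal equations of the projection
    return beta[-1]                   # entry corresponding to X_j

def partially_uncorrelated(Sigma, i, j, S, tol=1e-12):
    """Check the neighborhood-regression criterion for X_i ⟂ X_j | X_S (up to tol)."""
    return abs(regression_coef_on_j(Sigma, i, j, S)) < tol

# Population covariance of a Markov chain X0 -> X1 -> X2 with
# X1 = 0.8*X0 + e1 and X2 = 0.5*X1 + e2, so that X0 ⟂ X2 | X1.
Sigma = np.array([[1.00, 0.80, 0.40],
                  [0.80, 1.64, 0.82],
                  [0.40, 0.82, 1.41]])

print(partially_uncorrelated(Sigma, 0, 2, S=[1]))  # True: conditioning on X1 removes the dependence
print(partially_uncorrelated(Sigma, 0, 2, S=[]))   # False: X0 and X2 are marginally correlated
```

The sketch works with a population covariance so the zero coefficient is exact up to floating-point error; with an estimated covariance one would instead test whether the coefficient is statistically indistinguishable from zero.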