AMP Chain Graphs: Minimal Separators and Structure Learning Algorithms

Journal of Artificial Intelligence Research (2020)

Abstract
This paper deals with chain graphs (CGs) under the Andersson-Madigan-Perlman (AMP) interpretation. We address the problem of finding a minimal separator in an AMP CG, namely, finding a set Z of nodes that separates a given non-adjacent pair of nodes such that no proper subset of Z separates that pair. We analyze several versions of this problem and offer polynomial-time algorithms for each. These include finding a minimal separator from a restricted set of nodes, finding a minimal separator for two given disjoint sets, and testing whether a given separator is minimal. To address the problem of learning the structure of AMP CGs from data, we show that the PC-LIKE algorithm is order-dependent, in the sense that its output can depend on the order in which the variables are given. We propose several modifications of the PC-LIKE algorithm that remove part or all of this order-dependence. We also extend the decomposition-based approach for learning Bayesian networks (BNs) to learn AMP CGs, which include BNs as a special case, under the faithfulness assumption. We prove the correctness of our extension using the minimal separator results. Experiments on standard benchmarks and on synthetically generated models and data demonstrate the competitive performance of our decomposition-based method, called LCD-AMP, in comparison with the (modified versions of the) PC-LIKE algorithm. The LCD-AMP algorithm usually outperforms the PC-LIKE algorithm, and our modifications of the PC-LIKE algorithm learn structures that are more similar to the underlying ground-truth graphs than those of the original PC-LIKE algorithm, especially in high-dimensional settings. In particular, we empirically show that the results of both algorithms are more accurate and more stable when the sample size is reasonably large and the underlying graph is sparse.
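The paper's algorithms operate on AMP CGs via the AMP separation criterion, which is not reproduced here. As a simplified illustration of the minimality notion the abstract describes, the sketch below tests minimality of a separator in a plain undirected graph, using the classical characterization: Z is a minimal u-v separator iff Z separates u and v and every node of Z has a neighbour in both the component of u and the component of v of G - Z. All function names and the adjacency-dict representation are choices made for this sketch, not the paper's API.

```python
from collections import deque

def component(adj, start, blocked):
    """Return the set of nodes reachable from `start` with `blocked` removed."""
    seen = {start}
    queue = deque([start])
    while queue:
        x = queue.popleft()
        for y in adj[x]:
            if y not in blocked and y not in seen:
                seen.add(y)
                queue.append(y)
    return seen

def is_minimal_separator(adj, u, v, Z):
    """Test whether Z is a minimal u-v separator in an undirected graph.

    `adj` maps each node to the set of its neighbours.
    Z separates u and v iff v is unreachable from u in G - Z; Z is
    minimal iff, in addition, every z in Z touches both the component
    of u and the component of v of G - Z (so no z can be dropped).
    """
    Z = set(Z)
    comp_u = component(adj, u, Z)
    if v in comp_u:
        return False  # Z is not even a separator
    comp_v = component(adj, v, Z)
    return all(adj[z] & comp_u and adj[z] & comp_v for z in Z)
```

For example, in the graph u-a-v, u-b-v with a pendant node c attached to u, the set {a, b} is a minimal separator of u and v, while {a, b, c} separates them but is not minimal (c touches only u's side).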