Hierarchical Estimation for Effective and Efficient Sampling Graph Neural Network

Yang Li, Bin Xu, Qing Cao, Yige Yuan, Huawei Shen

arXiv (Cornell University), 2022

Abstract
Improving the scalability of GNNs is critical for large graphs. Existing methods follow three sampling paradigms, node-wise, layer-wise, and subgraph sampling, and design unbiased estimators for scalability. However, high variance still severely hinders GNN performance. Because previous studies either lack variance analysis or focus on a single sampling paradigm, we first propose a unified node-sampling variance analysis framework and identify the core challenge in deriving a minimum-variance sampler, namely "circular dependency": the sampling probability depends on node embeddings, while node embeddings cannot be computed until sampling is finished. Existing studies either ignore the node embeddings or introduce external parameters, so no variance reduction method is both efficient and effective. We therefore propose the Hierarchical Estimation based Sampling GNN (HE-SGNN): the first level estimates the node embeddings used in the sampling probability to break the circular dependency, and the second level employs a sampling GNN operator to estimate node representations on the entire graph. Owing to their technical differences, we design a different first-level estimator for each paradigm, i.e., a time-series simulation for layer-wise sampling and a feature-based simulation for subgraph sampling. Experimental results on seven representative datasets demonstrate the effectiveness and efficiency of our method.
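To make the hierarchical idea concrete, the sketch below (not the authors' released code) illustrates one way the circular dependency can be broken for layer-wise sampling: a first-level estimate of node embedding norms, here a running average over previous training steps standing in for the paper's time-series simulation, defines importance-sampling probabilities, and a Horvitz-Thompson reweighting keeps the aggregated estimate unbiased. The names `emb_norm_ema` and `layerwise_sample_and_aggregate` and all numeric settings are hypothetical.

```python
# Minimal sketch of variance-aware layer-wise sampling, assuming a running
# estimate of embedding norms is a usable proxy for the true embeddings.
import numpy as np

rng = np.random.default_rng(0)

num_nodes, feat_dim, sample_size = 1000, 16, 64
features = rng.normal(size=(num_nodes, feat_dim))

# First-level estimator: per-node embedding-norm estimate carried across steps.
emb_norm_ema = np.ones(num_nodes)

def layerwise_sample_and_aggregate(neighbor_ids, momentum=0.9):
    """Importance-sample neighbors and return an unbiased Monte Carlo
    estimate of the mean-aggregated neighborhood feature."""
    # Sampling probability proportional to the estimated embedding norm.
    scores = emb_norm_ema[neighbor_ids]
    probs = scores / scores.sum()

    idx = rng.choice(len(neighbor_ids), size=sample_size, replace=True, p=probs)
    sampled = neighbor_ids[idx]

    # Horvitz-Thompson reweighting: E[sum_i x_i / (n * p_i)] = sum_j x_j,
    # so dividing by the true neighborhood size recovers the mean.
    weights = 1.0 / (sample_size * probs[idx])
    agg = (weights[:, None] * features[sampled]).sum(axis=0) / len(neighbor_ids)

    # Second level would apply the GNN operator to `agg`; here we only
    # refresh the first-level estimate with the norms observed this step.
    observed = np.linalg.norm(features[sampled], axis=1)
    emb_norm_ema[sampled] = momentum * emb_norm_ema[sampled] + (1 - momentum) * observed
    return agg

neighbors = rng.choice(num_nodes, size=300, replace=False)
print(layerwise_sample_and_aggregate(neighbors)[:4])
```

Concentrating probability mass on nodes with larger estimated embeddings is what drives the variance reduction; the reweighting term only preserves unbiasedness regardless of how good the first-level estimate is.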
Keywords
efficient sampling graph