A Comparison between Recursive Neural Networks and Graph Neural Networks

IJCNN(2006)

Abstract
Recursive Neural Networks (RNNs) and Graph Neural Networks (GNNs) are two connectionist models that can directly process graphs. RNNs and GNNs exploit a similar processing framework, but they can be applied to different input domains. RNNs require the input graphs to be directed and acyclic, whereas GNNs can process any kind of graph. The aim of this paper is to understand whether this difference affects the behaviour of the models in a real application. An experimental comparison on an image classification problem is presented, showing that GNNs outperform RNNs. Moreover, the main differences between the models are also discussed w.r.t. their input domains, their approximation capabilities and their learning algorithms.

in neural network models it is automatically learned by examples. Finally, SOMs-SD differ from the other methods since they implement an unsupervised learning framework, instead of a supervised one. Graph Neural Networks have been recently proposed to process very general types of graphs and can be considered an extension of RNNs. Actually, RNNs require input graphs to be directed and acyclic, while cyclic or non-directed structures must undergo a preprocessing phase. However, GNNs have not been widely tested yet, and it is unknown whether the performance of GNNs and RNNs differs in practical applications. This paper presents an experimental comparison between GNNs and RNNs. The two models are evaluated on a real-world computer vision problem, which consists in classifying a set of images. Moreover, the theoretical differences between the two models are also investigated, paying attention to their admitted input domains, their approximation capabilities and their learning algorithms.
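The abstract notes that RNNs only accept directed acyclic graphs, so cyclic inputs must first be detected and preprocessed. A minimal sketch of that acyclicity check, using Kahn's topological-sort algorithm (function name and graph encoding are illustrative, not from the paper):

```python
from collections import deque

def is_dag(num_nodes, edges):
    """Return True iff the directed graph is acyclic.
    Kahn's algorithm: a graph is a DAG exactly when every node
    can be dequeued in a topological order."""
    adj = [[] for _ in range(num_nodes)]
    indeg = [0] * num_nodes
    for u, v in edges:
        adj[u].append(v)
        indeg[v] += 1
    # Start from all nodes with no incoming edges.
    queue = deque(i for i in range(num_nodes) if indeg[i] == 0)
    visited = 0
    while queue:
        u = queue.popleft()
        visited += 1
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return visited == num_nodes

# A chain 0 -> 1 -> 2 is directly admissible as RNN input...
print(is_dag(3, [(0, 1), (1, 2)]))          # True
# ...while a cycle 0 -> 1 -> 2 -> 0 would need preprocessing first.
print(is_dag(3, [(0, 1), (1, 2), (2, 0)]))  # False
```

A GNN, by contrast, imposes no such constraint, so both example graphs above could be processed as-is.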
Keywords
graph theory,neural nets,acyclic graph,connectionist models,directed graph,graph neural networks,recursive neural networks