Uncertainty Quantification of Graph Convolution Neural Network Models of Evolving Processes
CoRR (2024)
Abstract
The application of neural network models to scientific machine learning tasks
has proliferated in recent years. In particular, neural network models have
proved to be adept at modeling processes with spatial-temporal complexity.
Nevertheless, these highly parameterized models have garnered skepticism in
their ability to produce outputs with quantified error bounds over the regimes
of interest. Hence there is a need to find uncertainty quantification methods
that are suitable for neural networks. In this work we present comparisons of
the parametric uncertainty quantification of neural networks modeling complex
spatial-temporal processes with Hamiltonian Monte Carlo and Stein variational
gradient descent and its projected variant. Specifically we apply these methods
to graph convolutional neural network models of evolving systems modeled with
recurrent neural network and neural ordinary differential equations
architectures. We show that Stein variational inference is a viable alternative
to Monte Carlo methods with some clear advantages for complex neural network
models. For our exemplars, Stein variational interference gave similar
uncertainty profiles through time compared to Hamiltonian Monte Carlo, albeit
with generally more generous variance.Projected Stein variational gradient
descent also produced similar uncertainty profiles to the non-projected
counterpart, but large reductions in the active weight space were confounded by
the stability of the neural network predictions and the convoluted likelihood
landscape.
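The core of the Stein variational gradient descent method compared above can be illustrated with a minimal sketch. The following is a toy example, not the paper's graph-convolutional setup: it assumes a one-dimensional standard-normal target and an RBF kernel with fixed bandwidth, and transports a set of particles with the standard SVGD update.

```python
import numpy as np

def svgd_step(particles, grad_log_p, bandwidth=1.0, step_size=0.1):
    """One SVGD update: attractive term (kernel-weighted score)
    plus repulsive term (kernel gradient) keeps particles spread out."""
    n = len(particles)
    diffs = particles[:, None] - particles[None, :]      # x_i - x_j
    k = np.exp(-diffs**2 / (2 * bandwidth**2))           # RBF kernel k(x_j, x_i)
    # grad of k(x_j, x_i) w.r.t. x_j is (x_i - x_j)/h^2 * k
    repulsive = (diffs / bandwidth**2 * k).sum(axis=1)
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (k @ grad_log_p(particles) + repulsive) / n
    return particles + step_size * phi

rng = np.random.default_rng(0)
x = rng.uniform(-5.0, 5.0, size=100)   # initial particles, far from the target
grad_log_p = lambda x: -x              # score of N(0, 1)
for _ in range(500):
    x = svgd_step(x, grad_log_p)
print(x.mean(), x.var())               # should approach 0 and 1
```

Unlike an MCMC chain, all particles move deterministically and in parallel, which is one reason the paper considers SVGD attractive for large neural network models.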