PINNsFormer: A Transformer-Based Framework For Physics-Informed Neural Networks
arXiv (2023)
Abstract
Physics-Informed Neural Networks (PINNs) have emerged as a promising deep
learning framework for approximating numerical solutions to partial
differential equations (PDEs). However, conventional PINNs, relying on
multilayer perceptrons (MLPs), neglect the crucial temporal dependencies
inherent in practical physics systems and thus fail to propagate the initial
condition constraints globally or to accurately capture the true solutions under
various scenarios. In this paper, we introduce a novel Transformer-based
framework, termed PINNsFormer, designed to address this limitation. PINNsFormer
can accurately approximate PDE solutions by utilizing multi-head attention
mechanisms to capture temporal dependencies. PINNsFormer transforms point-wise
inputs into pseudo sequences and replaces point-wise PINNs loss with a
sequential loss. Additionally, it incorporates a novel activation function,
Wavelet, which anticipates Fourier decomposition through deep neural networks.
Empirical results demonstrate that PINNsFormer achieves superior generalization
ability and accuracy across various scenarios, including PINNs failure modes
and high-dimensional PDEs. Moreover, PINNsFormer offers flexibility in
integrating existing learning schemes for PINNs, further enhancing its
performance.
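
As a rough illustration of the two mechanisms named in the abstract, the sketch below (PyTorch, written from the abstract alone) shows how a point-wise input (x, t) could be expanded into a pseudo time sequence for a Transformer encoder, and a Wavelet-style activation built from learnable sine/cosine terms. The function and parameter names, the sequence length k, and the step dt are hypothetical choices for illustration; the actual PINNsFormer implementation may differ.

```python
import torch
import torch.nn as nn


class Wavelet(nn.Module):
    """Wavelet-style activation: a learnable combination of sine and cosine,
    intended to mimic a first-order Fourier decomposition (sketch only; the
    exact parameterization in the paper may differ)."""

    def __init__(self):
        super().__init__()
        self.w1 = nn.Parameter(torch.ones(1))
        self.w2 = nn.Parameter(torch.ones(1))

    def forward(self, x):
        return self.w1 * torch.sin(x) + self.w2 * torch.cos(x)


def make_pseudo_sequence(x, t, k=5, dt=1e-3):
    """Expand point-wise inputs (x, t) into pseudo sequences
    [(x, t), (x, t + dt), ..., (x, t + (k-1) dt)] so that a sequential
    (rather than point-wise) PINNs loss can be applied.
    k and dt are hypothetical hyperparameters.

    x, t: tensors of shape (N, 1); returns a tensor of shape (N, k, 2).
    """
    offsets = dt * torch.arange(k, dtype=t.dtype, device=t.device)  # (k,)
    t_seq = t.unsqueeze(1) + offsets.view(1, k, 1)                  # (N, k, 1)
    x_seq = x.unsqueeze(1).expand(-1, k, -1)                        # (N, k, 1)
    return torch.cat([x_seq, t_seq], dim=-1)                        # (N, k, 2)
```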