Large-step neural network for learning the symplectic evolution from partitioned data

arXiv (Cornell University), 2023

Abstract

In this study, we focus on learning Hamiltonian systems, which involves predicting the coordinate ($\boldsymbol q$) and momentum ($\boldsymbol p$) variables generated by a symplectic mapping. Following Chen & Tao (2021), the symplectic mapping is represented by a generating function. To extend the prediction time span, we develop a new learning scheme that splits the time series ($\boldsymbol q_i$, $\boldsymbol p_i$) into several partitions. We then train a large-step neural network (LSNN) to approximate the generating function between the first partition (i.e. the initial condition) and each of the remaining partitions. This partition approach enables our LSNN to effectively suppress the accumulated error when predicting the system's evolution. We then train the LSNN to learn the motions of the 2:3 resonant Kuiper belt objects over a long time span of 25 000 yr. The results show two significant improvements over the neural network constructed in our previous work: (1) the conservation of the Jacobi integral and (2) highly accurate predictions of the orbital evolution. Overall, we propose that the designed LSNN has the potential to considerably improve predictions of the long-term evolution of more general Hamiltonian systems.
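The partitioning idea described above can be sketched as follows. This is a minimal illustration on a toy harmonic oscillator, not the paper's actual setup: the paper trains on 2:3 resonant Kuiper belt orbits and parameterizes the symplectic map with a generating function (Chen & Tao 2021), whereas here we only show how a trajectory ($\boldsymbol q_i$, $\boldsymbol p_i$) is split into partitions and paired with the initial partition to form large-step training targets.

```python
import numpy as np

# Toy Hamiltonian system: harmonic oscillator H = (p^2 + q^2) / 2,
# whose exact symplectic flow is a rotation in (q, p) phase space.
def flow(q0, p0, t):
    return (q0 * np.cos(t) + p0 * np.sin(t),
            p0 * np.cos(t) - q0 * np.sin(t))

# Sample a trajectory (q_i, p_i) at uniform time steps.
n_steps, dt = 1200, 0.01
t = np.arange(n_steps) * dt
q, p = flow(1.0, 0.0, t)

# Split the series into K partitions and pair the first partition
# (the initial condition) with each remaining partition. A network
# trained on such pairs predicts each partition in one "large step"
# from the initial condition, rather than by chaining many small
# steps, which suppresses the accumulated error.
K = 6
parts = np.array_split(np.stack([q, p], axis=1), K)
initial = parts[0]
training_pairs = [(initial, parts[k]) for k in range(1, K)]

print(len(training_pairs))  # K - 1 = 5 large-step targets
```

Each pair `(initial, parts[k])` would then serve as one input–target set for the LSNN; the choice of `K` trades off the step size of the learned map against the number of networks (or conditioning labels) needed.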
Keywords
symplectic evolution, neural network, learning, large-step