Basic Information

Bio
My research vision is to develop machine learning (ML) methods for accelerating scientific simulation and discovery, while opening new frontiers in machine learning research (AI + Science). This lies in the interdisciplinary field of machine learning, scientific computing, and physical sciences.
Towards this goal, my past research has pioneered and made important advances in learning structured and compressed representations for accelerating large-scale and multi-scale simulations in the physical sciences, including fluids, plasmas, and more general PDE and N-body systems. My research has enabled ML-based surrogate models to scale to dynamical systems with two orders of magnitude higher dimensionality, running 15x faster than prior ML models. The ML models I developed are being deployed for fluid simulation in industry and will also be used for modeling laser-plasma systems at the SLAC National Accelerator Laboratory. Besides ML for simulation, I have introduced ML methods for discovering symbolic theories (published in a top physics journal) and relational structures from observations, and have theoretically revealed the origin of phase-transition phenomena in the compression vs. prediction tradeoff in representation learning.
Education
Experience
Research Interests
Papers (42)
- Peiyan Hu, Rui Wang, Xiang Zheng, Tao Zhang, Haodong Feng, Ruiqi Feng, Long Wei, Yue Wang, Zhi-Ming Ma, Tailin Wu. CoRR (2025).
- CoRR (2025).
- Peiyan Hu, Xiaowei Qian, Wenhao Deng, Rui Wang, Haodong Feng, Ruiqi Feng, Tao Zhang, Long Wei, Yue Wang, Zhi-Ming Ma, Tailin Wu. CoRR (2025).
- ICLR 2024 (2024).
- Haixin Wang, Yadi Cao, Zijie Huang, Yuxuan Liu, Peiyan Hu, Xiao Luo, Zezheng Song, Wanjia Zhao, Jilin Liu, Jinan Sun, Shikun Zhang, Long Wei, Yue Wang, Tailin Wu, Zhi-Ming Ma, Yizhou Sun. CoRR (2024).
Author Statistics
#Papers: 42
#Citation: 1061
H-Index: 15
G-Index: 25
Sociability: 5
Diversity: 1
Activity: 13
Co-Author
Co-Institution
D-Core
- Collaborator
- Student
- Advisor
Data Disclaimer
The page data come from open Internet sources, cooperating publishers, and automatic analysis via AI technology. We make no commitments or guarantees regarding the validity, accuracy, correctness, reliability, completeness, or timeliness of the page data. If you have any questions, please contact us by email: report@aminer.cn