Deep Generative Models that Solve PDEs: Distributed Computing for Training Large Data-Free Models
2020 IEEE/ACM Workshop on Machine Learning in High Performance Computing Environments (MLHPC) and Workshop on Artificial Intelligence and Machine Learning for Scientific Applications (AI4S), 2020
Abstract
Recent progress in scientific machine learning (SciML) has opened up the possibility of training novel neural network architectures that solve complex partial differential equations (PDEs). Several (nearly data-free) approaches have been recently reported that successfully solve PDEs, with examples including deep feed-forward networks, generative networks, and deep encoder-decoder networks. Howeve...
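The "data-free" training the abstract refers to replaces labeled solution data with a loss built from the PDE residual itself, evaluated at collocation points. A minimal sketch of that idea, using a toy one-parameter ansatz in place of the paper's deep networks (the problem, ansatz, and learning rate here are illustrative assumptions, not the paper's setup):

```python
import numpy as np

# Toy "data-free" physics-informed loss: fit a parameter by minimizing
# the residual of -u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0,
# where f(x) = pi^2 sin(pi x), so the exact solution is u = sin(pi x).
# Ansatz (stand-in for a neural network): u(x; c) = c * sin(pi x).

xs = np.linspace(0.01, 0.99, 64)       # collocation points in (0, 1)
f = np.pi**2 * np.sin(np.pi * xs)      # source term

def residual(c):
    # For the ansatz, -u'' = c * pi^2 * sin(pi x); residual of -u'' - f.
    return c * np.pi**2 * np.sin(np.pi * xs) - f

c, lr = 0.0, 1e-4
for _ in range(2000):
    r = residual(c)
    # Gradient of the mean squared residual with respect to c.
    grad = 2.0 * np.mean(r * np.pi**2 * np.sin(np.pi * xs))
    c -= lr * grad

print(round(c, 3))  # converges toward 1.0, i.e. u ≈ sin(pi x)
```

No solution data enters the loop: the loss is driven entirely by the governing equation, which is what makes such models attractive for large-scale distributed training where labeled PDE solutions are expensive to generate.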
Keywords
Deep generative models, Distributed training, PDEs, Loss functions, Cloud vs HPC, Higher-order optimization