Sparse Spatio-Temporal Neural Network for Large-Scale Forecasting.

Big Data (2022)

Abstract
We introduce sSTNN, a sparse and parallelized version of the spatio-temporal neural network (STNN) that enables training on much larger datasets. First, we introduce the model architecture and discuss the modifications we made to enable the use of a sparse data structure and multi-GPU parallelization. Then we present empirical results demonstrating sSTNN's ability to train and run inference on a dataset 17 times larger than STNN can handle. Finally, we discuss the effect of sparsification on runtime and present evidence that sSTNN can achieve upwards of a 117× reduction in memory usage compared to STNN.
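To make the memory argument concrete, the sketch below shows one generic way a sparse data structure reduces storage for large spatio-temporal models: keeping only the nonzero entries of a spatial relation matrix in COO format instead of a dense matrix. This is an illustrative PyTorch example under assumed values (series count, sparsity, latent dimension are hypothetical), not the authors' implementation.

```python
# Illustrative sketch (not code from the paper): a sparse spatial relation
# matrix stored in COO format, applied to latent series states.
import torch

n_series = 10_000          # hypothetical number of spatial series
hidden_dim = 32            # hypothetical latent dimension

# Assume only ~0.1% of pairwise spatial relations are nonzero.
nnz = 100_000
indices = torch.randint(0, n_series, (2, nnz))   # (row, col) index pairs
values = torch.rand(nnz)

# Dense storage would need n_series^2 floats (~400 MB at fp32);
# the COO tensor stores only the nonzero entries and their indices.
W_sparse = torch.sparse_coo_tensor(
    indices, values, (n_series, n_series)
).coalesce()

Z = torch.randn(n_series, hidden_dim)            # latent states of all series
Z_propagated = torch.sparse.mm(W_sparse, Z)      # sparse-dense product
print(Z_propagated.shape)                        # torch.Size([10000, 32])
```

The memory saving grows with the square of the number of series, which is consistent with the large reductions reported in the abstract when most pairwise relations are absent.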
Keywords
neural network, large-scale, spatio-temporal