Volume-preserving Recurrent Neural Networks (VPRNN)

William Taylor-Melanson, Gordon MacDonald, Andrew Godbout

2021 International Joint Conference on Neural Networks (IJCNN)

Abstract
Volume-preserving neural networks (VPNNs) are neural networks composed solely of volume-preserving transformations, with the possible exception of the output layer. In this paper, we present a new type of recurrent neural network (RNN) for sequence processing, named the volume-preserving recurrent neural network (VPRNN) after its feed-forward predecessor, which uses parametrized orthogonal matrices inspired by the transformations employed in VPNNs. Our VPRNN models show promise on the classical addition problem for RNNs, successfully processing sequences of length 10,000. We also present a theoretical result, based on PAC-Bayesian analysis, giving a matrix norm-based generalization gap for VPRNN classifiers that is sublinear in the length of the sequences being processed; this gap constitutes a theoretical improvement over that of typical RNNs. We find that our single-cell VPRNN models improve upon previously proposed unitary and orthogonal RNN architectures (with similar parameter counts) on the sequential and permuted-pixel MNIST classification tasks, and improve over gated baselines (using far fewer parameters) on the sequential IMDB classification task. Further, we compare VPRNNs with several unitary and orthogonal RNNs on the HAR-2 classification task, revealing the possibility of deploying VPRNNs on memory-constrained systems.
Keywords
single-cell VPRNN models, volume-preserving recurrent neural network, volume-preserving transformations, sequence processing, orthogonal matrices, feed-forward predecessor, matrix norm-based generalization gap, PAC-Bayesian analysis, permuted pixel MNIST classification tasks, gated baselines, sequential IMDB classification task, HAR-2 classification task, memory-constrained systems
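
To make the recurrent construction concrete, below is a minimal PyTorch sketch of a recurrent cell whose hidden-to-hidden transition is an orthogonal (and therefore volume-preserving) map built from parametrized 2x2 rotation blocks. The class name RotationBlockRNNCell, the per-pair angle parametrization, the input projection, and the tanh nonlinearity are illustrative assumptions for exposition, not the authors' implementation.

import torch
import torch.nn as nn


class RotationBlockRNNCell(nn.Module):
    """Illustrative RNN cell whose recurrent transition is an orthogonal
    (volume-preserving) map built from parametrized 2x2 rotation blocks.
    This is a hedged sketch, not the paper's reference implementation."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        assert hidden_size % 2 == 0, "hidden_size must be even for 2x2 rotation blocks"
        self.hidden_size = hidden_size
        # One learnable rotation angle per adjacent pair of hidden units
        # (hypothetical parametrization chosen for simplicity).
        self.angles = nn.Parameter(torch.zeros(hidden_size // 2))
        self.input_proj = nn.Linear(input_size, hidden_size)

    def rotate(self, h: torch.Tensor) -> torch.Tensor:
        # Apply an independent 2x2 rotation to each consecutive coordinate pair of h.
        # Every block has determinant 1, so the full linear map preserves volume.
        c, s = torch.cos(self.angles), torch.sin(self.angles)
        pairs = h.view(-1, self.hidden_size // 2, 2)
        x, y = pairs[..., 0], pairs[..., 1]
        rotated = torch.stack((c * x - s * y, s * x + c * y), dim=-1)
        return rotated.reshape(-1, self.hidden_size)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # Rotate the previous state (orthogonal step), add a projected input,
        # then apply a pointwise nonlinearity.
        return torch.tanh(self.rotate(h) + self.input_proj(x))


# Usage: process a batch of sequences one time step at a time.
cell = RotationBlockRNNCell(input_size=1, hidden_size=64)
h = torch.zeros(8, 64)          # batch of 8 sequences, hidden size 64
seq = torch.randn(100, 8, 1)    # 100 time steps of scalar inputs
for x_t in seq:
    h = cell(x_t, h)            # the final h can feed a linear classifier

Because each rotation block has determinant 1, the recurrent matrix neither inflates nor collapses hidden-state volume; this is the property that lets such cells carry information across very long sequences, such as the length-10,000 addition problem mentioned in the abstract.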