Regularizing Neural Networks via Stochastic Branch Layers
ACML, pp. 678-693, 2019.
Abstract:
We introduce a novel stochastic regularization technique for deep neural networks, which decomposes a layer into multiple branches with different parameters and merges stochastically sampled combinations of the outputs from the branches during training. Since the factorized branches can collapse into a single branch through a linear ope…
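The idea described in the abstract can be sketched in code: a single linear layer is factorized into `k` branches with separate parameters, a random combination of the branch outputs is used during training, and the branches collapse into one linear operation (the expected combination) at inference. This is a minimal NumPy illustration, not the paper's implementation; the class name `StochasticBranchLinear`, the choice of `k`, and the use of Dirichlet-sampled mixing weights are all assumptions made for the sketch.

```python
import numpy as np

class StochasticBranchLinear:
    """Illustrative sketch (not the paper's code): a linear layer
    factorized into k branches whose outputs are merged via a
    stochastically sampled combination during training, and via the
    expected (uniform) combination at inference time."""

    def __init__(self, in_dim, out_dim, k=4, rng=None):
        self.rng = rng or np.random.default_rng(0)
        self.k = k
        # k independent parameter sets for the same layer shape.
        self.W = self.rng.normal(0.0, 0.1, size=(k, out_dim, in_dim))

    def forward(self, x, training=True):
        if training:
            # Sample random convex mixing weights over the branches
            # (Dirichlet sampling is an assumption for this sketch).
            c = self.rng.dirichlet(np.ones(self.k))
        else:
            # The branches collapse into a single linear operation:
            # the uniform average of the branch weights.
            c = np.full(self.k, 1.0 / self.k)
        # Because the merge is linear, mixing outputs equals applying
        # the mixed weight matrix once.
        W_eff = np.tensordot(c, self.W, axes=1)  # (out_dim, in_dim)
        return x @ W_eff.T
```

Because the branch outputs are combined linearly, the inference-time layer is exactly equivalent to one ordinary linear layer with the averaged weights, so the stochastic branching adds no cost after training.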