The intriguing role of module criticality in the generalization of deep networks
ICLR, 2020.
Abstract:
We study the phenomenon that some modules of deep neural networks (DNNs) are more critical than others: rewinding their parameter values back to initialization, while keeping other modules fixed at their trained values, results in a large drop in the network's performance. Our analysis reveals interesting properties of the ...
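The rewinding experiment described in the abstract can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it assumes a PyTorch model, a snapshot of its state dict taken at initialization, a hypothetical `module_name` identifying the module to rewind, and a simple accuracy-based `evaluate` helper, and it reports how much accuracy is lost when that one module is reset to its initial parameters while the rest of the network stays trained.

```python
import copy
import torch


def evaluate(model, data_loader, device="cpu"):
    """Plain classification accuracy over a data loader."""
    model.eval().to(device)
    correct, total = 0, 0
    with torch.no_grad():
        for inputs, targets in data_loader:
            preds = model(inputs.to(device)).argmax(dim=1)
            correct += (preds == targets.to(device)).sum().item()
            total += targets.numel()
    return correct / total


def rewind_drop(trained_model, init_state_dict, module_name,
                data_loader, device="cpu"):
    """Rewind one module's parameters to their values at initialization,
    keep every other module at its trained values, and return the drop
    in accuracy relative to the fully trained model.

    `init_state_dict` is assumed to be a copy of model.state_dict()
    saved before training; `module_name` is the dotted prefix of the
    module whose parameters are rewound (e.g. "layer3.0.conv1").
    """
    baseline = evaluate(trained_model, data_loader, device)

    rewound = copy.deepcopy(trained_model)
    state = rewound.state_dict()
    for name, tensor in init_state_dict.items():
        # Only parameters belonging to the chosen module are rewound.
        if name.startswith(module_name + "."):
            state[name] = tensor.clone()
    rewound.load_state_dict(state)

    return baseline - evaluate(rewound, data_loader, device)
```

Modules whose rewinding causes a large drop would be flagged as critical under this experiment; the paper's full criticality analysis goes beyond this simple accuracy-drop probe.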