DessiLBI: Exploring Structural Sparsity of Deep Networks via Differential Inclusion Paths
ICML, pp. 3315-3326, 2020.
Over-parameterization is ubiquitous nowadays in training neural networks, benefiting both optimization, by easing the search for global optima, and generalization, by reducing prediction error. However, compressed networks are desired in many real-world applications, and directly training small networks may become trapped in local optima. In this paper, ins...