Learning Anytime Predictions in Neural Networks via Adaptive Loss Balancing

National Conference on Artificial Intelligence (AAAI), 2019.

Abstract:

This work considers the trade-off between accuracy and test-time computational cost of deep neural networks (DNNs) via anytime predictions from auxiliary predictors. Specifically, we optimize the auxiliary losses jointly in an adaptive weighted sum, where the weights are inversely proportional to the average of each loss. Intuitively…
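A minimal sketch of the adaptive weighting idea described in the abstract, assuming a PyTorch-style training loop; the function name, the exponential-average `momentum`, and `eps` are illustrative choices, not details taken from the paper:

```python
import torch

def adaptive_weighted_sum(losses, running_avgs, momentum=0.99, eps=1e-8):
    """Combine per-depth auxiliary losses into one scalar objective.

    losses       -- list of scalar loss tensors, one per auxiliary predictor
    running_avgs -- list of floats holding an exponential average of each loss
    """
    total = 0.0
    for i, loss in enumerate(losses):
        # Track the typical magnitude of this loss (detached from the graph).
        running_avgs[i] = momentum * running_avgs[i] + (1 - momentum) * loss.item()
        # Weight inversely proportional to the average of the loss, as in the abstract.
        weight = 1.0 / (running_avgs[i] + eps)
        total = total + weight * loss
    return total
```

Under this weighting, each auxiliary loss contributes roughly in proportion to its relative (rather than absolute) value, so losses of very different scales do not dominate the joint objective.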
