No Multiplication? No Floating Point? No Problem! Training Networks for Efficient Inference

arXiv preprint arXiv:1809.09244, 2018.


Abstract:

For successful deployment of deep neural networks on highly resource-constrained devices (hearing aids, earbuds, wearables), we must simplify the types of operations and the memory/power resources used during inference. Completely avoiding inference-time floating-point operations is one of the simplest ways to design networks for these h…
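To make the abstract's central idea concrete, that inference can avoid floating-point operations and multiplications entirely, the sketch below shows one way such a layer could be evaluated. This is a hedged illustration, not the paper's training or quantization scheme: it assumes weights have already been constrained to signed powers of two, so every weight-activation product reduces to an integer bit shift and the layer runs on integer additions and shifts only. The function name and data layout are hypothetical.

```python
import numpy as np

def shift_linear(x, exponents, signs, bias):
    """Multiplication-free, float-free linear layer (illustrative sketch).

    Assumes each weight equals sign * 2**exponent, so a weight-activation
    product becomes an arithmetic shift of an integer activation and the
    accumulation uses integer additions only. Hypothetical example; not
    the quantization scheme used in the paper.
    """
    out = np.zeros(exponents.shape[0], dtype=np.int64)
    for o in range(exponents.shape[0]):          # one output unit at a time
        acc = np.int64(bias[o])                  # integer bias
        for i in range(x.shape[0]):
            term = np.int64(x[i]) << int(exponents[o, i])  # shift instead of multiply
            acc += term if signs[o, i] > 0 else -term
        out[o] = acc
    return out

# Toy usage: integer activations, power-of-two weights, integer bias.
x = np.array([3, -1, 4], dtype=np.int32)
exponents = np.array([[1, 0, 2],
                      [0, 3, 1]])
signs = np.array([[+1, -1, +1],
                  [-1, +1, +1]])
bias = np.array([0, 5])
print(shift_linear(x, exponents, signs, bias))   # -> [23  2], computed without floats
```

Note that this sketch covers only the linear part; as the abstract indicates, a full integer-only deployment must also simplify the activations and in-network non-linearities.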
