No Multiplication? No Floating Point? No Problem! Training Networks for Efficient Inference
arXiv preprint arXiv:1809.09244, 2018.
For successful deployment of deep neural networks on highly resource-constrained devices (hearing aids, earbuds, wearables), we must simplify the types of operations and the memory/power resources used during inference. Completely avoiding inference-time floating-point operations is one of the simplest ways to design networks for these h…
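As a rough illustration of the idea in the title (not the paper's actual method), one common way to eliminate both floating-point arithmetic and multiplications at inference time is to constrain each weight to a signed power of two, so a multiply `w * x` becomes an integer shift of `x`. The function names below are hypothetical; this is a minimal sketch under that power-of-two assumption:

```python
import math

def quantize_weight_to_pow2(w, min_exp=-7, max_exp=0):
    """Round |w| to the nearest power of two in [2**min_exp, 2**max_exp].

    Returns (sign, exponent); a hypothetical helper, not from the paper.
    """
    if w == 0:
        return 0, 0
    sign = 1 if w > 0 else -1
    exp = round(math.log2(abs(w)))
    exp = max(min_exp, min(max_exp, exp))
    return sign, exp

def shift_dot(x_ints, weights):
    """Multiplication-free dot product over integer activations.

    Each product becomes an arithmetic shift plus a signed accumulate,
    so inference needs only integer shifts and adds.
    """
    acc = 0
    for x, w in zip(x_ints, weights):
        sign, exp = quantize_weight_to_pow2(w)
        term = (x << exp) if exp >= 0 else (x >> -exp)
        acc += sign * term
    return acc
```

For example, with activations `[4, 8]` and weights `[0.5, 1.0]`, the weights quantize exactly to exponents `-1` and `0`, so the dot product `4*0.5 + 8*1.0 = 10` is computed as `(4 >> 1) + 8` using no multiplies and no floats in the accumulation path.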