Learning Compact Embedding Layers via Differentiable Product Quantization
2019.
Abstract:
Embedding layers are commonly used to map discrete symbols into continuous embedding vectors that reflect their semantic meanings. Despite their effectiveness, the number of parameters in an embedding layer grows linearly with the number of symbols, posing a critical challenge under memory and storage constraints. In this work, we prop…
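To make the memory argument concrete, the sketch below contrasts a dense embedding table with a product-quantized one: each symbol stores a few small integer codes, and vectors are reconstructed by concatenating sub-vectors from shared codebooks. This is a minimal illustration of product quantization in general, not the paper's exact differentiable (end-to-end trained) procedure; all sizes (`n`, `d`, `K`, `C`) are assumed for the example.

```python
import numpy as np

# Illustrative sketch (not the paper's exact DPQ training method):
# a product-quantized embedding table. Each of n symbols stores K small
# integer codes instead of a dense d-dimensional float vector.
n, d = 10000, 64      # vocabulary size, embedding dimension (assumed)
K, C = 4, 32          # number of subspaces, codebook size per subspace
rng = np.random.default_rng(0)

codes = rng.integers(0, C, size=(n, K))        # per-symbol discrete codes
codebooks = rng.normal(size=(K, C, d // K))    # shared sub-vector codebooks

def embed(symbol_ids):
    """Reconstruct embeddings by concatenating the selected sub-vectors."""
    parts = [codebooks[k, codes[symbol_ids, k]] for k in range(K)]
    return np.concatenate(parts, axis=-1)      # shape: (len(symbol_ids), d)

vecs = embed(np.array([3, 7, 42]))
dense_params = n * d                           # 640,000 floats
pq_params = codes.size + codebooks.size        # 40,000 codes + 2,048 floats
print(vecs.shape, dense_params, pq_params)
```

The parameter count of the quantized table scales with `n * K` integer codes plus a codebook whose size is independent of `n`, which is where the compression over the dense `n * d` table comes from.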