Information Dropout: Learning Optimal Representations Through Noisy Computation.

IEEE Transactions on Pattern Analysis and Machine Intelligence (2018)

Cited by 433 | 85 views
Abstract
The cross-entropy loss commonly used in deep learning is closely related to the defining properties of optimal representations, but does not enforce some of the key properties. We show that this can be solved by adding a regularization term, which is in turn related to injecting multiplicative noise in the activations of a Deep Neural Network, a special case of which is the common practice of dropout...
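The abstract describes the regularizer as injecting multiplicative noise into a network's activations, with standard dropout as a special case. A minimal NumPy sketch of the two noise models (not the paper's actual implementation; function names and the `alpha` parameter are illustrative):

```python
import numpy as np

def multiplicative_gaussian_noise(activations, alpha=0.1, rng=None):
    """Multiply activations by eps ~ N(1, alpha).

    Illustrative sketch of multiplicative (Gaussian) noise injection;
    `alpha` controls the noise variance.
    """
    rng = np.random.default_rng() if rng is None else rng
    eps = rng.normal(loc=1.0, scale=np.sqrt(alpha), size=activations.shape)
    return activations * eps

def binary_dropout(activations, drop_prob=0.5, rng=None):
    """Standard (inverted) dropout, i.e. multiplicative Bernoulli noise.

    Each unit is zeroed with probability drop_prob and the survivors are
    rescaled by 1/(1 - drop_prob) so the expected activation is unchanged.
    """
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(activations.shape) >= drop_prob
    return activations * mask / (1.0 - drop_prob)
```

Both transforms preserve the expected value of the activations; they differ only in the distribution of the multiplicative noise, which is the sense in which dropout is a special case.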
Keywords
Neural networks,Deep learning,Bayes methods,Machine learning,Information theory,Noise measurement,Learning systems