Dropout: Explicit Forms And Capacity Control

International Conference on Machine Learning (ICML), Vol. 139, 2021

Cited by 30 | Views 114
Abstract
We investigate the capacity control provided by dropout in various machine learning problems. First, we study dropout for matrix completion, where it induces a distribution-dependent regularizer that equals the weighted trace-norm of the product of the factors. In deep learning, we show that the distribution-dependent regularizer due to dropout directly controls the Rademacher complexity of the underlying class of deep neural networks. These developments enable us to give concrete generalization error bounds for the dropout algorithm, both for matrix completion and for training deep neural networks.
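The abstract's claim that dropout induces an explicit, distribution-dependent regularizer can be illustrated numerically for the matrix-factorization case. The sketch below is an assumption-laden toy (shapes, the keep probability `p`, and the inverted-dropout scaling are all illustrative, not taken from the paper): it Monte-Carlo estimates the expected dropout objective when latent dimensions of the factors `U, V` are dropped, and compares it against the closed form "plain squared loss plus a term proportional to sum_j ||u_j||^2 ||v_j||^2", which is the standard derivation of the kind of product-of-factors regularizer the abstract relates to the weighted trace-norm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem: factor a target M (n x m) as U @ V.T with rank r.
n, m, r = 8, 6, 4
p = 0.8                                   # keep probability (dropout rate 1 - p)
M = rng.normal(size=(n, m))
U = rng.normal(size=(n, r))
V = rng.normal(size=(m, r))

def dropout_loss(b):
    """Squared loss with latent dimensions masked by b, inverted-dropout scaled."""
    return np.sum(((U * (b / p)) @ V.T - M) ** 2)

# Monte-Carlo estimate of the expected dropout objective E_b ||U diag(b/p) V.T - M||^2.
samples = [dropout_loss(rng.random(r) < p) for _ in range(50_000)]
mc = np.mean(samples)

# Explicit form: plain loss + distribution-dependent regularizer.
# Per entry, Var(b_j/p) = (1 - p)/p, giving the sum_j ||u_j||^2 ||v_j||^2 term.
plain = np.sum((U @ V.T - M) ** 2)
reg = (1 - p) / p * sum(
    np.sum(U[:, j] ** 2) * np.sum(V[:, j] ** 2) for j in range(r)
)
closed = plain + reg
```

With enough Monte-Carlo samples, `mc` and `closed` agree to within sampling noise, making the "explicit form" of the dropout regularizer concrete for this toy instance.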
Keywords
dropout,capacity,explicit forms,control