Dropout: A Simple Way to Prevent Neural Networks from Overfitting

Abstract

Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious problem in such networks. Large networks are also slow to use, making it difficult to deal with overfitting by combining the predictions of many different large neural nets at test time. Dropout is a technique for addressing this problem. The key idea is to randomly drop units (along with their connections) from the neural network during training. This prevents units from co-adapting too much. During training, dropout samples from an exponential number of different "thinned" networks. At test time, it is easy to approximate the effect of averaging the predictions of all these thinned networks by simply using a single unthinned network that has smaller weights. This significantly reduces overfitting and gives major improvements over other regularization methods. We show that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
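The mechanism the abstract describes is easy to state in code. Below is a minimal NumPy sketch of a dropout layer following the paper's formulation (Bernoulli masking of units during training, scaling of activations by the retention probability at test time); the function name and the `p_retain` parameter are illustrative, not from the paper.

```python
import numpy as np

def dropout_forward(x, p_retain=0.5, train=True, rng=None):
    """Dropout layer sketch: during training each unit is kept with
    probability p_retain; at test time activations are scaled by
    p_retain, approximating the average of the exponentially many
    thinned networks sampled during training."""
    if train:
        rng = rng or np.random.default_rng()
        mask = rng.random(x.shape) < p_retain  # Bernoulli mask over units
        return x * mask                        # drop units and their outgoing signal
    return x * p_retain                        # test-time scaling of the unthinned net
```

For example, with `p_retain=0.5` each hidden unit is kept with probability 0.5 on every training pass, and at test time the same layer's outputs are simply halved, which corresponds to running the full network with smaller weights as the abstract states.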

Journal of Machine Learning Research, 15:1929-1958, 2014.
