Evaluation of Parameter Update Effects in Deep Semi-Supervised Learning Algorithms.

2020 IEEE 44TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE (COMPSAC 2020)(2020)

Abstract
Semi-Supervised Machine Learning (SSML) algorithms combine unsupervised and supervised learning algorithms, which enables them to learn from both labelled and unlabelled data. One challenge is identifying the key factors from both kinds of algorithms that contribute to learning performance, in terms of training time, training loss, and accuracy. Previously, researchers have adopted Deep Neural Networks (DNNs) to construct the core learning models of SSML algorithms with improved accuracy. However, there is still a lack of systematic study of the key contributing factors and their effects. In this paper, we generalize the common components of SSML algorithms from state-of-the-art models (Π-Model, Temporal Ensembling, and Mean-Teacher). We form a conceptual Semi-Supervised Computation Graph (SSCG) and inject different kinds of DNNs into the network-classifier component of the computation graph. This combination highlights two major aspects to investigate: (1) parameter updates during training across labelled and unlabelled data; and (2) the ratio of labelled to unlabelled data. We performed 27 experiments with 3 SSML algorithms, 3 DNNs, and 3 different ratios of labelled to unlabelled data. Our experimental results demonstrate that parameter updates are a dominant factor in training loss and learning precision. The experiments show that training loss is lowered by 6% and precision is increased by 4.21% when using shake-shake26 as the network classifier in the Mean-Teacher SSML algorithm, compared with all other combinations. We also observed a positive correlation (R = 0.69, p = 0.03887) between training time and the ratio of labelled to unlabelled data: introducing more labelled data leads to longer training time, as it triggers more parameter updates in back- and forward-propagation.
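For readers unfamiliar with the parameter-update mechanism the abstract refers to, the following is a minimal, illustrative sketch (not the paper's implementation) of the two updates at the core of Mean-Teacher-style SSML training: a teacher network whose weights are an exponential moving average (EMA) of the student's, and a combined loss of supervised cross-entropy on the labelled subset plus a consistency term between student and teacher predictions on all data. The function names, the MSE consistency term, and the `alpha`/`cons_weight` values are assumptions chosen for clarity.

```python
import numpy as np

def ema_update(teacher_w, student_w, alpha=0.99):
    """Mean-Teacher-style update: after each student parameter update,
    the teacher weights are moved toward the student weights via an
    exponential moving average controlled by alpha."""
    return alpha * teacher_w + (1 - alpha) * student_w

def combined_loss(student_pred, teacher_pred, labels, mask, cons_weight=1.0):
    """Supervised cross-entropy on labelled samples (mask == 1) plus an
    MSE consistency term between student and teacher predictions on all
    samples, labelled and unlabelled alike.

    student_pred, teacher_pred: (N, C) predicted class probabilities
    labels: (N, C) one-hot targets (ignored where mask == 0)
    mask:   (N,)  1.0 for labelled samples, 0.0 for unlabelled
    """
    eps = 1e-12  # avoid log(0)
    ce = -np.sum(labels * np.log(student_pred + eps), axis=1)
    supervised = np.sum(ce * mask) / max(mask.sum(), 1.0)
    consistency = np.mean((student_pred - teacher_pred) ** 2)
    return supervised + cons_weight * consistency
```

Because the consistency term is computed over unlabelled samples as well, every training batch triggers parameter updates regardless of labels, which is consistent with the abstract's observation that the labelled-to-unlabelled ratio drives both update count and training time.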
Keywords
Semi-Supervised Machine Learning, Convolution Neural Networks, Deep Learning