Using downhill simplex method for optimizing machine learning training running time

Semantic Scholar (2016)

Abstract
Many modern machine learning algorithms rely on a set of configuration parameters, and the choice of their values influences both the accuracy and the running time of the algorithm. We propose an automated approach based on the Downhill Simplex optimization method for calculating the parameter set that is optimal in terms of training time. We demonstrated 5X-30X speedups for training the text analytics algorithm word2vec, simply by using better configuration parameters.
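The Downhill Simplex (Nelder-Mead) method referenced above is a derivative-free optimizer, which makes it a natural fit when the objective is a measured training time rather than a differentiable function. The sketch below is an illustrative reimplementation, not the authors' code: `training_time` is a hypothetical smooth surrogate for measured word2vec training time over two made-up configuration parameters, and the optimizer itself uses the standard reflection/expansion/contraction/shrink steps.

```python
def nelder_mead(f, x0, step=0.5, tol=1e-6, max_iter=500):
    """Minimize f over a list-of-floats parameter vector via Downhill Simplex."""
    alpha, gamma, rho, sigma = 1.0, 2.0, 0.5, 0.5  # standard coefficients
    n = len(x0)
    # Initial simplex: the start point plus n points perturbed along each axis.
    simplex = [list(x0)]
    for i in range(n):
        p = list(x0)
        p[i] += step
        simplex.append(p)
    scores = [f(p) for p in simplex]
    for _ in range(max_iter):
        # Order vertices from best (lowest time) to worst.
        order = sorted(range(n + 1), key=lambda i: scores[i])
        simplex = [simplex[i] for i in order]
        scores = [scores[i] for i in order]
        if abs(scores[-1] - scores[0]) < tol:
            break
        # Centroid of all vertices except the worst one.
        centroid = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]
        worst = simplex[-1]
        # Reflection of the worst vertex through the centroid.
        xr = [centroid[j] + alpha * (centroid[j] - worst[j]) for j in range(n)]
        fr = f(xr)
        if scores[0] <= fr < scores[-2]:
            simplex[-1], scores[-1] = xr, fr
        elif fr < scores[0]:
            # Expansion: the reflected point is the new best, so push further.
            xe = [centroid[j] + gamma * (xr[j] - centroid[j]) for j in range(n)]
            fe = f(xe)
            simplex[-1], scores[-1] = (xe, fe) if fe < fr else (xr, fr)
        else:
            # Contraction toward the centroid.
            xc = [centroid[j] + rho * (worst[j] - centroid[j]) for j in range(n)]
            fc = f(xc)
            if fc < scores[-1]:
                simplex[-1], scores[-1] = xc, fc
            else:
                # Shrink the whole simplex toward the best vertex.
                best = simplex[0]
                for i in range(1, n + 1):
                    simplex[i] = [best[j] + sigma * (simplex[i][j] - best[j])
                                  for j in range(n)]
                    scores[i] = f(simplex[i])
    return simplex[0], scores[0]


def training_time(params):
    """Hypothetical surrogate: pretend training time (seconds) as a function of
    two tunable parameters, minimized at params = (5, 10)."""
    w, neg = params
    return (w - 5.0) ** 2 + 0.1 * (neg - 10.0) ** 2 + 30.0


best_params, best_time = nelder_mead(training_time, [12.0, 40.0])
# The simplex walks downhill from the poor initial configuration (12, 40)
# toward the surrogate's optimum near (5, 10).
```

In the paper's setting, `training_time` would instead run an actual (short, subsampled) word2vec training job with the candidate configuration and return the measured wall-clock time; the derivative-free search needs only those measurements.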