Passel: Improved Scalability and Efficiency of Distributed SVM using a Cacheless PGAS Migrating Thread Architecture

2021 12th Workshop on Latest Advances in Scalable Algorithms for Large-Scale Systems (ScalA), 2021

Abstract
Stochastic Gradient Descent (SGD) is a valuable algorithm for large-scale machine learning, but has proven difficult to parallelize on conventional architectures because of communication and memory access issues. The HogWild series of mixed logically distributed and physically multi-threaded algorithms overcomes these issues for problems with sparse characteristics by using multiple local model ve...
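The abstract describes HogWild-style parallel SGD, in which threads update a shared model without locks and sparse data keeps update conflicts rare. As a point of reference only, below is a minimal sketch of that general technique applied to a linear SVM (hinge loss); it is not the paper's Passel/PGAS implementation, and all names and parameters (hogwild_svm, lr, reg, n_threads) are illustrative assumptions.

```python
# Hypothetical sketch of HogWild-style lock-free SGD for a linear SVM.
# Threads update the shared weight vector w without synchronization;
# with sparse rows, concurrent updates rarely touch the same coordinates.
import threading
import numpy as np

def hogwild_svm(X_rows, y, dim, n_threads=4, lr=0.01, reg=1e-4, epochs=5):
    """X_rows: list of sparse rows as (indices, values) numpy-array pairs."""
    w = np.zeros(dim)  # shared model vector, updated lock-free

    def worker(sample_ids):
        for _ in range(epochs):
            for i in sample_ids:
                idx, vals = X_rows[i]
                margin = y[i] * np.dot(w[idx], vals)
                if margin < 1.0:  # hinge-loss subgradient step
                    # Update touches only this sample's nonzero coordinates,
                    # the core of the HogWild sparsity argument.
                    w[idx] += lr * (y[i] * vals - reg * w[idx])

    chunks = np.array_split(np.arange(len(y)), n_threads)
    threads = [threading.Thread(target=worker, args=(c,)) for c in chunks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return w
```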
Keywords
Support vector machines,Machine learning algorithms,Instruction sets,Scalability,Memory management,Stochastic processes,Computer architecture