Robust Gradient Descent Via Moment Encoding And Ldpc Codes
2019 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT)(2019)
Abstract
This paper considers the problem of implementing large-scale gradient descent algorithms in a distributed computing setting in the presence of straggling processors. To mitigate the effect of the stragglers, it has previously been proposed to encode the data with an erasure-correcting code and decode at the master server at the end of the computation. We instead propose to encode the second moment of the data with a low-density parity-check (LDPC) code. The iterative decoding algorithms for LDPC codes have very low computational overhead, and the number of decoding iterations can be made to adjust automatically with the number of stragglers in the system. Under a random straggler model, we obtain convergence guarantees for the proposed solution by viewing it as a stochastic gradient descent method. Furthermore, the proposed solution outperforms existing schemes in a real distributed computing setup.
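The moment-encoding idea can be illustrated with least-squares regression: for f(w) = ½‖Xw − y‖², the gradient (X᙭X)w − X᙭y depends on the data only through the second moment M = X᙭X and the vector b = X᙭y, so M can be computed once up front and (in the paper's scheme) LDPC-encoded across workers. A minimal single-machine sketch of this observation, with the encoding, worker distribution, and straggler decoding omitted; all variable names here are illustrative:

```python
import numpy as np

# Synthetic least-squares instance (illustrative only).
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true

# Precompute the second moment and cross term once; gradient
# descent never needs to touch X again after this point.
M = X.T @ X          # second moment -- the quantity encoded in the paper
b = X.T @ y

w = np.zeros(d)
lr = 1.0 / np.linalg.norm(M, 2)   # step size 1/L from the largest eigenvalue
for _ in range(500):
    w -= lr * (M @ w - b)         # gradient uses only M and b
```

In the distributed version, each worker holds LDPC-coded rows of M, and the master recovers M·w from whichever coded partial products arrive first.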
Keywords
master server,low density parity-check code,LDPC codes,low computational overhead,decoding iterations,stochastic gradient descent method,distributed computing setup,robust gradient descent,moment encoding,large-scale gradient descent algorithms,distributed computing setting,straggling processors,erasure-correcting code,iterative decoding algorithms,convergence guarantees