On the Benefits of Multiple Gossip Steps in Communication-Constrained Decentralized Optimization

Abolfazl Hashemi
Anish Acharya
Rudrajit Das
Other Links: arxiv.org

Abstract:

In decentralized optimization, it is common algorithmic practice to have nodes interleave (local) gradient descent iterations with gossip (i.e., averaging over the network) steps. Motivated by the training of large-scale machine learning models, it is also increasingly common to require that messages be lossy compressed versions of...
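The compressed-gossip primitive the abstract alludes to can be illustrated with a small, self-contained sketch. This is a toy illustration, not the paper's algorithm: it runs average consensus over a 4-node ring in which each node broadcasts a top-k compressed message and maintains a local estimate variable `x_hat`, a difference-compression scheme in the spirit of CHOCO-Gossip. All names and parameter choices (`top_k`, `gamma`, the ring mixing matrix) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 10                         # 4 nodes, 10-dimensional parameters
x = rng.normal(size=(n, d))          # each node's current vector
target_mean = x.mean(axis=0).copy()  # gossip should drive all nodes toward this

def top_k(v, k=3):
    """Lossy compression: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

# Doubly stochastic mixing matrix for a 4-node ring (rows/cols sum to 1).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

x_hat = np.zeros_like(x)  # publicly known estimates of each node's vector
gamma = 0.1               # gossip step size (illustrative choice)
init_err = np.linalg.norm(x - x.mean(axis=0))

for _ in range(1000):
    # Each node transmits only a compressed correction to its public estimate.
    q = np.stack([top_k(x[i] - x_hat[i]) for i in range(n)])
    x_hat = x_hat + q
    # Gossip step: move toward the neighborhood average of the public estimates.
    x = x + gamma * (W @ x_hat - x_hat)

final_err = np.linalg.norm(x - x.mean(axis=0))
print(f"consensus error: {init_err:.3f} -> {final_err:.3e}")
```

Because `W` is doubly stochastic, the update preserves the network-wide average, so the nodes contract toward the true mean even though every transmitted message is sparsified; repeating the gossip step more times per round (the paper's focus) would tighten consensus further between gradient updates.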
