SAdam: A Variant of Adam for Strongly Convex Functions

ICLR, 2020.


Abstract:

The Adam algorithm has become extremely popular for large-scale machine learning. Under convexity conditions, it has been proved to enjoy a data-dependent $O(\sqrt{T})$ regret bound, where $T$ is the time horizon. However, whether strong convexity can be utilized to further improve the performance remains an open problem. In this paper, we ...
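
For reference, the sketch below shows the standard Adam update that the abstract builds on; this is a minimal illustration of the original Adam method (Kingma & Ba), not the SAdam variant proposed in the paper, whose update rule is not given in the truncated abstract.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of the standard Adam update on parameters `theta`.

    m, v are the running first- and second-moment estimates; t is the
    1-indexed step count used for bias correction. Shown only as background
    for the abstract; SAdam modifies this scheme for strongly convex losses.
    """
    m = beta1 * m + (1 - beta1) * grad          # exponential moving average of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2     # exponential moving average of squared gradients
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```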
