Fast Rates of ERM and Stochastic Approximation: Adaptive to Error Bound Conditions

Advances in Neural Information Processing Systems 31 (NeurIPS 2018)

Abstract
Error bound conditions (EBC) are properties that characterize the growth of an objective function as a point moves away from the optimal set. They have recently received increasing attention for developing optimization algorithms with fast convergence. However, studies of EBC in statistical learning remain limited. The main contributions of this paper are two-fold. First, we develop fast and intermediate rates of empirical risk minimization (ERM) under EBC for risk minimization with Lipschitz continuous and with smooth convex random functions. Second, we establish fast and intermediate rates of an efficient stochastic approximation (SA) algorithm for risk minimization with Lipschitz continuous random functions, which requires only one pass over n samples and adapts to EBC. For both approaches, the convergence rates span a full spectrum between Õ(1/√n) and Õ(1/n) depending on the power constant in EBC, and can be even faster than O(1/n) in special cases for ERM. Moreover, these convergence rates are achieved adaptively, without using any knowledge of EBC.
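For readers unfamiliar with EBC, a commonly used formulation can make the abstract's "power constant" concrete. The sketch below is an illustrative form assuming standard notation (F for the risk, W_* for its optimal set, θ for the power constant); the paper's exact constants and exponent convention may differ.

```latex
% Illustrative error bound condition with power constant \theta \in (0, 1]:
% the squared distance of w to the optimal set \mathcal{W}_* is bounded
% by the optimality gap raised to the power \theta.
\operatorname{dist}^2(\mathbf{w}, \mathcal{W}_*)
  \;\le\; c \,\bigl( F(\mathbf{w}) - F_* \bigr)^{\theta},
  \qquad \theta \in (0, 1].
```

Under such a condition, rates of the form Õ(n^{-1/(2-θ)}) interpolate between the two endpoints named in the abstract: θ → 0 recovers the slow rate Õ(1/√n), while θ = 1 yields the fast rate Õ(1/n). This interpolation is given here only as a hedged illustration of how a power constant produces a spectrum of rates, not as the paper's stated theorem.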
Keywords
empirical risk minimization, objective function, stochastic approximation, full spectrum, special cases, statistical learning, Lipschitz continuous