Fast Rates for General Unbounded Loss Functions: From ERM to Generalized Bayes

Journal of Machine Learning Research (2020)

Abstract
We present new excess risk bounds for general unbounded loss functions, including log loss and squared loss, where the distribution of the losses may be heavy-tailed. The bounds hold for general estimators, but they are optimized when applied to eta-generalized Bayesian, MDL, and empirical risk minimization estimators. In the case of log loss, the bounds imply convergence rates for generalized Bayesian inference under misspecification in terms of a generalization of the Hellinger metric, as long as the learning rate eta is set correctly. For general loss functions, our bounds rely on two separate conditions: the v-GRIP (generalized reversed information projection) conditions, which control the lower tail of the excess loss, and the newly introduced witness condition, which controls the upper tail. The parameter v in the v-GRIP conditions determines the achievable rate and is akin to the exponent in the Tsybakov margin condition and the Bernstein condition for bounded losses, which the v-GRIP conditions generalize; favorable v in combination with small model complexity leads to Õ(1/n) rates. The witness condition allows us to connect the excess risk to an "annealed" version thereof, by which we generalize several previous results connecting Hellinger and Rényi divergence to KL divergence.
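To make the eta-generalized Bayesian estimator mentioned above concrete, the following is a minimal sketch over a finite model class, assuming the standard definition of the generalized posterior, pi_n(f) proportional to pi_0(f) * exp(-eta * sum_i loss_f(z_i)); the function name generalized_posterior, the toy squared-loss setup, and the choice eta = 0.5 are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def generalized_posterior(losses, prior, eta):
    """eta-generalized Bayesian posterior over a finite hypothesis class.

    losses: (|F|, n) array of per-hypothesis, per-sample losses.
    prior:  (|F|,) array of prior weights (need not be normalized).
    eta:    learning rate controlling how aggressively data reweights the prior.
    Returns normalized posterior weights over the |F| hypotheses.
    """
    # Work in log space: cumulative losses grow with n, so exponentiating
    # directly would underflow for large samples.
    log_w = np.log(prior) - eta * losses.sum(axis=1)
    log_w -= log_w.max()  # shift for numerical stability
    w = np.exp(log_w)
    return w / w.sum()

# Toy usage (hypothetical): three constant predictors under squared loss.
rng = np.random.default_rng(0)
z = rng.normal(loc=0.3, scale=1.0, size=50)          # observations
predictions = np.array([0.0, 0.3, 1.0])              # hypotheses f(x) = c
losses = (z[None, :] - predictions[:, None]) ** 2    # squared loss per (f, z_i)
prior = np.ones(3) / 3
print(generalized_posterior(losses, prior, eta=0.5))
```

For log loss with eta = 1, this update rule recovers ordinary Bayesian posterior weighting; the point of the abstract is that under misspecification one generally needs eta set away from 1 to obtain the stated convergence rates.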
Keywords
Statistical Learning Theory, Fast Rates, PAC-Bayes, Misspecification, Generalized Bayes