Localization of VC Classes: Beyond Local Rademacher Complexities.

Theoretical Computer Science (2018)

Cited by 20 | Views 57
Abstract
In statistical learning, the excess risk of empirical risk minimization (ERM) is controlled by (COMP_n(F)/n)^α, where n is the size of the learning sample, COMP_n(F) is a complexity term associated with a given class F, and α ∈ [1/2, 1] interpolates between slow and fast learning rates. In this paper we introduce an alternative localization approach for binary classification that leads to a novel complexity measure: fixed points of the local empirical entropy. We show that this complexity measure gives tight control over COMP_n(F) in the upper bounds under bounded noise. Our results are accompanied by a minimax lower bound that involves the same quantity. In particular, we practically answer the question of the optimality of ERM under bounded noise for general VC classes.
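The bound described in the abstract can be restated schematically as follows (a sketch based only on the quantities named above; the exact constants, expectations, and the definition of COMP_n(F) are given in the paper itself):

```latex
% Schematic form of the excess risk bound for ERM.
% \hat{f} denotes the empirical risk minimizer over the class F (an assumed notation).
\mathbb{E}\, R(\hat{f}) \;-\; \inf_{f \in F} R(f)
  \;\lesssim\; \left( \frac{\mathrm{COMP}_n(F)}{n} \right)^{\alpha},
  \qquad \alpha \in \left[ \tfrac{1}{2},\, 1 \right].
% \alpha = 1/2 corresponds to the slow rate n^{-1/2};
% \alpha = 1 corresponds to the fast rate n^{-1}, attainable under bounded (Massart) noise.
```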
Keywords
Statistical learning, PAC learning, Local metric entropy, Local Rademacher process, Shifted empirical process, Offset Rademacher process, ERM, Alexander's capacity, Disagreement coefficient, Massart's noise condition