Time-Space Lower Bounds for Two-Pass Learning.

Leibniz International Proceedings in Informatics (2019)

Abstract
A line of recent works showed that for a large class of learning problems, any learning algorithm requires either super-linear memory size or a super-polynomial number of samples [11, 7, 12, 9, 2, 5]. For example, any algorithm for learning parities of size $n$ requires either a memory of size $\Omega(n^2)$ or an exponential number of samples [11]. All these works modeled the learner as a one-pass branching program, allowing only one pass over the stream of samples. In this work, we prove the first memory-samples lower bounds (with a super-linear lower bound on the memory size and a super-polynomial lower bound on the number of samples) when the learner is allowed two passes over the stream of samples. For example, we prove that any two-pass algorithm for learning parities of size $n$ requires either a memory of size $\Omega(n^{1.5})$ or at least $2^{\Omega(\sqrt{n})}$ samples. More generally, a matrix $M : A \times X \to \{-1, 1\}$ corresponds to the following learning problem: an unknown element $x \in X$ is chosen uniformly at random, and a learner tries to learn $x$ from a stream of samples $(a_1, b_1), (a_2, b_2), \ldots$, where for every $i$, $a_i \in A$ is chosen uniformly at random and $b_i = M(a_i, x)$. Assume that $k, \ell, r$ are such that any submatrix of $M$ with at least $2^{-k} \cdot |A|$ rows and at least $2^{-\ell} \cdot |X|$ columns has a bias of at most $2^{-r}$. We show that any two-pass learning algorithm for the learning problem corresponding to $M$ requires either a memory of size at least $\Omega(k \cdot \min\{k, \sqrt{\ell}\})$ or at least $2^{\Omega(\min\{k, \sqrt{\ell}, r\})}$ samples.
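To make the setup concrete, below is a minimal Python sketch of the streaming parity-learning problem described in the abstract. The parity size n = 16, the variable names, and the sample_stream helper are illustrative assumptions, not from the paper; the sketch only shows how the stream $(a_i, b_i)$ with $b_i = M(a_i, x)$ is generated, not the paper's lower-bound argument.

import numpy as np

# Sketch of the parity learning problem: a hidden x in {0,1}^n is fixed,
# and each sample is a uniformly random a_i in {0,1}^n together with the
# label b_i = <a_i, x> mod 2, mapped to {-1, +1} to match M : A x X -> {-1, 1}.
rng = np.random.default_rng(0)
n = 16                                  # parity size (illustrative choice)
x = rng.integers(0, 2, size=n)          # unknown x, chosen uniformly at random

def sample_stream(num_samples):
    """Yield (a_i, b_i) pairs with b_i = (-1)^(<a_i, x> mod 2)."""
    for _ in range(num_samples):
        a = rng.integers(0, 2, size=n)  # a_i uniform over {0,1}^n
        b = (-1) ** (int(a @ x) % 2)    # label in {-1, +1}
        yield a, b

for a, b in sample_stream(3):
    print(a, b)

A learner with unbounded memory can recover x by Gaussian elimination over GF(2) once it has collected roughly n linearly independent equations; the paper's result is that a two-pass learner with memory of size $o(n^{1.5})$ instead needs $2^{\Omega(\sqrt{n})}$ samples.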
Keywords
branching program,time-space tradeoffs,two-pass streaming,PAC learning,lower bounds