
Article Information

  • Title: Time-Space Lower Bounds for Two-Pass Learning
  • Authors: Sumegha Garg ; Ran Raz ; Avishay Tal
  • Journal: Electronic Colloquium on Computational Complexity
  • Print ISSN: 1433-8092
  • Year: 2019
  • Volume: 2019
  • Pages: 1-42
  • Publisher: Universität Trier, Lehrstuhl für Theoretische Computer-Forschung
  • Abstract:

    A line of recent works showed that for a large class of learning problems, any learning algorithm requires either super-linear memory size or a super-polynomial number of samples [Raz16, KRT17, Raz17, MM18, BOGY18, GRT18]. For example, any algorithm for learning parities of size n requires either a memory of size Ω(n²) or an exponential number of samples [Raz16].

    All these works modeled the learner as a one-pass branching program, allowing only one pass over the stream of samples. In this work, we prove the first memory-samples lower bounds (with a super-linear lower bound on the memory size and a super-polynomial lower bound on the number of samples) when the learner is allowed two passes over the stream of samples. For example, we prove that any two-pass algorithm for learning parities of size n requires either a memory of size Ω(n^{1.5}) or at least 2^{Ω(n)} samples.

    More generally, a matrix M : A × X → {−1, 1} corresponds to the following learning problem: An unknown element x ∈ X is chosen uniformly at random. A learner tries to learn x from a stream of samples, (a_1, b_1), (a_2, b_2), …, where for every i, a_i ∈ A is chosen uniformly at random and b_i = M(a_i, x).
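    As an illustration of this template, parity learning corresponds to A = X = {0, 1}ⁿ with M(a, x) = (−1)^{⟨a, x⟩ mod 2}. A minimal sketch of generating such a sample stream (helper names are hypothetical, not from the paper):

```python
import random

def parity_matrix_entry(a, x):
    # M(a, x) = (-1)^{<a, x> mod 2}: +1 if the inner product is even, -1 if odd.
    inner = sum(ai * xi for ai, xi in zip(a, x)) % 2
    return 1 - 2 * inner

def sample_stream(x, num_samples, rng=random):
    """Yield samples (a_i, b_i): a_i uniform over {0,1}^n, b_i = M(a_i, x)."""
    n = len(x)
    for _ in range(num_samples):
        a = [rng.randrange(2) for _ in range(n)]
        yield a, parity_matrix_entry(a, x)
```

    A one-pass learner sees this stream once; the two-pass model of the paper allows reading the same stream a second time.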

    Assume that k, ℓ, r are such that any submatrix of M of at least 2^{−k} · |A| rows and at least 2^{−ℓ} · |X| columns has a bias of at most 2^{−r}. We show that any two-pass learning algorithm for the learning problem corresponding to M requires either a memory of size at least Ω(k · min{k, ℓ}), or at least 2^{Ω(min{k, ℓ, r})} samples.
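    Here the bias of a submatrix of a ±1 matrix can be read as the absolute value of the average of its entries. A small sketch of computing it for one submatrix (function name is hypothetical):

```python
def submatrix_bias(M, rows, cols):
    """Bias of the submatrix of the +/-1 matrix M indexed by rows x cols:
    the absolute value of the average entry."""
    total = sum(M[a][x] for a in rows for x in cols)
    return abs(total) / (len(rows) * len(cols))
```

    The theorem's condition asks that this value be at most 2^{−r} for every sufficiently large submatrix, i.e. every choice of at least 2^{−k}·|A| rows and 2^{−ℓ}·|X| columns.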

  • Keywords: branching program ; lower bounds ; PAC learning ; time-space tradeoff ; two-pass streaming