
Article Information

  • Title: Fundamental bounds on learning performance in neural circuits
  • Authors: Dhruva Venkita Raman; Adriana Perez Rotondo; Timothy O’Leary
  • Journal: Proceedings of the National Academy of Sciences
  • Print ISSN: 0027-8424
  • Electronic ISSN: 1091-6490
  • Year: 2019
  • Volume: 116
  • Issue: 21
  • Pages: 10537-10546
  • DOI: 10.1073/pnas.1813416116
  • Publisher: The National Academy of Sciences of the United States of America
  • Abstract: How does the size of a neural circuit influence its learning performance? Larger brains tend to be found in species with higher cognitive function and learning ability. Intuitively, we expect the learning capacity of a neural circuit to grow with the number of neurons and synapses. We show how adding apparently redundant neurons and connections to a network can make a task more learnable. Consequently, large neural circuits can either devote connectivity to generating complex behaviors or exploit this connectivity to achieve faster and more precise learning of simpler behaviors. However, we show that in a biologically relevant setting where synapses introduce an unavoidable amount of noise, there is an optimal size of network for a given task. Above the optimal network size, the addition of neurons and synaptic connections starts to impede learning performance. This suggests that the size of brain circuits may be constrained by the need to learn efficiently with unreliable synapses and provides a hypothesis for why some neurological learning deficits are associated with hyperconnectivity. Our analysis is independent of specific learning rules and uncovers fundamental relationships between learning rate, task performance, network size, and intrinsic noise in neural circuits.
  • Keywords: learning; neural network; synaptic plasticity; optimization; artificial intelligence
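
The abstract's central tradeoff, where redundant connections speed up learning while per-synapse noise imposes an optimal network size, can be illustrated with a toy simulation. The sketch below is not the paper's model: the summed-redundant-weights setup and every parameter value (eta, sigma, the step budget) are illustrative assumptions chosen only to make the qualitative effect visible. It trains n redundant synapses whose summed output must match a scalar target, using gradient descent corrupted by independent per-synapse update noise.

```python
import numpy as np

# Toy illustration (not the paper's model). The output is the sum of n
# redundant synaptic weights, so each gradient step moves the output by
# n * eta * error: more synapses mean faster convergence. But independent
# per-synapse update noise also accumulates across the n weights, raising
# the steady-state error floor. With a fixed training budget, an
# intermediate n minimises the final error.

rng = np.random.default_rng(0)

def final_error(n, eta=0.01, sigma=0.02, steps=100, target=1.0, trials=50):
    """Squared error after a fixed budget of noisy gradient steps, averaged over trials."""
    errs = []
    for _ in range(trials):
        w = np.zeros(n)  # n redundant synapses; the network output is their sum
        for _ in range(steps):
            out = w.sum()
            grad = out - target  # d/dw_i of 0.5 * (out - target)**2, identical for every w_i
            w -= eta * grad                       # deterministic gradient step
            w += sigma * rng.standard_normal(n)   # independent per-synapse noise
        errs.append((w.sum() - target) ** 2)
    return float(np.mean(errs))

for n in [1, 2, 5, 10, 25, 50, 100, 150]:
    print(f"n = {n:3d}  ->  mean squared error = {final_error(n):.4f}")
```

Under these assumptions the printed errors trace a U-shape: very small networks have not converged within the step budget, while very large ones converge quickly but sit on a noise floor that grows with n (and the dynamics destabilise entirely once n * eta approaches 2). This mirrors, in a deliberately minimal setting, the abstract's claim that with unreliable synapses there is an optimal network size for a given task.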