
Article Information

  • Title: Experimental Demonstration of Supervised Learning in Spiking Neural Networks with Phase-Change Memory Synapses
  • Authors: S. R. Nandakumar ; Irem Boybat ; Manuel Le Gallo
  • Journal: Scientific Reports
  • Online ISSN: 2045-2322
  • Year: 2020
  • Volume: 10
  • Issue: 1
  • Pages: 1-11
  • DOI: 10.1038/s41598-020-64878-5
  • Publisher: Springer Nature
  • Abstract: Spiking neural networks (SNNs) are computational models inspired by the brain's ability to naturally encode and process information in the time domain. The added temporal dimension is believed to render them more computationally efficient than conventional artificial neural networks, though their full computational capabilities are yet to be explored. Recently, in-memory computing architectures based on non-volatile memory crossbar arrays have shown great promise for implementing parallel computations in artificial and spiking neural networks. In this work, we evaluate the feasibility of realizing high-performance event-driven in-situ supervised learning systems using nanoscale and stochastic analog memory synapses. For the first time, the potential of analog memory synapses to generate precisely timed spikes in SNNs is experimentally demonstrated. The experiment targets applications that directly integrate spike-encoded signals from bio-mimetic sensors with in-memory-computing-based learning systems to generate precisely timed control spikes for neuromorphic actuators. More than 170,000 phase-change memory (PCM) synapses from our prototype chip were trained with an event-driven learning rule to generate spike patterns with more than 85% of the spikes within a 25 ms tolerance interval in a 1250 ms long spike pattern. We observe that the accuracy is mainly limited by the imprecision of device programming and the temporal drift of conductance values. We show that an array-level scaling scheme can significantly improve the retention of the trained SNN states in the presence of conductance drift in the PCM. Combining the computational potential of supervised SNNs with the parallel compute power of in-memory computing, this work paves the way for the next generation of efficient brain-inspired systems.
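The array-level scaling scheme the abstract refers to can be illustrated with a minimal sketch. PCM conductance drift is commonly modeled as a power law, G(t) = G0 · (t/t0)^(-ν); under that model, one global multiplicative factor applied to the whole array can undo the average drift at read time. The function names, the drift exponent value, and the conductance range below are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def drift(g0, t, t0=1.0, nu=0.05):
    """Power-law conductance drift commonly used to model PCM devices.

    g0 : conductance at reference time t0; nu : drift exponent (assumed value).
    """
    return g0 * (t / t0) ** (-nu)

def array_scale_compensation(g_drifted, t, t0=1.0, nu_est=0.05):
    """Apply one global scaling factor to the whole array to compensate drift.

    nu_est is an estimate of the (mean) drift exponent; per-device variation
    in nu is not corrected by a single global factor.
    """
    return g_drifted * (t / t0) ** nu_est

rng = np.random.default_rng(0)
g0 = rng.uniform(0.1, 10.0, size=1000)  # initial synaptic conductances (arbitrary units)
t = 1e4                                 # read time, in units of t0

g_t = drift(g0, t)                      # conductances after drift
g_c = array_scale_compensation(g_t, t)  # compensated with a matched exponent estimate

# With a well-matched exponent estimate the compensated array recovers g0
# up to floating-point error; a mismatched estimate leaves a residual error.
print(np.max(np.abs(g_c - g0)))
```

This shows why one scaling factor is attractive in hardware: it can be applied at the array periphery (or folded into neuron parameters) instead of reprogramming every device, while stochastic device-to-device variation in the drift exponent remains a residual error source.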