Article Information

  • Title: 層の削除と再学習によるResNetのモデル圧縮 (Model Compression of ResNet by Layer Erasure and Retraining)
  • Authors: 井田 安俊 (Yasutoshi Ida); 藤原 靖宏 (Yasuhiro Fujiwara)
  • Journal: 人工知能学会論文誌 (Transactions of the Japanese Society for Artificial Intelligence)
  • Print ISSN: 1346-0714
  • Electronic ISSN: 1346-8030
  • Year of publication: 2020
  • Volume: 35
  • Issue: 3
  • Pages: 1-10
  • DOI: 10.1527/tjsai.C-JA3
  • Publisher: The Japanese Society for Artificial Intelligence
  • Abstract: Residual Networks with convolutional layers are widely used in the field of machine learning. Because they effectively extract features from input data by stacking multiple layers, they can achieve high accuracy in many applications. However, stacking many layers raises their computation costs. To address this problem, we propose Network Implosion, which erases multiple layers from Residual Networks without degrading accuracy. Our key idea is to introduce a priority term that identifies the importance of each layer; we can select unimportant layers according to the priority and erase them after training. In addition, we retrain the networks to avoid critical drops in accuracy after layer erasure. Our experiments show that, for classification on CIFAR-10/100 and ImageNet, Network Implosion can reduce the number of layers by 24.00% to 42.86% without any drop in accuracy (see the sketch after this list).
  • Keywords: deep learning; model compression; residual networks; deep neural networks
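
The abstract describes the procedure only at a high level: attach a priority term to each layer, erase the lowest-priority layers after training, then retrain. Below is a minimal sketch in PyTorch of how such priority-guided layer erasure could look. The names (PriorityBlock, erase_lowest_priority) and the choice of a learnable scalar gate as the priority term are illustrative assumptions, not the paper's exact formulation.

```python
# A hedged sketch of priority-guided layer erasure, assuming a learnable
# scalar gate on each residual branch as the "priority term". This is an
# illustration of the idea in the abstract, not the authors' implementation.
import torch
import torch.nn as nn


class PriorityBlock(nn.Module):
    """Residual block whose residual branch is scaled by a learnable priority."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        # Learnable priority: a small magnitude suggests the block contributes little.
        self.priority = nn.Parameter(torch.ones(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Identity shortcut plus priority-scaled residual branch.
        return torch.relu(x + self.priority * self.body(x))


def erase_lowest_priority(blocks: nn.ModuleList, n_erase: int) -> nn.ModuleList:
    """Drop the n_erase blocks with the smallest |priority|.

    Because each block has an identity shortcut, removing a block is
    equivalent to replacing it with the identity mapping.
    """
    ranked = sorted(blocks, key=lambda b: b.priority.abs().item())
    erased = set(id(b) for b in ranked[:n_erase])
    return nn.ModuleList(b for b in blocks if id(b) not in erased)


# Usage: train the full stack, erase unimportant blocks, then retrain
# the smaller network to recover any accuracy lost by erasure.
blocks = nn.ModuleList(PriorityBlock(64) for _ in range(8))
# ... train the full network ...
blocks = erase_lowest_priority(blocks, n_erase=3)
# ... retrain the remaining blocks ...
```

As the abstract notes, the retraining step after erasure is what avoids a critical drop in accuracy; the gate only decides which layers are safe to remove.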