
Article Information

  • Title: Transfer Learning based Performance Comparison of the Pre-Trained Deep Neural Networks
  • Authors: Jayapalan Senthil Kumar; Syahid Anuar; Noor Hafizah Hassan
  • Journal: International Journal of Advanced Computer Science and Applications (IJACSA)
  • Print ISSN: 2158-107X
  • Electronic ISSN: 2156-5570
  • Year: 2022
  • Volume: 13
  • Issue: 1
  • DOI: 10.14569/IJACSA.2022.0130193
  • Language: English
  • Publisher: Science and Information Society (SAI)
  • Abstract: Deep learning has grown tremendously in recent years, having a substantial impact on practically every discipline. Transfer learning allows us to transfer the knowledge of a model formerly trained for a particular task to a new model attempting to solve a related but not identical problem. To adapt a pre-trained model to a new task effectively, specific layers must be retrained while the others remain unmodified. Choosing which layers to enable for training and which to freeze, and setting hyper-parameter values, are typical issues, and all of these concerns have a substantial effect on training capability as well as classification performance. The principal aim of this study is to compare the network performance of the selected pre-trained models based on transfer learning, to help with the selection of a suitable model for image classification. To accomplish this goal, we examined the performance of five pre-trained networks (SqueezeNet, GoogleNet, ShuffleNet, Darknet-53, and Inception-V3) with different epochs, learning rates, and mini-batch sizes, and compared and evaluated each network's performance using a confusion matrix. Based on the experimental findings, Inception-V3 achieved the highest accuracy of 96.98%, as well as precision, sensitivity, specificity, and F1-score of 92.63%, 92.46%, 98.12%, and 92.49%, respectively. (An illustrative fine-tuning sketch follows the keyword entry below.)
  • Keywords: Transfer learning; deep neural networks; image classification; Convolutional Neural Network (CNN) models
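
The following is a minimal sketch of the workflow the abstract describes: freeze the layers of a pre-trained network, retrain a replaced classifier layer under chosen epoch, learning-rate, and mini-batch-size settings, and derive evaluation metrics from a confusion matrix. The paper does not name its framework, so the sketch uses PyTorch/torchvision; the dataset path "data/train", the class layout, and every hyper-parameter value are illustrative assumptions, not the study's settings.

import numpy as np
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Inception-V3 expects 299x299 RGB inputs, normalized with ImageNet statistics.
preprocess = transforms.Compose([
    transforms.Resize((299, 299)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical dataset layout: data/train/<class_name>/<image files>
train_set = datasets.ImageFolder("data/train", transform=preprocess)
loader = DataLoader(train_set, batch_size=32, shuffle=True)  # mini-batch size

model = models.inception_v3(weights=models.Inception_V3_Weights.DEFAULT)
model.aux_logits = False  # ignore the auxiliary classifier output while fine-tuning

# Freeze every pre-trained layer ...
for param in model.parameters():
    param.requires_grad = False

# ... then replace the final fully-connected layer; its fresh weights are trainable.
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.fc.parameters(), lr=1e-3)  # learning rate

model.train()
for epoch in range(10):  # number of epochs
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

# Macro-averaged precision, sensitivity, specificity, and F1-score from a
# confusion matrix cm (rows = true classes, columns = predicted classes).
def metrics_from_confusion(cm):
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=0) - tp
    fn = cm.sum(axis=1) - tp
    tn = cm.sum() - tp - fp - fn
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)  # also called recall
    specificity = tn / (tn + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return (precision.mean(), sensitivity.mean(),
            specificity.mean(), f1.mean())

Only the replaced classifier's parameters are handed to the optimizer, so the frozen backbone keeps its pre-trained weights; unfreezing deeper blocks, typically with a smaller learning rate, is the usual next step when the new task needs more adaptation.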