Article Information

  • Title: Summary of Research Methods on Pre-Training Models of Natural Language Processing
  • Authors: Yu Xiao; Zhezhi Jin
  • Journal: Open Access Library Journal
  • Print ISSN: 2333-9705
  • Electronic ISSN: 2333-9721
  • Year: 2021
  • Volume: 8
  • Issue: 7
  • Pages: 1-7
  • DOI: 10.4236/oalib.1107602
  • Language: English
  • Publisher: Scientific Research Pub
  • Abstract: In recent years, deep learning technology has seen wide use and rapid development. In natural language processing tasks, pre-training models have become increasingly common: whether the task is sentence extraction or sentiment analysis of text, the pre-training model plays a very important role. Unsupervised pre-training on a large-scale corpus has proven to be an effective way to initialize models. This article summarizes the existing pre-training models, sorts out the improvements and processing methods of the newer pre-training models, and finally summarizes the challenges and prospects of current pre-training models.
  • Keywords: Natural Language Processing; Pre-Training Model; Language Model; Self-Training Model
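The abstract's central idea, unsupervised pre-training on a large raw-text corpus, rests on constructing self-supervised training pairs directly from unlabeled text. A minimal sketch of one common scheme (BERT-style masked-token prediction) is shown below; the function name, the `[MASK]` placeholder handling, and the masking rate are illustrative assumptions, not details taken from the surveyed paper.

```python
import random

MASK = "[MASK]"

def make_mlm_example(tokens, mask_prob=0.15, seed=0):
    """Build a masked-language-model training pair from raw tokens.

    Each token is independently replaced by [MASK] with probability
    mask_prob; `targets` stores the original token at masked positions
    (None elsewhere), so the model is scored only on masked slots.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    inputs, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append(MASK)
            targets.append(tok)   # model must recover the original token
        else:
            inputs.append(tok)
            targets.append(None)  # unmasked position: not scored
    return inputs, targets

# Usage: turn one unlabeled sentence into a self-supervised example.
tokens = "pre training models learn language from raw unlabeled text".split()
inputs, targets = make_mlm_example(tokens, mask_prob=0.5, seed=1)
```

No manual labels are needed: the "labels" are the original tokens themselves, which is what lets pre-training scale to corpora far larger than any annotated dataset.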