
Article Information

  • Title: Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees
  • Authors: Jiangang Bai; Yujing Wang; Yiren Chen
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 3011-3020
  • DOI: 10.18653/v1/2021.eacl-main.262
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Pre-trained language models like BERT achieve superior performance on various NLP tasks without explicit consideration of syntactic information. Meanwhile, syntactic information has been shown to be crucial for the success of NLP applications. However, how to incorporate syntax trees effectively and efficiently into pre-trained Transformers remains unsettled. In this paper, we address this problem by proposing a novel framework named Syntax-BERT. This framework works in a plug-and-play mode and is applicable to an arbitrary pre-trained checkpoint based on the Transformer architecture. Experiments on various natural language understanding datasets verify the effectiveness of syntax trees and show consistent improvements over multiple pre-trained models, including BERT, RoBERTa, and T5.
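The abstract describes injecting syntax trees into self-attention in a plug-and-play fashion. As a minimal, hedged sketch of that general idea (not the paper's actual implementation), the snippet below builds an attention mask from dependency-tree parent links and applies it inside a standard scaled dot-product attention. The function names (`tree_distance_mask`, `syntax_masked_attention`) and the `max_hops` threshold are hypothetical illustrations.

```python
import torch
import torch.nn.functional as F


def tree_distance_mask(parents, max_hops=2):
    """Boolean (n, n) mask allowing attention only between tokens whose
    distance in the (undirected) dependency tree is at most `max_hops`.
    `parents[i]` is the parent index of token i; the root points to itself."""
    n = len(parents)
    adj = torch.zeros(n, n, dtype=torch.bool)
    for i, p in enumerate(parents):
        if p != i:
            adj[i, p] = adj[p, i] = True
    # expand reachability hop by hop, starting from each token itself
    reach = torch.eye(n, dtype=torch.bool)
    frontier = reach.clone()
    for _ in range(max_hops):
        frontier = (frontier.float() @ adj.float()).bool() & ~reach
        reach |= frontier
    return reach


def syntax_masked_attention(q, k, v, mask):
    """Standard scaled dot-product attention restricted by the syntax mask."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v


# toy usage: "the cat sat", with "sat" as the root of the dependency tree
parents = [1, 2, 2]
mask = tree_distance_mask(parents, max_hops=1)
q = k = v = torch.randn(3, 8)
out = syntax_masked_attention(q, k, v, mask)
```

Because the syntactic structure enters only through the mask, such a mechanism can in principle be layered onto any pre-trained Transformer checkpoint without retraining from scratch, which is the plug-and-play property the abstract emphasizes.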