
Article Information

  • Title: BERTese: Learning to Speak to BERT
  • Authors: Adi Haviv; Jonathan Berant; Amir Globerson
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 3618-3623
  • DOI: 10.18653/v1/2021.eacl-main.316
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Large pre-trained language models have been shown to encode large amounts of world and commonsense knowledge in their parameters, leading to substantial interest in methods for extracting that knowledge. In past work, knowledge was extracted by taking manually-authored queries and gathering paraphrases for them using a separate pipeline. In this work, we propose a method for automatically rewriting queries into “BERTese”, a paraphrase query that is directly optimized towards better knowledge extraction. To encourage meaningful rewrites, we add auxiliary loss functions that encourage the query to correspond to actual language tokens. We empirically show our approach outperforms competing baselines, obviating the need for complex pipelines. Moreover, BERTese provides some insight into the type of language that helps language models perform knowledge extraction.
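The abstract mentions auxiliary losses that push the rewritten query to "correspond to actual language tokens". A minimal toy sketch of that idea, not the paper's actual implementation, is a nearest-token penalty: measure how far each continuous query vector lies from its closest entry in the vocabulary embedding matrix, so that minimizing the loss pulls vectors onto real token embeddings. All names and shapes here are hypothetical illustration.

```python
import numpy as np

def nearest_token_loss(query_vecs, vocab_embeddings):
    """Mean squared distance from each query vector to its nearest
    vocabulary embedding. Driving this toward zero encourages the
    continuous query vectors to coincide with actual token embeddings
    (toy illustration; names and shapes are hypothetical)."""
    # Pairwise squared distances, shape (num_query, vocab_size)
    diffs = query_vecs[:, None, :] - vocab_embeddings[None, :, :]
    sq_dists = (diffs ** 2).sum(axis=-1)
    # For each query vector, keep only the closest vocabulary entry
    return sq_dists.min(axis=1).mean()

# Toy vocabulary of three token embeddings in 2-D
vocab = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])

# A vector sitting exactly on a token embedding incurs zero loss
print(nearest_token_loss(np.array([[1.0, 0.0]]), vocab))  # 0.0
# A vector halfway between two tokens incurs a positive penalty
print(nearest_token_loss(np.array([[0.5, 0.0]]), vocab))  # 0.25
```

In a differentiable rewriter this term would be added, with some weight, to the main knowledge-extraction objective, so gradient descent trades off answer quality against staying close to real tokens.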