
Article Information

  • Title: Debugging Translations of Transformer-based Neural Machine Translation Systems
  • Authors: Matīss Rikters; Mārcis Pinnis
  • Journal: Baltic Journal of Modern Computing
  • Print ISSN: 2255-8942
  • Electronic ISSN: 2255-8950
  • Year of publication: 2018
  • Volume: 6
  • Issue: 4
  • Pages: 1-15
  • DOI: 10.22364/bjmc.2018.6.4.06
  • Publisher: Vilnius University, University of Latvia, Latvia University of Agriculture, Institute of Mathematics and Informatics of University of Latvia
  • Abstract: In this paper, we describe a tool for debugging the output and attention weights of neural machine translation (NMT) systems and for improved estimation of confidence about the output based on the attention. We dive deeper into ways for it to handle output from Transformer-based NMT models. Its purpose is to help researchers and developers find weak and faulty translations that their NMT systems produce without the need for reference translations. We present a demonstration website of our tool with examples of good and bad translations: http://attention.lielakeda.lv.
  • Keywords: Neural Machine Translation; Attention Mechanism; Transformer Models
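
The abstract refers to estimating confidence in a translation from the attention weights alone, without reference translations. As a loose illustration of that idea only, the sketch below scores a sentence by how focused (low-entropy) each target token's attention distribution over the source tokens is; the function names and the normalisation are illustrative assumptions, not the metric actually used by Rikters and Pinnis.

```python
# Illustrative attention-based confidence heuristic (an assumption, not the
# paper's actual metric): focused, low-entropy attention per target token is
# read as higher confidence; dispersed, high-entropy attention as lower.
import math
from typing import List


def token_attention_entropy(weights: List[float]) -> float:
    """Shannon entropy of one target token's attention over the source tokens."""
    total = sum(weights)
    if total <= 0.0:
        return 0.0
    probs = [w / total for w in weights]
    return -sum(p * math.log(p) for p in probs if p > 0.0)


def sentence_confidence(attention: List[List[float]]) -> float:
    """attention[t][s] = weight of target token t on source token s.
    Returns a score in [0, 1]: 1 = perfectly focused, 0 = uniform attention."""
    if not attention or not attention[0]:
        return 0.0
    n_source = len(attention[0])
    max_entropy = math.log(n_source) if n_source > 1 else 1.0
    avg_entropy = sum(token_attention_entropy(row) for row in attention) / len(attention)
    return 1.0 - min(avg_entropy / max_entropy, 1.0)


if __name__ == "__main__":
    # Toy example: three target tokens attending over four source tokens.
    focused = [[0.90, 0.05, 0.03, 0.02],
               [0.05, 0.90, 0.03, 0.02],
               [0.02, 0.03, 0.05, 0.90]]
    dispersed = [[0.25, 0.25, 0.25, 0.25]] * 3
    print(f"focused   -> {sentence_confidence(focused):.3f}")   # noticeably higher
    print(f"dispersed -> {sentence_confidence(dispersed):.3f}")  # 0.000, uniform attention
```

For Transformer models, which produce attention weights for several layers and heads, such a heuristic additionally assumes that a single sentence-level attention matrix has been obtained first, for example by averaging the heads of one chosen layer; how the authors' tool actually handles Transformer attention is described in the paper itself.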