
Basic Article Information

  • Title: Topic-Transformer for Document-Level Language Understanding
  • Authors: Oumaima Hourrane; El Habib Benlahmar
  • Journal: Journal of Computer Science
  • Print ISSN: 1549-3636
  • Year: 2022
  • Volume: 18
  • Issue: 1
  • Pages: 18-25
  • DOI: 10.3844/jcssp.2022.18.25
  • Language: English
  • Publisher: Science Publications
  • Abstract: Most natural language processing applications are framed as prediction problems over limited context, typically a single sentence or paragraph, which does not reflect how humans perceive natural language. When reading a text, humans draw on much broader context, such as the rest of the document or other relevant documents. This study focuses on simultaneously capturing syntax and global semantics from a text, thereby acquiring document-level understanding. Accordingly, we introduce a Topic-Transformer that combines the benefits of a neural topic model, which captures global semantic information, and a transformer-based language model, which captures the local structure of texts both semantically and syntactically. Experiments on various datasets confirm that our model achieves lower perplexity than a standard transformer architecture and recent topic-guided language models, and generates topics that are perceivably more coherent than those of a regular Latent Dirichlet Allocation (LDA) topic model. (A minimal illustrative sketch of the described combination follows the list below.)
  • Keywords: Neural Topic Model; Neural Language Model; Topic-Guided Language Model; Document-Level Understanding; Long-Range Semantic Dependencies
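
To make the combination described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of one way a neural topic model's document-level topic vector could bias a transformer language model's next-token predictions. All class names, dimensions, and the fusion-by-logit-bias design are illustrative assumptions, not the paper's actual architecture.

    import torch
    import torch.nn as nn

    class NeuralTopicModel(nn.Module):
        # VAE-style encoder: bag-of-words -> topic proportions (global semantics).
        def __init__(self, vocab_size, num_topics, hidden=256):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
            self.mu = nn.Linear(hidden, num_topics)
            self.logvar = nn.Linear(hidden, num_topics)
            self.decoder = nn.Linear(num_topics, vocab_size)  # topic-word matrix

        def forward(self, bow):
            h = self.encoder(bow)
            mu, logvar = self.mu(h), self.logvar(h)
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
            theta = torch.softmax(z, dim=-1)  # document topic proportions
            return theta, self.decoder(theta), mu, logvar

    class TopicTransformerLM(nn.Module):
        # Transformer LM (local structure) whose logits get a global topic bias.
        def __init__(self, vocab_size, num_topics, d_model=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, d_model)
            layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            self.transformer = nn.TransformerEncoder(layer, num_layers=2)
            self.lm_head = nn.Linear(d_model, vocab_size)
            # Maps topic proportions to a vocabulary-sized bias (assumed fusion).
            self.topic_bias = nn.Linear(num_topics, vocab_size, bias=False)

        def forward(self, tokens, theta):
            x = self.embed(tokens)
            n = tokens.size(1)
            # Causal mask: each position attends only to itself and earlier tokens.
            mask = torch.triu(torch.full((n, n), float('-inf')), diagonal=1)
            h = self.transformer(x, mask=mask)
            # Local (syntactic/semantic) logits plus a global topic bias.
            return self.lm_head(h) + self.topic_bias(theta).unsqueeze(1)

    # Usage: topic proportions from a document's bag of words condition the LM.
    vocab, topics = 1000, 20
    ntm = NeuralTopicModel(vocab, topics)
    lm = TopicTransformerLM(vocab, topics)
    bow = torch.rand(2, vocab)                  # 2 documents as bags of words
    tokens = torch.randint(0, vocab, (2, 16))   # 16-token sequences
    theta, _, _, _ = ntm(bow)
    logits = lm(tokens, theta)                  # shape (2, 16, vocab)

Training such a model would plausibly minimize the language model's cross-entropy jointly with the topic model's variational objective; the perplexity compared in the abstract is the exponential of the average per-token cross-entropy.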