
Article Information

  • Title: Attention-Based Joint Entity Linking with Entity Embedding
  • Authors: Chen Liu ; Feng Li
  • Journal: Information
  • eISSN: 2078-2489
  • Year: 2019
  • Volume: 10
  • Issue: 2
  • Pages: 46
  • DOI: 10.3390/info10020046
  • Language: English
  • Publisher: MDPI
  • Abstract: Entity linking (also called entity disambiguation) aims to map the mentions in a given document to their corresponding entities in a target knowledge base. In order to build a high-quality entity linking system, efforts are made in three parts: encoding of the entity, encoding of the mention context, and modeling the coherence among mentions. For the encoding of the entity, we use a long short-term memory (LSTM) network and a convolutional neural network (CNN) to encode the entity context and entity description, respectively. Then, we design a function to combine all the different entity information aspects, in order to generate unified, dense entity embeddings. For the encoding of the mention context, unlike standard attention mechanisms, which can only capture important individual words, we introduce a novel attention-based LSTM model that can effectively capture the important text spans around a given mention with a conditional random field (CRF) layer. In addition, we take the coherence among mentions into consideration with a Forward-Backward Algorithm, which is less time-consuming than previous methods. Our experimental results show that our model obtains competitive, or even better, performance than state-of-the-art models across different datasets.
  • Keywords: entity linking; LSTM; CNN; CRF; Forward-Backward Algorithm
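The abstract's final modeling step, scoring coherence among mentions with a Forward-Backward Algorithm, can be illustrated with a minimal sketch. The sketch below is not the paper's implementation; it assumes illustrative inputs (`local` scores from mention-context encoding and `pair` coherence scores between candidates of adjacent mentions, both hypothetical names) and computes per-candidate marginal probabilities in O(n·k²) time rather than enumerating all k^n joint assignments, which is the efficiency the abstract alludes to.

```python
import math

def forward_backward(local, pair):
    """Marginal probabilities of candidate entities via forward-backward.

    local[i][a]  : local score of candidate a for mention i (log-space)
    pair[i][a][b]: coherence score between candidate a of mention i and
                   candidate b of mention i+1 (log-space)
    Names and input layout are illustrative, not taken from the paper.
    """
    n = len(local)
    k = [len(scores) for scores in local]

    # Forward pass: alpha[i][b] sums (in log-space) over all candidate
    # prefixes ending with candidate b at mention i.
    alpha = [list(local[0])]
    for i in range(1, n):
        alpha.append([
            math.log(sum(math.exp(alpha[i - 1][a] + pair[i - 1][a][b])
                         for a in range(k[i - 1]))) + local[i][b]
            for b in range(k[i])
        ])

    # Backward pass: beta[i][a] sums over all candidate suffixes
    # starting after candidate a at mention i.
    beta = [[0.0] * k[i] for i in range(n)]
    for i in range(n - 2, -1, -1):
        for a in range(k[i]):
            beta[i][a] = math.log(sum(
                math.exp(pair[i][a][b] + local[i + 1][b] + beta[i + 1][b])
                for b in range(k[i + 1])))

    # Log partition function over all joint candidate assignments.
    log_z = math.log(sum(math.exp(s) for s in alpha[-1]))

    # Marginal probability of each candidate for each mention.
    return [[math.exp(alpha[i][a] + beta[i][a] - log_z) for a in range(k[i])]
            for i in range(n)]
```

For two mentions with two candidates each, `forward_backward([[0.0, 1.0], [0.5, 0.0]], [[[0.2, 0.0], [0.0, 0.3]]])` returns one probability row per mention, each summing to 1; the highest-marginal candidate per mention can then be chosen as the linked entity.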