
Article Information

  • Title: Relevance Judgments Exclusive of Human Assessors in Large Scale Information Retrieval Evaluation Experimentation
  • Authors: Prabha Rajagopal ; Sri Devi Ravana ; Maizatul Akmar Ismail
  • Journal: Malaysian Journal of Computer Science
  • Print ISSN: 0127-9084
  • Publication year: 2014
  • Volume: 27
  • Issue: 2
  • Publisher: University of Malaya, Faculty of Computer Science and Information Technology
  • Abstract: Inconsistent judgments by various human assessors compromise the reliability of the relevance judgments generated for large-scale test collections. This study investigates an automated method that creates a similar set of relevance judgments (pseudo relevance judgments), eliminating the human effort and the errors introduced in creating relevance judgments. Traditionally, the participating systems in TREC are measured using a chosen metric and ranked according to their performance scores. To generate these scores, the documents retrieved by each system for each topic are matched against the set of relevance judgments (often assessed by humans). In this study, the number of occurrences of each document per topic across the various runs is used, under the assumption that the more often a document occurs, the higher the possibility that it is relevant. The study proposes a method with a pool depth of 100 and a cutoff percentage of >35% that could provide an alternative way of generating consistent relevance judgments without the involvement of human assessors.
  • Keywords: Information retrieval; relevance judgments; retrieval evaluation; large scale experimentation
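The occurrence-counting idea described in the abstract can be sketched in a few lines: pool the top-k documents from every run per topic, count how many runs retrieved each document, and judge a document relevant when it appears in more than the cutoff fraction of runs. This is a minimal sketch under assumed data shapes (each run as a dict mapping topic IDs to ranked document-ID lists); the function name and toy data are illustrative, not the authors' actual code.

```python
from collections import defaultdict

def pseudo_relevance_judgments(runs, pool_depth=100, cutoff=0.35):
    """Build pseudo relevance judgments from system runs.

    runs: list of dicts, one per participating system, each mapping
          topic_id -> ranked list of doc_ids.
    A document is judged relevant for a topic when it appears in the
    top `pool_depth` results of more than `cutoff` of all runs.
    """
    # Count, per topic, in how many runs each document occurs.
    counts = defaultdict(lambda: defaultdict(int))
    for run in runs:
        for topic, ranked_docs in run.items():
            for doc in ranked_docs[:pool_depth]:
                counts[topic][doc] += 1

    # Keep documents whose occurrence count exceeds the cutoff fraction.
    threshold = cutoff * len(runs)
    return {
        topic: {doc for doc, n in doc_counts.items() if n > threshold}
        for topic, doc_counts in counts.items()
    }

# Toy example: three runs, one topic. d1 occurs in 3/3 runs and
# d2 in 2/3, both above the 35% cutoff; d3 and d4 occur once each.
runs = [
    {"T1": ["d1", "d2", "d3"]},
    {"T1": ["d1", "d4"]},
    {"T1": ["d2", "d1"]},
]
print(pseudo_relevance_judgments(runs))  # {'T1': {'d1', 'd2'}}
```

With the paper's parameters (pool depth 100, cutoff >35%), the same routine would be applied to the full set of TREC runs per topic to produce judgments without human assessors.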