An evaluation of multiple query representations for the relevance judgments used to build a biomedical test collection

Borim Ryu, Jinwook Choi

Research output: Contribution to journal › Article


Abstract

Objectives: The purpose of this study is to validate a method that uses multiple queries to create a set of relevance judgments indicating which documents are pertinent to each query when forming a biomedical test collection. Methods: The aspect query is the central concept of this research; it can represent every aspect of the original query with the same informational need. Aspect queries manually generated by 15 recruited participants were run using the BM25 retrieval model in order to create aspect-query-based relevance sets (QRELS). To demonstrate the feasibility of these QRELS, the results from a 2004 genomics track run supported by the National Institute of Standards and Technology (NIST) were used to compute the mean average precision (MAP) based on Text Retrieval Conference (TREC) QRELS and aspect QRELS. The rank correlation was calculated using both Kendall's and Spearman's rank correlation methods. Results: We experimentally verified the utility of the aspect query method by combining the top-ranked documents retrieved by the multiple aspect queries to rank the retrieval runs. The resulting system rankings correlated highly with rankings based on human relevance judgments. Conclusions: Substantial results were shown, with correlations of up to 0.863 (p < 0.01) between the judgment-free gold standard based on the aspect queries and the human-judged gold standard supported by NIST. The results also demonstrate that the aspect query method can contribute to building test collections for medical literature retrieval.
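The comparison described in the abstract ranks retrieval runs by MAP under two different relevance sets and then measures how similar the two rankings are. A minimal sketch of that last step is shown below, assuming the MAP scores have already been computed; the five score values are hypothetical placeholders, not figures from the paper, and ties are assumed absent.

```python
from itertools import combinations

def kendall_tau(a, b):
    """Kendall's tau between two equal-length score lists (no ties assumed)."""
    n = len(a)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        s = (a[i] - a[j]) * (b[i] - b[j])
        if s > 0:
            concordant += 1    # pair ordered the same way in both lists
        elif s < 0:
            discordant += 1    # pair ordered oppositely
    return (concordant - discordant) / (n * (n - 1) / 2)

def spearman_rho(a, b):
    """Spearman's rho via the rank-difference formula (no ties assumed)."""
    n = len(a)
    def ranks(x):
        order = sorted(range(n), key=lambda i: x[i])
        r = [0] * n
        for rank, idx in enumerate(order, 1):
            r[idx] = rank
        return r
    ra, rb = ranks(a), ranks(b)
    d2 = sum((ra[i] - rb[i]) ** 2 for i in range(n))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical MAP scores for five retrieval runs under the two relevance sets
map_trec   = [0.41, 0.35, 0.28, 0.22, 0.15]  # human-judged TREC QRELS
map_aspect = [0.36, 0.39, 0.25, 0.23, 0.12]  # aspect-query QRELS

print(kendall_tau(map_trec, map_aspect))   # 0.8
print(spearman_rho(map_trec, map_aspect))  # 0.9
```

A correlation near 1 means the judgment-free aspect QRELS rank the runs almost identically to the human-judged gold standard, which is the criterion the study uses.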

Original language: English
Pages (from-to): 65-73
Number of pages: 9
Journal: Healthcare Informatics Research
Volume: 18
Issue number: 1
DOIs
State: Published - 1 Dec 2012

Keywords

  • Correlation studies
  • Evaluation studies
  • Gold standard
  • Information storage and retrieval
  • MEDLINE
