Implications of Inter-Rater Agreement on a Student Information Retrieval Evaluation

Proceedings of LWA2010 - Workshop-Woche: Lernen, Wissen + Adaptivität, Kassel, Germany, 2010.

Abstract

This paper presents an information retrieval evaluation of three retrieval-supporting services. All three services were designed to compensate for typical problems that arise in metadata-driven digital libraries and that are not adequately handled by simple tf-idf-based retrieval. The services are: (1) a query expansion mechanism based on co-word analysis, and re-ranking via (2) Bradfordizing and (3) author centrality. The services were evaluated with relevance assessments conducted by 73 information science students. Since the students are neither information professionals nor domain experts, the question of inter-rater agreement is taken into consideration. Two important implications emerge: (1) the inter-rater agreement rates were mostly fair to moderate, and (2) after a data-cleaning step that removed assessments with poor agreement rates, the evaluation data show that the three retrieval services returned disjoint but still relevant result sets.
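The abstract reports agreement in the "fair to moderate" range, which suggests a chance-corrected coefficient for multiple raters. The abstract does not name the statistic used, so as an illustration only, the following is a minimal sketch of Fleiss' kappa, a common choice when many raters (here, students) judge the same items:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a table where ratings[i][j] is the number of
    raters who assigned item i to category j. Assumes every item was
    judged by the same number of raters."""
    n_items = len(ratings)
    n_cats = len(ratings[0])
    n_raters = sum(ratings[0])
    total = n_items * n_raters

    # marginal proportion of assignments falling into each category
    p_j = [sum(row[j] for row in ratings) / total for j in range(n_cats)]

    # observed agreement per item, averaged over items
    P_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in ratings]
    P_bar = sum(P_i) / n_items

    # expected agreement by chance
    P_e = sum(p * p for p in p_j)
    return (P_bar - P_e) / (1 - P_e)
```

With perfect agreement (every rater picks the same category for each item, and both categories are used across items) the statistic is 1.0; values near 0 indicate chance-level agreement, which is the scale behind labels like "fair" and "moderate".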
