Semiautomatic Determination of Citation Relevancy: User Evaluation
Library and Information Science
Online bibliographic database searches typically produce hundreds of retrieved citations, of which only about 20–40% are relevant to the search topic and/or problem statement. Significant amounts of time are required to categorize and select the relevant citations. A software system—SORT-AID/SABRE—has been developed which ranks the citations in terms of relevance. This paper presents the results of a comprehensive user evaluation of the relevance ranking procedures. Test results show that the software-generated distributions approach the ideal distribution (all relevant citations at the beginning of the collection) in 22% of the cases, are on average 23% better than the random distribution (relevant citations distributed uniformly throughout the collection), and are poorer than the random distribution in 4% of the cases.
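The ideal/random comparison described in the abstract can be illustrated with a normalized mean-rank score. This is a hypothetical sketch for intuition only, not the paper's actual SORT-AID/SABRE evaluation statistic: it scores 1.0 when all relevant citations lead the ranked list, 0.0 at the random-ordering expectation, and goes negative when the ranking is worse than random.

```python
def ranking_quality(relevance):
    """Score a ranked citation list against ideal and random baselines.

    relevance: 0/1 flags in ranked order (1 = relevant citation).
    Returns 1.0 for the ideal ordering (all relevant items first),
    0.0 for the expected random ordering, negative if worse than random.
    Illustrative measure only; not the metric used in the paper.
    """
    n = len(relevance)
    r = sum(relevance)
    if r == 0 or r == n:
        return 1.0  # degenerate case: every ordering is ideal
    # 1-based ranks at which relevant citations appear
    ranks = [i + 1 for i, rel in enumerate(relevance) if rel]
    observed = sum(ranks) / r
    ideal = (r + 1) / 2        # relevant items occupy ranks 1..r
    random_mean = (n + 1) / 2  # expected mean rank under a uniform shuffle
    return (random_mean - observed) / (random_mean - ideal)
```

For example, `ranking_quality([1, 1, 0, 0])` returns 1.0 (ideal), while `[0, 0, 1, 1]` scores below zero, matching the abstract's "poorer than random" category.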
Information Processing and Management
Huffman, G. D.
(1990). Semiautomatic Determination of Citation Relevancy: User Evaluation. Information Processing and Management, 26(2), 295-302.
Available at: https://aquila.usm.edu/fac_pubs/7363