
Retrieval System Evaluation

Published

2005

Author(s)

C. Buckley, Ellen M. Voorhees

Abstract

One of the primary motivations for TREC was to standardize retrieval system evaluation. Prior to TREC, there was little explicit discussion of what constituted a minimally acceptable experimental design, and no hard evidence to support any position. TREC has succeeded in standardizing ad hoc retrieval evaluation, has validated the reliability of experiments based on test collections, and has empirically determined bounds on the sensitivity of test collection comparisons. A focus on evaluation in tracks where the result is not a ranked list of documents has extended the paradigm to new tasks.
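For concreteness, the measure at the heart of TREC's standardized ad hoc evaluation is (non-interpolated) average precision, averaged over topics to give mean average precision (MAP). The sketch below is a minimal illustration of the per-topic computation, not the trec_eval implementation; the run and qrels values are hypothetical.

def average_precision(ranked_docs, relevant):
    # Non-interpolated average precision for a single topic: the mean
    # of the precision values at each rank where a relevant document
    # appears, divided by the total number of relevant documents judged
    # for the topic (relevant documents never retrieved contribute 0).
    if not relevant:
        return 0.0
    hits = 0
    precision_sum = 0.0
    for rank, doc_id in enumerate(ranked_docs, start=1):
        if doc_id in relevant:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant)

# Hypothetical single-topic example: a system's ranked list and the
# set of documents judged relevant for that topic.
run = ["d3", "d7", "d1", "d9", "d4"]
qrels = {"d3", "d9", "d5"}
print(average_precision(run, qrels))  # (1/1 + 2/4) / 3 = 0.5

MAP is then simply the mean of this per-topic value over all topics in a test collection.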
Publisher Info
Chapter in TREC: Experiment and Evaluation in Information Retrieval, MIT Press, 2005

Keywords

evaluation, information retrieval, TREC

Citation

Buckley, C. and Voorhees, E. (2005), "Retrieval System Evaluation," in TREC: Experiment and Evaluation in Information Retrieval, MIT Press, 2005. (Accessed December 4, 2024)

