QUT ePrints

Overview of the INEX 2010 focused relevance feedback track

Chappell, Timothy & Geva, Shlomo (2011) Overview of the INEX 2010 focused relevance feedback track. In Proceedings of INEX 2011, Springer, Vught, The Netherlands. (In Press)

Workshop paper (PDF, 458kB), submitted version.

    Abstract

    The INEX 2010 Focused Relevance Feedback track offered a refined approach to the evaluation of Focused Relevance Feedback algorithms through simulated exhaustive user feedback. As in traditional approaches, we simulated a user-in-the-loop by re-using the assessments of ad-hoc retrieval obtained from real users who assess focused ad-hoc retrieval submissions. The evaluation was extended in several ways: the use of exhaustive relevance feedback over entire runs; the evaluation of focused retrieval where both the retrieval results and the feedback are focused; the evaluation was performed over a closed set of documents and complete focused assessments; the evaluation was performed over executable implementations of relevance feedback algorithms; and finally, the entire evaluation platform is reusable. We present the evaluation methodology, its implementation, and experimental results obtained for nine submissions from three participating organisations.
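    The simulated user-in-the-loop described above can be sketched roughly as follows. This is an illustrative sketch only, not the track's actual evaluation harness: the function names (`simulated_feedback_run`, `rerank`) and the simple qrels dictionary are assumptions made for the example, standing in for the track's executable feedback algorithms and focused assessments.

    ```python
    def simulated_feedback_run(initial_ranking, qrels, rerank):
        """Replay pre-existing relevance assessments as exhaustive feedback.

        initial_ranking: list of document ids, best first
        qrels: dict mapping doc_id -> relevance judgment (1 relevant, 0 not);
               stands in for the reused ad-hoc assessments
        rerank: callable(remaining_docs, feedback) -> new ranking; stands in
                for the relevance feedback algorithm under evaluation
        """
        feedback = {}    # judgments revealed to the algorithm so far
        presented = []   # the final ranking that would be scored
        remaining = list(initial_ranking)
        while remaining:
            doc = remaining.pop(0)          # show the next document
            presented.append(doc)
            feedback[doc] = qrels.get(doc, 0)  # simulated user judges it
            remaining = rerank(remaining, feedback)  # algorithm adapts
        return presented

    # Toy usage with a no-op feedback algorithm (keeps the ranking unchanged):
    qrels = {"d1": 1, "d3": 1}
    order = simulated_feedback_run(["d2", "d1", "d3"], qrels,
                                   lambda docs, fb: docs)
    ```

    A real submission would implement `rerank` to exploit the accumulated judgments, e.g. by promoting documents similar to those already judged relevant; the closed document set and complete assessments make the replay deterministic and reusable.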

    Impact and interest:

    1 citations in Scopus
    Search Google Scholar™
    0 citations in Web of Science®

    Citation counts are sourced monthly from the Scopus and Web of Science® citation databases.

    These databases index different subsets of available publications over different time periods, so the citation count from each is usually different. Some works appear in neither database, in which case no count is displayed. Scopus includes citations from articles published in 1996 onwards, and Web of Science® generally from 1980 onwards.

    Citations counts from the Google Scholar™ indexing service can be viewed at the linked Google Scholar™ search.

    Full-text downloads:

    56 since deposited on 14 Mar 2011
    14 in the past twelve months

    The full-text downloads figure shows the total number of times this work's files (e.g., a PDF) have been downloaded from QUT ePrints, as well as the number of downloads in the previous 365 days. The count includes downloads of all files if a work has more than one.

    ID Code: 40723
    Item Type: Conference Paper
    Keywords: Relevance Feedback, Information Retrieval, Search engine, Pseudo Relevance, User Feedback
    Subjects: Australian and New Zealand Standard Research Classification > INFORMATION AND COMPUTING SCIENCES (080000) > ARTIFICIAL INTELLIGENCE AND IMAGE PROCESSING (080100) > Natural Language Processing (080107)
    Australian and New Zealand Standard Research Classification > INFORMATION AND COMPUTING SCIENCES (080000) > COMPUTER SOFTWARE (080300) > Computer Software not elsewhere classified (080399)
    Australian and New Zealand Standard Research Classification > INFORMATION AND COMPUTING SCIENCES (080000) > DATA FORMAT (080400) > Markup Languages (080404)
    Australian and New Zealand Standard Research Classification > INFORMATION AND COMPUTING SCIENCES (080000) > LIBRARY AND INFORMATION STUDIES (080700) > Information Retrieval and Web Search (080704)
    Divisions: Past > Schools > Computer Science
    Past > QUT Faculties & Divisions > Faculty of Science and Technology
    Copyright Owner: Copyright 2011, please consult the authors
    Deposited On: 14 Mar 2011 11:46
    Last Modified: 18 Jul 2014 15:22

