Evaluating aggregated search pages

Zhou, Ke, Cummins, Ronan, Lalmas, Mounia and Jose, Joemon M. (2012) Evaluating aggregated search pages. In: SIGIR '12: Proceedings of the 35th International ACM SIGIR Conference on Research and Development in Information Retrieval. ACM, New York, NY, USA, pp. 115-124. ISBN 978-1-4503-1472-5 (doi: 10.1145/2348283.2348302)

Full text not available from this repository.

Abstract

Aggregating search results from a variety of heterogeneous sources or verticals such as news, image and video into a single interface is a popular paradigm in web search. Although various approaches exist for selecting relevant verticals or optimising the aggregated search result page, evaluating the quality of an aggregated page is an open question. This paper proposes a general framework for evaluating the quality of aggregated search pages. We evaluate our approach by collecting annotated user preferences over a set of aggregated search pages for 56 topics and 12 verticals. We empirically demonstrate the fidelity of metrics instantiated from our proposed framework by showing that they strongly agree with the annotated user preferences of pairs of simulated aggregated pages. Furthermore, we show that our metrics agree with the majority user preference more often than the current diversity-based information retrieval metrics. Finally, we demonstrate the flexibility of our framework by showing that personalised historical preference data can improve the performance of our proposed metrics.
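The abstract does not give the framework's formulas, but the evaluation protocol it describes, checking how often a metric's preferred page in a pair matches the majority annotator preference, can be sketched. The following Python sketch is a hypothetical illustration of that pairwise-agreement computation; the data structures, the toy metric, and all names (agreement_rate, judged_pairs, relevant_verticals) are assumptions for illustration, not the authors' implementation.

    """Sketch of pairwise-agreement evaluation: given a metric that scores an
    aggregated search page, measure how often the metric's preferred page in a
    pair matches the majority user preference (all names are illustrative)."""

    from collections import Counter
    from typing import Callable, Iterable, List, Tuple


    def majority_preference(votes: List[str]) -> str:
        """Return the page id ('a' or 'b') preferred by most annotators."""
        return Counter(votes).most_common(1)[0][0]


    def agreement_rate(
        metric: Callable[[dict], float],
        judged_pairs: Iterable[Tuple[dict, dict, List[str]]],
    ) -> float:
        """Fraction of page pairs on which the metric agrees with the majority
        user preference. Each judged pair is (page_a, page_b, votes), where
        votes lists 'a' or 'b' once per annotator."""
        agree = total = 0
        for page_a, page_b, votes in judged_pairs:
            metric_pick = "a" if metric(page_a) >= metric(page_b) else "b"
            if metric_pick == majority_preference(votes):
                agree += 1
            total += 1
        return agree / total if total else 0.0


    if __name__ == "__main__":
        # Toy metric (purely hypothetical): score a page by how many
        # relevant verticals it presents.
        toy_metric = lambda page: len(page.get("relevant_verticals", []))
        pairs = [
            ({"relevant_verticals": ["news", "image"]},
             {"relevant_verticals": ["video"]},
             ["a", "a", "b"]),
        ]
        print(f"agreement: {agreement_rate(toy_metric, pairs):.2f}")

Under this protocol, a higher agreement rate over the annotated pairs is the fidelity evidence the abstract refers to; the paper reports that metrics instantiated from its framework agree with the majority preference more often than diversity-based IR metrics do.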

Item Type: Conference Proceedings
Title of Proceedings: SIGIR '12 Proceedings of the 35th International ACM SIGIR Conference on Research and Development in Information Retrieval
Additional Information: [1] First published: 2012. [2] Published as: Zhou, Ke, Cummins, Ronan, Lalmas, Mounia and Jose, Joemon M. (2012) Evaluating aggregated search pages. In: SIGIR '12: Proceedings of the 35th International ACM SIGIR Conference on Research and Development in Information Retrieval. ACM, New York, NY, USA, pp. 115-124. [3] This paper was first presented at SIGIR '12, the 35th International ACM SIGIR Conference on Research and Development in Information Retrieval, held from 12-16 August 2012 in Portland, Oregon, USA. The paper was given on 13 August 2012 within the Evaluation 1 session. [4] SIGIR (Special Interest Group on Information Retrieval) is a Special Interest Group of the Association for Computing Machinery (ACM).
Uncontrolled Keywords: measurement, experimentation, aggregated search, evaluation, performance metric, diversity
Subjects: Q Science > QA Mathematics > QA76 Computer software
Z Bibliography. Library Science. Information Resources > ZA Information resources > ZA4050 Electronic information resources
Pre-2014 Departments: School of Computing & Mathematical Sciences
Last Modified: 25 Sep 2019 09:55
URI: http://gala.gre.ac.uk/id/eprint/10095
