TY - GEN
T1 - Active feedback in ad hoc information retrieval
AU - Shen, Xuehua
AU - Zhai, Chengxiang
PY - 2005
Y1 - 2005
AB - Information retrieval is, in general, an iterative search process, in which the user often has several interactions with a retrieval system for an information need. The retrieval system can actively probe a user with questions to clarify the information need instead of just passively responding to user queries. A basic question is thus how a retrieval system should propose questions to the user so that it can obtain maximum benefits from the feedback on these questions. In this paper, we study how a retrieval system can perform active feedback, i.e., how to choose documents for relevance feedback so that the system can learn most from the feedback information. We present a general framework for such an active feedback problem, and derive several practical algorithms as special cases. Empirical evaluation of these algorithms shows that the performance of traditional relevance feedback (presenting the top K documents) is consistently worse than that of presenting documents with more diversity. With a diversity-based selection algorithm, we obtain fewer relevant documents; however, these fewer documents have more learning benefits.
KW - active feedback
KW - ad hoc information retrieval
UR - http://www.scopus.com/inward/record.url?scp=84885668422&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84885668422&partnerID=8YFLogxK
U2 - 10.1145/1076034.1076047
DO - 10.1145/1076034.1076047
M3 - Conference contribution
AN - SCOPUS:84885668422
SN - 1595930345
SN - 9781595930347
T3 - SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
SP - 59
EP - 66
BT - SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
T2 - 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2005
Y2 - 15 August 2005 through 19 August 2005
ER -