Using CrowdLogger for in situ information retrieval system evaluation.

CIKM (2013)

Abstract

A major hurdle faced by many information retrieval researchers, especially in academia, is evaluating retrieval systems in the wild. Challenges include tapping into large user bases, collecting user behavior data, and modifying a given retrieval system. We outline several options available to researchers for overcoming these challenges, along with their advantages and disadvantages. We then demonstrate how CrowdLogger, an open-source browser extension for Firefox and Google Chrome, can be used as an in situ evaluation platform.