Authors
Rishabh Mehrotra, Emine Yilmaz
Publication date
2017
Description
Continuous-space word embeddings have been shown to be highly effective in many information retrieval tasks. Embedding models make use of local information available in the immediately surrounding words to project nearby context words closer together in the embedding space. With the increasingly multi-tasking nature of web search sessions, users often try to accomplish several different tasks in a single search session. Consequently, the search context gets polluted with queries from unrelated tasks, which renders the context heterogeneous. In this work, we hypothesize that task information provides better context for IR systems to learn from. We propose a novel task-context embedding architecture that learns representations of queries in a low-dimensional space by leveraging their task-context information from historical search logs using neural embedding models. In addition to qualitative analysis, we empirically …
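The task-context idea can be sketched as a skip-gram-style embedding model in which the "context window" for a query is not its neighbours in the session but the other queries belonging to the same task. The toy search log, embedding size, and training loop below are illustrative assumptions for this sketch, not the paper's actual architecture or data.

```python
import numpy as np

# Hypothetical toy search log, already segmented into tasks
# (assumed data, for illustration only).
tasks = [
    ["cheap flights paris", "paris hotels", "paris weather"],
    ["python list sort", "python lambda", "numpy argsort"],
    ["cheap flights rome", "rome hotels"],
]

queries = sorted({q for t in tasks for q in t})
idx = {q: i for i, q in enumerate(queries)}
V, D = len(queries), 16  # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(0.0, 0.1, (V, D))   # query embeddings
W_out = rng.normal(0.0, 0.1, (V, D))  # context embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Skip-gram with negative sampling, except that every other query
# in the same task acts as a positive context for the target query.
lr, neg_k = 0.05, 3
for _ in range(200):
    for task in tasks:
        for q in task:
            i = idx[q]
            for c in task:
                if c == q:
                    continue
                j = idx[c]
                vi = W_in[i].copy()
                # positive pair: pull same-task queries together
                g = sigmoid(vi @ W_out[j]) - 1.0
                W_in[i] -= lr * g * W_out[j]
                W_out[j] -= lr * g * vi
                # negative samples: push random queries apart
                for n in rng.integers(0, V, neg_k):
                    if n == j:
                        continue
                    vi = W_in[i].copy()
                    s = sigmoid(vi @ W_out[n])
                    W_in[i] -= lr * s * W_out[n]
                    W_out[n] -= lr * s * vi

# L2-normalised query embeddings for cosine-similarity lookups
emb = W_in / np.linalg.norm(W_in, axis=1, keepdims=True)
```

Under this sketch, queries that co-occur within a task are drawn together regardless of whether they share surface terms, which is the property the abstract contrasts with purely local, session-window context.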