Improving Query Representations For Dense Retrieval With Pseudo Relevance Feedback

HongChien Yu, Chenyan Xiong, Jamie Callan. Proceedings of the 30th ACM International Conference on Information & Knowledge Management (CIKM 2021) – 50 citations


Dense retrieval systems conduct first-stage retrieval using embedded representations and simple similarity metrics to match a query to documents. Their effectiveness depends on the encoded embeddings capturing the semantics of queries and documents, a challenging task given the shortness and ambiguity of search queries. This paper proposes ANCE-PRF, a new query encoder that uses pseudo relevance feedback (PRF) to improve query representations for dense retrieval. ANCE-PRF uses a BERT encoder that consumes the query together with the top documents retrieved by a dense retrieval model, ANCE, and learns to produce better query embeddings directly from relevance labels. It keeps the document index unchanged, avoiding re-indexing overhead. ANCE-PRF significantly outperforms ANCE and other recent dense retrieval systems on several datasets. Analysis shows that the PRF encoder effectively captures relevant and complementary information from the PRF documents while ignoring noise, via its learned attention mechanism.
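The PRF loop the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `embed` function is a hypothetical stand-in for the BERT/ANCE encoders (here a deterministic hash-based embedding), and the `[SEP]`-joined input string only mimics the shape of the real PRF encoder input. The key points it shows are that the refined query embedding is computed from the query plus the top-k feedback documents, and that the document index is built once and never changed.

```python
import hashlib
import numpy as np

DIM = 8  # toy embedding size; real encoders use 768

def embed(text, dim=DIM):
    # Hypothetical stand-in for a trained encoder: a deterministic,
    # hash-seeded unit vector. Real ANCE-PRF uses a fine-tuned BERT.
    seed = int(hashlib.md5(text.encode()).hexdigest(), 16) % (2**32)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

def retrieve(query_emb, doc_embs, k):
    # First-stage dense retrieval: dot-product similarity, top-k indices.
    scores = doc_embs @ query_emb
    return np.argsort(-scores)[:k]

def prf_query_embedding(query, docs, doc_embs, k=3):
    # Round 1: retrieve feedback documents with the original query embedding.
    top = retrieve(embed(query), doc_embs, k)
    # PRF encoder input: query concatenated with the top-k documents,
    # mimicking "[CLS] q [SEP] d1 [SEP] ... dk [SEP]".
    prf_input = " [SEP] ".join([query] + [docs[i] for i in top])
    # Round 2: refined query embedding; doc_embs (the index) is untouched.
    return embed(prf_input)

docs = ["deep learning for search", "pseudo relevance feedback",
        "cooking recipes", "dense retrieval with BERT"]
doc_embs = np.stack([embed(d) for d in docs])  # index built once, kept fixed
q_prf = prf_query_embedding("dense retrieval", docs, doc_embs)
print(retrieve(q_prf, doc_embs, k=2))
```

Because only the query side changes, the refined embedding can be scored against the existing ANN index with the same dot-product search used in the first round, which is why the method adds no indexing overhead.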
