Neural Models for Information Retrieval

In the last few years, neural representation learning approaches have achieved strong performance on many natural language processing (NLP) tasks, such as language modeling and machine translation. This suggests that neural models may also yield significant performance improvements on information retrieval (IR) tasks, such as relevance ranking, addressing the query-document vocabulary mismatch problem by using semantic rather than lexical matching. IR tasks, however, are fundamentally different from NLP tasks, leading to new challenges and opportunities for existing neural representation learning approaches to text.
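The vocabulary mismatch problem can be sketched with a toy example. The embedding values below are invented purely for illustration (real systems learn dense vectors with hundreds of dimensions from data): exact lexical matching scores a synonym zero, while cosine similarity between embeddings ranks it highly.

```python
import math

# Hypothetical 3-dimensional embeddings, invented for illustration only.
embeddings = {
    "car":        [0.90, 0.10, 0.30],
    "automobile": [0.85, 0.15, 0.35],
    "banana":     [0.10, 0.90, 0.20],
}

def lexical_match(query_term, doc_term):
    # Exact-term matching: a synonym scores zero (vocabulary mismatch).
    return 1.0 if query_term == doc_term else 0.0

def cosine(u, v):
    # Cosine similarity between two dense vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

query = "car"
for term in ("automobile", "banana"):
    lex = lexical_match(query, term)
    sem = cosine(embeddings[query], embeddings[term])
    # "automobile" gets lexical score 0.0 but a much higher cosine
    # similarity to "car" than the unrelated term "banana" does.
    print(f"{term}: lexical={lex:.1f} semantic={sem:.3f}")
```

This is the intuition behind semantic matching in neural IR: relevance is estimated in a learned vector space rather than by term overlap alone.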

In this talk, I will present my recent work on neural IR models. I will begin with a discussion of learning good representations of text for retrieval, presenting visual intuitions about how different embedding spaces capture different relationships between items, and how useful each is for different types of IR tasks. The second part of the talk focuses on applications of deep neural architectures to the document ranking task.

Presentation slides

Speaker details

Bhaskar Mitra is a Principal Applied Scientist at Microsoft AI & Research, Cambridge. He started at Bing in 2007 (then called Live Search), working on several problems related to document ranking, query formulation, entity ranking, and evaluation. His current research interests include representation learning and neural networks, and their applications to information retrieval. He co-organized multiple workshops (at SIGIR 2016 and 2017) and tutorials (at WSDM 2017 and SIGIR 2017) on neural IR, and served as a guest editor for the special issue of the Information Retrieval Journal on the same topic. He is currently pursuing a doctorate at University College London under the supervision of Dr. Emine Yilmaz and Dr. David Barber.

Date:
Speaker:
Bhaskar Mitra
Affiliation:
Microsoft

Series: Microsoft Research Talks