{"id":795437,"date":"2021-11-16T08:00:29","date_gmt":"2021-11-16T16:00:29","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&p=795437"},"modified":"2021-11-16T12:53:56","modified_gmt":"2021-11-16T20:53:56","slug":"research-talk-local-factor-models-for-large-scale-inductive-recommendation","status":"publish","type":"msr-video","link":"https:\/\/www.microsoft.com\/en-us\/research\/video\/research-talk-local-factor-models-for-large-scale-inductive-recommendation\/","title":{"rendered":"Research talk: Local factor models for large-scale inductive recommendation"},"content":{"rendered":"
In many domains, user preferences are similar locally within like-minded subgroups of users but typically differ globally between those subgroups. Local recommendation models have been shown to substantially improve top-k recommendation performance in such settings. However, existing local models do not scale to large datasets with a growing number of subgroups, and they do not support inductive recommendations for users who do not appear in the training set. Key reasons are that subgroup detection and recommendation are implemented as separate steps in the model, or that a local model is explicitly instantiated for each subgroup. In this talk, we discuss an End-to-end Local Factor Model (ELFM) that overcomes these limitations by combining both steps and incorporating local structures through an inductive bias. Our model can be optimized end-to-end, supports incremental inference, does not require a full separate model for each subgroup, and incurs only small memory and computational costs for incorporating local structures. Empirical results show that our method substantially improves recommendation performance on large-scale datasets with millions of users and items while using a considerably smaller model. Our user study also shows that our approach produces coherent item subgroups, which could aid in generating explainable recommendations.<\/p>\n