{"version":"1.0","provider_name":"Microsoft Research","provider_url":"https:\/\/www.microsoft.com\/en-us\/research","author_name":"Subhabrata (Subho) Mukherjee","author_url":"https:\/\/www.microsoft.com\/en-us\/research\/people\/submukhe\/","title":"Massive Distillation of Multilingual BERT (35x Compression, 51x Speedup)","type":"rich","width":600,"height":338,"html":"
XtremeDistil: Multi-stage Distillation for Massive Multilingual Models<\/a><\/blockquote>