Embedded post: "DeepSpeed powers 8x larger MoE model training with high performance" — Microsoft Research, by Lexie Hagen.