{"version":"1.0","provider_name":"Microsoft Research","provider_url":"https:\/\/www.microsoft.com\/en-us\/research","author_name":"Brenda Potts","author_url":"https:\/\/www.microsoft.com\/en-us\/research\/people\/v-brpotts\/","title":"NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application - Microsoft Research","type":"rich","width":600,"height":338,"html":"
NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application<\/a><\/blockquote>