News & features
Graphormer wins the Open Catalyst Challenge and upgrades to AI for Molecular Simulation Toolkit
Graphormer, proposed by Microsoft Research Asia, is a new-generation deep learning model for graph data (typical examples include molecular structures and social networks). Compared with previous-generation graph neural networks,…
Research at Microsoft 2021: Collaborating for real-world change
Over the past 30 years, Microsoft Research has undergone a shift in how it approaches innovation, broadening its mission to include not only advancing the state of computing but also using technology to tackle some of the world’s most pressing…
Deep neural networks (DNNs) have recently achieved remarkable success in various fields. When training these large-scale DNN models, regularization techniques such as L2 regularization, batch normalization, and dropout are indispensable for preventing overfitting and improving generalization. Among…
Microsoft Research Asia Establishes Theory Center to Strengthen Theoretical Foundation of AI
On December 11th, Microsoft Research Asia (MSR Asia) held the 2021 Theory Workshop, where theoretical research experts from academia and the industry came together to share their latest research results. At the workshop, MSR Asia announced the establishment of its…
A unified modeling story for artificial intelligence
“Unity” is a common goal in many disciplines. In physics, for example, scientists have long pursued a grand unified theory, a single theory that can be used…
Awards | ICDM 2021
Wei Chen is awarded Highest Impact Paper Award at ICDM’2021
Wei Chen was awarded the 2021 IEEE ICDM 10-Year Highest-Impact Paper Award for his paper, “IRIE: Scalable and Robust Influence Maximization in Social Networks.”
Awards | IEEE
Xing Xie elevated to IEEE Fellow
Dr. Xing Xie was elevated to IEEE Fellow 2022 for his contributions to spatial data mining and recommendation systems. IEEE Fellow is a distinction reserved for select IEEE members whose extraordinary accomplishments in any of the IEEE fields of interest…
Tutel: An efficient mixture-of-experts implementation for large DNN model training
| Wei Cui, Yifan Xiong, Peng Cheng, and Rafael Salas
Mixture of experts (MoE) is a deep learning model architecture in which computational cost is sublinear in the number of parameters, making scaling easier. Today, MoE is the only approach demonstrated to scale deep learning models to trillion-plus parameters, paving…
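To illustrate why MoE compute is sublinear in parameter count, here is a minimal NumPy sketch of a top-1 gated MoE layer. This is a hypothetical illustration, not Tutel's API: each token is routed to a single expert, so per-token compute stays constant even as more experts (and thus more parameters) are added.

```python
import numpy as np

rng = np.random.default_rng(0)

num_experts, d_model = 8, 16
# Each expert is an independent weight matrix; total parameters grow
# linearly with num_experts, but each token only uses one expert.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(num_experts)]
gate_w = rng.standard_normal((d_model, num_experts))

def moe_layer(x):
    """x: (tokens, d_model) -> (tokens, d_model) via top-1 expert routing."""
    scores = x @ gate_w                       # gating logits, one per expert
    choice = scores.argmax(axis=1)            # top-1 expert index per token
    out = np.empty_like(x)
    for e in range(num_experts):
        mask = choice == e
        if mask.any():
            out[mask] = x[mask] @ experts[e]  # only the chosen expert computes
    return out

tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)  # (4, 16)
```

Doubling `num_experts` here doubles the parameter count but leaves the per-token FLOPs unchanged, which is the scaling property the teaser describes.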
In the news | Microsoft Translator Blog
Multilingual translation at scale: 10,000 language pairs and beyond
Microsoft is on a quest for AI at Scale, with the ambition of enabling the next generation of AI experiences. The Microsoft Translator ZCode team is working with Microsoft Project Turing and Microsoft Research Asia to advance language and…