{"id":765364,"date":"2021-08-06T19:23:22","date_gmt":"2021-08-07T02:23:22","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=765364"},"modified":"2023-02-22T08:52:56","modified_gmt":"2023-02-22T16:52:56","slug":"project-zcode","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/project-zcode\/","title":{"rendered":"Project Z-Code"},"content":{"rendered":"
\n\t
\n\t\t
\n\t\t\t\"Z-code\t\t<\/div>\n\t\t\n\t\t
\n\t\t\t\n\t\t\t
\n\t\t\t\t\n\t\t\t\t
\n\t\t\t\t\t\n\t\t\t\t\t
\n\t\t\t\t\t\t
\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n

Project Z-Code<\/h1>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n

Project Z-code is a component of Microsoft's larger XYZ-code initiative to combine AI models for text, vision, audio, and language. Z-code supports the creation of AI systems that can speak, see, hear, and understand. This effort is part of Azure AI and Project Turing, and focuses on building multilingual, large-scale language models that support production teams across Microsoft in evolving products through the adoption of pre-trained deep learning models.

\"Z-code
Z-Code models empower Microsoft Translator, Azure Cognitive Language Services and several products and customers at Microsoft with multilingual capabilities at scale.<\/figcaption><\/figure>\n\n\n\n

XYZ-code combines three attributes of human cognition: monolingual text (X), audio or visual sensory signals (Y), and multilingual representation (Z) to create a joint representation that enables more powerful AI applications that can speak, hear, see, and understand humans better. We believe XYZ-code will enable us to fulfill our long-term vision: cross-domain transfer learning that spans modalities and languages. The goal is pre-trained models that jointly learn representations to support a broad range of downstream AI tasks, much as humans do today, combining representations across languages and across modalities to drive cognitive services.

\"Venn
XYZ-Code<\/figcaption><\/figure>\n\n\n\n

<\/p>\n\n\n\n

Z-code, as part of Project Turing, realizes Microsoft's AI at Scale vision of empowering Microsoft's products and customers with large-scale multilingual pre-trained models that support a variety of applications. The project focuses on several areas of the technology stack needed to scale AI models. We work on scaling up training infrastructure and frameworks so that we can train models with hundreds of billions of parameters on trillions of training examples in the most efficient and scalable setup. We work on fundamental modeling improvements that enable cross-lingual and cross-domain transfer learning for hundreds of languages and many downstream tasks. Furthermore, we work on efficient runtime frameworks that enable cost-efficient deployment of such large-scale models to serve production scenarios in a more sustainable manner.
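To put that scale in perspective, here is a rough back-of-the-envelope estimate using the common ≈ 6·N·D FLOPs heuristic for training a dense transformer with N parameters on D tokens. The heuristic, the parameter and token counts, and the hardware figures below are illustrative assumptions, not numbers from the project.

```python
# Rough training-compute estimate via the common FLOPs ~= 6 * N * D
# heuristic for dense transformers (N parameters, D training tokens).
# All numbers are illustrative assumptions, not Z-code's actual figures.
N = 200e9   # "hundreds of billions of parameters" -> assume 200B
D = 1e12    # "trillions of training examples" -> assume 1T tokens
flops = 6 * N * D
print(f"total training compute: ~{flops:.1e} FLOPs")  # ~1.2e+24 FLOPs

# Assuming 312 TFLOP/s peak (A100-class, bf16) at 40% utilization:
seconds = flops / (0.4 * 312e12)
print(f"~{seconds / 86400:,.0f} GPU-days on a single such device")
```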

\"Microsoft
Microsoft AI at Scale Technology Stack<\/figcaption><\/figure>\n\n\n\n

<\/p>\n\n\n\n

<\/p>\n\n\n\n

Z-code, the multilingual representation in XYZ-code, is a general-purpose, pre-trained, multilingual, multi-task text-to-text transformation model. In Z-code, we train the models on multiple tasks at the same time. Because of transfer learning and sharing across similar languages, we have dramatically improved quality, reduced costs, and improved efficiency with less data. We can now use Z-code to improve translation and general natural language understanding tasks, such as multilingual named entity extraction. Z-code helped us deliver our embedded universal language representation regardless of the language people are speaking. As we like to say, Z-code is "born to be multilingual."
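As an illustration of this multi-task, text-to-text setup, here is a minimal sketch of how heterogeneous tasks can all be cast as "source text in, target text out" for a single shared model, with a task tag prepended to the source. The task names, examples, and uniform task sampling below are hypothetical, not the production training recipe.

```python
# A minimal sketch of multi-task, text-to-text training: every task is
# cast as "source text -> target text", and a task tag on the source
# tells one shared model which task each example belongs to.
import random

# Hypothetical toy examples for three tasks sharing one model.
TASKS = {
    "translate en-fr": [("hello world", "bonjour le monde")],
    "ner":             [("Satya leads Microsoft", "Satya PER | Microsoft ORG")],
    "denoise":         [("the <mask> sat", "the cat sat")],
}

def sample_batch(batch_size=4):
    """Sample a mixed batch; the task tag is prepended to the source text."""
    batch = []
    for _ in range(batch_size):
        task = random.choice(list(TASKS))
        src, tgt = random.choice(TASKS[task])
        batch.append((f"[{task}] {src}", tgt))
    return batch

for step in range(3):
    for src, tgt in sample_batch():
        # In a real system, a shared encoder-decoder transformer would
        # consume `src` and be trained to emit `tgt`; here we just print.
        print(f"step {step}: {src!r} -> {tgt!r}")
```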

The model is trained on multiple tasks and multiple data sources to empower several production scenarios across Microsoft. Z-code takes advantage of shared linguistic elements across multiple languages via transfer learning, which applies knowledge from one task to another related task, to improve quality for machine translation and other language understanding tasks. It also helps extend those capabilities beyond the most common languages to underrepresented languages that have less available training data. Z-code comprises a family of models covering both encoder-only models and encoder-decoder generative models. These models empower various production scenarios across Microsoft with multilingual capabilities, including Microsoft Translator and Azure Cognitive Services for Languages, as well as several product scenarios in Teams, Office, and more.
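The page does not spell out how training data is balanced between high-resource and underrepresented languages. A widely used technique in multilingual pre-training is temperature-based sampling, which up-samples low-resource languages; the sketch below shows the idea, and is an assumption rather than Z-code's confirmed recipe.

```python
# A hedged sketch of temperature-based language sampling, a common way
# (not confirmed as Z-code's exact method) to give low-resource languages
# more weight during multilingual training.
def sampling_probs(sizes, temperature=5.0):
    """sizes: {language: number of examples}. Returns sampling probabilities
    proportional to (n_i / N) ** (1 / T); T > 1 flattens the distribution."""
    total = sum(sizes.values())
    weights = {lang: (n / total) ** (1.0 / temperature)
               for lang, n in sizes.items()}
    z = sum(weights.values())
    return {lang: w / z for lang, w in weights.items()}

# Hypothetical corpus sizes: English dwarfs Swahili.
corpus = {"en": 1_000_000_000, "fr": 100_000_000, "sw": 1_000_000}
print(sampling_probs(corpus))                    # T=5 up-samples 'sw' strongly
print(sampling_probs(corpus, temperature=1.0))   # T=1 keeps raw proportions
```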


\"iagram
Diagram of Z-code architecture. Z-code uses transfer learning in two ways. First, the model is trained multilingually across many languages, such that knowledge is transferred between languages. Second, we use multi-task training so that we transfer knowledge between tasks. For example, the machine translation (MT) task can help the natural language understanding task, the masked LM (MLM) task or the denoising autoencoder (DAE) task can help the MT task and so on.<\/figcaption><\/figure>\n\n\n\n
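To make the three co-trained objectives in the diagram concrete, here is a small sketch of how a single parallel sentence pair can yield an MLM, a DAE, and an MT training example for the same shared model. The noising rates and corruption choices are hypothetical illustrations, not the production recipe.

```python
# A minimal sketch (assumed details) of how one parallel sentence pair
# can feed the three co-trained tasks above.
import random

random.seed(0)

def mlm_example(tokens, mask_rate=0.15):
    """Masked LM: hide random tokens; the model reconstructs the originals."""
    masked = [t if random.random() > mask_rate else "<mask>" for t in tokens]
    return " ".join(masked), " ".join(tokens)

def dae_example(tokens, drop_rate=0.1):
    """Denoising autoencoder: corrupt by dropping and shuffling tokens;
    the target is the clean sentence."""
    kept = [t for t in tokens if random.random() > drop_rate]
    random.shuffle(kept)
    return " ".join(kept), " ".join(tokens)

def mt_example(src, tgt):
    """Machine translation: source sentence in, target-language sentence out."""
    return src, tgt

en = "the cat sat on the mat".split()
fr = "le chat est assis sur le tapis"

for name, (inp, out) in {
    "MLM": mlm_example(en),
    "DAE": dae_example(en),
    "MT":  mt_example(" ".join(en), fr),
}.items():
    print(f"{name}: {inp!r} -> {out!r}")
```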

Announcements

What's new in the Text Analytics

Summarize text with Text Analytics API

GA of Text Analytics for health, Opinion Mining, PII and Analyze

Members: Mohamed Afify, Hossam Amer, Raffy Bekheit, Muhammad Elnokrashy, Akiko Eriguchi, Mohamed Gabr, Hany Hassan Awadalla, Amr Hendy, Young Jin Kim, Amr Sharaf, Yiren Wang

Collaborators: Yaobo Liang, Shuming Ma, Furu Wei