{"id":1068003,"date":"2024-08-14T06:39:23","date_gmt":"2024-08-14T13:39:23","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=1068003"},"modified":"2024-08-14T09:29:33","modified_gmt":"2024-08-14T16:29:33","slug":"african-languages-research","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/african-languages-research\/","title":{"rendered":"African Languages Research"},"content":{"rendered":"
\n\t
\n\t\t
\n\t\t\t\"Creatives\t\t<\/div>\n\t\t\n\t\t
\n\t\t\t\n\t\t\t
\n\t\t\t\t\n\t\t\t\t
\n\t\t\t\t\t\n\t\t\t\t\t
\n\t\t\t\t\t\t
\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\tMicrosoft Research Africa\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n

African Languages Research<\/h1>\n\n\n\n
Bridging technological and community divides by enriching Generative AI research with the rich diversity of African languages. Join us in shaping a more inclusive digital future.

Microsoft Research Africa is pioneering advances in Generative AI through its African Languages Research initiative, which focuses on advancing Large Language Models (LLMs) for African languages. Our goal is to improve language accessibility and ensure AI tools can engage effectively with Africa's linguistic diversity, supporting both language preservation and the integration of African languages into the digital world.

Current work

The deployment of Large Language Models (LLMs) in real-world applications presents both opportunities and challenges, particularly in multilingual and code-mixed communication settings. At MSR Africa, we are currently evaluating the performance of leading LLMs, including OpenAI's GPT series and models from Meta, Mistral AI, and Google. Our study uses a distinctive WhatsApp dataset originally collected as part of a health research project run by the University of Washington's Global Health Department. It comprises multilingual conversations in English, Swahili, Sheng, and code-mixed speech among young people living with HIV in informal settlements in Nairobi, Kenya, captured within two health-focused WhatsApp groups moderated by a medical facilitator. We assess how these LLMs handle culturally nuanced, multilingual, and code-mixed communication on a sentiment analysis task, with the goal of supporting the facilitator, for instance by flagging negative messages in the two chats.
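The flagging workflow described above can be sketched as follows. Here `classify_sentiment` is abstracted into a pluggable callable: in the study that role is played by an LLM prompted for sentiment, while the keyword heuristic below is a purely illustrative stand-in, not the models under evaluation.

```python
from typing import Callable, Iterable


def flag_negative_messages(
    messages: Iterable[str],
    classify: Callable[[str], str],
) -> list[str]:
    """Return the messages a classifier labels 'negative'.

    `classify` maps one message to a sentiment label; any callable
    with this signature works, including an LLM-backed one.
    """
    return [m for m in messages if classify(m) == "negative"]


# Illustrative stand-in classifier (NOT the study's model): a crude
# keyword check over mixed English/Swahili text.
NEGATIVE_HINTS = {"sad", "hopeless", "sina nguvu"}  # "sina nguvu" ~ "I have no strength"


def toy_classifier(message: str) -> str:
    text = message.lower()
    return "negative" if any(hint in text for hint in NEGATIVE_HINTS) else "neutral"


chat = [
    "Leo niko poa, dawa zinanisaidia",       # "I'm fine today, the meds are helping"
    "I feel hopeless about my clinic visit",
]
flagged = flag_negative_messages(chat, toy_classifier)
```

In practice the classifier would be swapped for an LLM call per message; the surrounding flagging logic stays the same.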

Internal collaboration with MSR India

In collaboration with Microsoft Research Lab – India, we introduced MEGAVERSE (NAACL 2024 publication), an extension of the MEGA framework (EMNLP 2023 publication), to evaluate the non-English and multimodal capabilities of state-of-the-art LLMs. Covering 83 languages and several multimodal datasets, our research highlights the advantage of larger models on low-resource languages, including African languages, and addresses the critical issue of data contamination in multilingual evaluations.
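At its core, benchmarking many models across many languages reduces to aggregating per-example correctness into per-(model, language) scores. A minimal sketch of that aggregation follows; the record shape and names are illustrative assumptions, not the MEGAVERSE schema.

```python
from collections import defaultdict


# Hypothetical evaluation records: (model, language, correct) triples
# produced by scoring model outputs against gold labels.
results = [
    ("model-a", "swahili", True),
    ("model-a", "swahili", False),
    ("model-a", "yoruba", True),
    ("model-b", "swahili", True),
    ("model-b", "yoruba", False),
]


def accuracy_by_model_language(records):
    """Aggregate boolean correctness into per-(model, language) accuracy."""
    totals = defaultdict(lambda: [0, 0])  # key -> [num_correct, num_seen]
    for model, lang, correct in records:
        bucket = totals[(model, lang)]
        bucket[0] += int(correct)
        bucket[1] += 1
    return {key: correct / seen for key, (correct, seen) in totals.items()}


scores = accuracy_by_model_language(results)
```

Slicing results this way is what makes per-language gaps, such as weaker performance on low-resource African languages, visible at a glance.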

External collaboration with Masakhane NLP

MSR Africa has partnered with the Masakhane NLP community to advance research and development in machine translation for under-resourced African languages. Our joint efforts focus on improving the accessibility, accuracy, and cultural relevance of language technologies across the continent. Highlights of our key projects:

    \n
1. AfriCOMET development: Developed a novel evaluation metric, AfriCOMET, and introduced AfroXLM-R, improving the evaluation and accuracy of translations for 13 under-resourced African languages (NAACL 2024 publication).
2. Pre-trained model adaptation: Created a new African news corpus to adapt pre-trained language models to 16 African languages, enhancing their utility with fine-tuned, high-quality translation data (NAACL 2022 publication).
3. Oshiwambo language resources: Developed the largest Oshindonga–English parallel corpus to date, promoting both technological advancement and cultural preservation through improved machine translation capabilities (AfricaNLP workshop at ICLR 2022 publication).
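Reference-based MT evaluation, the task that learned metrics such as AfriCOMET improve upon, can be illustrated with a minimal character n-gram F-score in the spirit of chrF. This toy version uses a single n-gram order and is not the AfriCOMET implementation, which is a trained neural metric.

```python
from collections import Counter


def char_ngrams(text: str, n: int) -> Counter:
    """Count character n-grams, spaces included."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))


def chr_fscore(hypothesis: str, reference: str, n: int = 3, beta: float = 2.0) -> float:
    """Toy chrF-style score: F-beta over character n-gram overlap.

    Character n-grams make the score usable for morphologically rich
    languages without language-specific tokenization.
    """
    hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
    overlap = sum((hyp & ref).values())
    if not overlap:
        return 0.0
    precision = overlap / sum(hyp.values())
    recall = overlap / sum(ref.values())
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```

Identical hypothesis and reference score 1.0 and fully disjoint strings score 0.0; surface-overlap metrics like this are exactly where trained metrics such as AfriCOMET add value, by also rewarding meaning-preserving translations that differ in wording.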
Project start: 2022-01-10. Related researchers: Jacki O'Neill and Millicent Ochieng, Microsoft Research Africa.