{"id":1030,"date":"2022-12-21T16:33:22","date_gmt":"2022-12-21T16:33:22","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/startups\/blog\/?p=1030"},"modified":"2024-11-04T13:55:59","modified_gmt":"2024-11-04T21:55:59","slug":"metaverse-training-stellarx","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/startups\/blog\/metaverse-training-stellarx\/","title":{"rendered":"Metaverse training and AI-driven onboarding – #LaunchWithAI"},"content":{"rendered":"\n

Communication inside the metaverse is immensely challenging. Getting the conversation right, detecting the right emotions, and building empathy – how can a startup break down the technical components and build an end-to-end experience? This week in #LaunchWithAI, we\u2019re talking to Harold Dumur<\/a>, founder and CEO of OVA<\/a>, a member of Microsoft for Startups Founders Hub, about StellarX, a metaverse builder<\/a>.<\/strong><\/em><\/p>\n\n\n\n

What is StellarX?<\/h2>\n\n\n\n
\n

“Welcome. I’m Biz<\/strong>, your virtual (non-human) learning coach.”<\/p>\n<\/blockquote>\n\n\n\n

This introduction comes from an immersive environment created with OVA\u2019s intuitive Metaverse builder StellarX, in collaboration with Desjardins Lab<\/a>.<\/p>\n\n\n\n

The project merges AI with spatial computing, machine learning, and user interface technologies. We created an intuitive, immersive experience to teach and test critical communication skills in a finance and insurance customer service context.<\/p>\n\n\n\n

The call agents interact with a smart virtual assistant through scripted conversations based on real-world situations. The virtual trainer evolves contextually alongside the agents’ professional development.<\/p>\n\n\n

\n
\"Ova<\/figure><\/div>\n\n\n

How does contextual evolution work?<\/h2>\n\n\n\n

We dove into the Natural Language Processing (NLP) world by leveraging Azure Cognitive Services<\/a> and its language-understanding framework, LUIS (Language Understanding)<\/a>. Using Microsoft\u2019s Bot Framework<\/a>, OVA designed a conversational flow that was later integrated into its metaverse-building platform.<\/p>\n\n\n\n
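Production conversational flows like this one are typically authored with Microsoft's Bot Framework SDKs; as a self-contained illustration of the scripted-dialogue idea, the flow can be sketched as a small state machine in Python. All state names, intents, and replies below are hypothetical placeholders, not OVA's actual script.

```python
# Minimal sketch of a scripted conversational flow: each state maps a
# recognized intent to a coach reply and the next dialogue state.
# Intents would come from an NLU service such as LUIS in a real system.

SCRIPT = {
    "greeting": {
        "report_lost_card": ("I understand, let's secure your account first.", "verify_identity"),
        "general_question": ("Sure, what would you like to know?", "answer_question"),
    },
    "verify_identity": {
        "provide_id": ("Thank you, your card is now blocked.", "closing"),
    },
}

def step(state, intent):
    """Advance the dialogue given the current state and a detected intent."""
    reply, next_state = SCRIPT.get(state, {}).get(
        intent,
        ("Could you rephrase that?", state),  # stay in place on an unknown intent
    )
    return reply, next_state

reply, state = step("greeting", "report_lost_card")
```

Keeping the script as data rather than code is what lets a non-programmer author new training scenarios without touching the dialogue engine.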

\n

“The purpose of the virtual assistant is that it follows the call agent into real-world situations through real-time aid. For instance, the virtual assistant can monitor and analyze performance to give valuable feedback,” said Pierre-Luc Lapointe, R&D Director at OVA.<\/p>\n<\/blockquote>\n\n\n\n

AI-driven OVA has spent the last couple of years building a patent-pending visual scripting tool that is now integrated into StellarX. The goal is to let users co-create interactions with an AI. By combining this project\u2019s discoveries, Azure Cognitive Services, and our visual scripting tools, the team is aiming for a system we call \u201cintent-to-code.\u201d Once deployed, this system will let users create interactions in the metaverse through voice commands.<\/p>\n\n\n\n
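The core of an "intent-to-code" pipeline is routing a recognized voice intent, plus its extracted entities, to a code action that edits the Space. Here is a toy dispatch-table sketch of that idea; the intent names, handlers, and scene representation are all made up for illustration and are not OVA's patent-pending tool.

```python
# Hypothetical intent-to-code dispatch: a recognized intent name and its
# entities are routed to a handler that modifies the scene.

scene = []  # stand-in for the metaverse Space's object list

def spawn_object(entities):
    scene.append(entities.get("object", "cube"))
    return f"Spawned {scene[-1]}"

def clear_scene(entities):
    scene.clear()
    return "Scene cleared"

HANDLERS = {"SpawnObject": spawn_object, "ClearScene": clear_scene}

def intent_to_code(intent, entities=None):
    """Map a recognized intent to a scene-editing action."""
    handler = HANDLERS.get(intent)
    if handler is None:
        return "Sorry, I can't do that yet"
    return handler(entities or {})

# e.g. the voice command "add a chair" recognized as SpawnObject {object: chair}
result = intent_to_code("SpawnObject", {"object": "chair"})
```

In a real deployment, the intent and entities would be produced by a speech-to-text step followed by a language-understanding service, with the dispatch table generated from the visual scripting graph.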

This coaching evolution comes from the machine learning algorithms that feed the virtual assistant, allowing it to explore experiential personalization, predictive analytics, and product or service recommendations.<\/p>\n\n\n\n

How crucial is the AI emotional connection?<\/h2>\n\n\n\n

“I lost my credit card” sounds quite different from “I lost my credit card @#$%&!” For a human listener, identifying the emotional intention behind a sentence is easy; teaching a machine to do the same is not. Here, the employee embodies an avatar in a virtual Space and interacts with another avatar belonging to the customer. A third avatar, the AI coach, witnesses the session, gathering data for future feedback.<\/p>\n\n\n\n

In this context, a significant challenge was approaching real-life conversations by identifying emotions. So, how precise can you get when categorizing feelings?<\/p>\n\n\n\n

We needed to build an effective model to automatically label positive, neutral, and negative emotions in a customer-agent relationship context. We chose Azure Cognitive Services for its integration simplicity, reliability, and time savings. We also used its natural language understanding tools to research and develop both the virtual assistant and the intent-to-code features.<\/p>\n\n\n\n
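Azure Cognitive Services exposes sentiment analysis as a hosted API, so the labeling itself happens server-side. To make the three-way labeling task concrete, here is a deliberately toy, lexicon-based stand-in; the cue words are invented for the example and are no substitute for a trained service, but they show the positive/neutral/negative contract the project relied on.

```python
# Toy three-way sentiment labeler standing in for a hosted sentiment API.
# A real deployment would call Azure Cognitive Services instead.

POSITIVE_CUES = {"thanks", "great", "perfect", "appreciate"}
NEGATIVE_CUES = {"angry", "unacceptable", "@#$%&!"}

def label_sentiment(utterance):
    """Return 'positive', 'neutral', or 'negative' for one utterance."""
    tokens = utterance.lower().split()
    score = sum(t in POSITIVE_CUES for t in tokens) - sum(t in NEGATIVE_CUES for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

Note how the article's own example splits under this contract: the plain "I lost my credit card" carries no emotional cue and stays neutral, while the expletive-laden version is flagged negative.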

The precision of the automatic labeling of emotions was based on two distinct contexts:<\/p>\n\n\n\n