Responsible AI Archives | Microsoft Copilot Blog

Transparency and Control in Consumer Data Use
http://approjects.co.za/?big=en-us/microsoft-copilot/blog/2024/08/16/transparency-and-control-in-consumer-data-use/
Fri, 16 Aug 2024

The post Transparency and Control in Consumer Data Use   appeared first on Microsoft Copilot Blog.

Since launching our new generative AI experiences in products like Copilot, we’ve received a tremendous amount of product feedback both on what you’ve enjoyed and what you want more of in the future. We have consistently heard that you expect the next wave of AI experiences to feel seamlessly personalized, effortlessly integrated, and coherent across platforms.  

We’re energized by this feedback and are committed to delivering incredible new AI experiences for everyone. Fundamental to this process is our full commitment to earning your trust through transparency, accountability, and user control. That’s why we’re sharing some upcoming changes in how we will use consumer data, and our approach to ensuring our users are always in control.  

  • We will soon start using consumer data from Copilot, Bing, and Microsoft Start (MSN) (including interactions with advertisements) to help train the generative AI models in Copilot. We are making this change because real-world consumer interactions provide greater breadth and diversity in training data. We believe this will help us build more inclusive, relevant products and improve the experience for all users. For example, our AI models can learn from an aggregated set of thumbs up/thumbs down selections to provide better, more useful responses in the future. Our AI models can learn colloquial phrases or local references from Copilot conversations. Our AI models can also learn from Microsoft advertising segments what, broadly, is most interesting and compelling to consumers, helping the models deliver more relevant content in the future. 
  • We will also make it simple for consumers to opt out of their data being used for training, with clear notices displayed in Copilot, Bing, and Microsoft Start. We will start providing these opt-out controls in October, and we won’t begin training our AI models on this data until at least 15 days after we notify consumers that the opt-out controls are available.  

In short, we will always ask first, and we’ll always put you in control. 

These changes will only apply to consumers who are signed into their Microsoft Account. Consumers will retain existing controls over their data, and there are no changes to the existing options to manage how their data is used to personalize Microsoft services for them. In addition, this change only applies to how we will start training our generative AI models in Copilot, and does not change existing uses of consumer data as outlined in the Microsoft Privacy Statement. 

We will gradually roll this out in different markets to ensure we get this right for consumers and to comply with privacy laws around the world. For example, we will not offer this setting or conduct training on consumer data from the European Economic Area (EEA) until further notice. To learn more, please see our FAQ here.

When we begin training, we will continue to adhere to the following data protection commitments: 

  • Your data will not be used to identify you: Before training these AI models, we will remove information that may identify you, such as names, phone numbers, device or account identifiers, sensitive personal data, physical addresses, and email addresses. 
  • Your data will be kept private: Your data remains private when using our services and is not disclosed without your permission. Generative AI models do not store training data or return it to provide a response, and instead are designed to generate new content. We continuously evaluate our models for privacy and safety, including conducting testing and building filters that screen out previously published or used material. We will protect your personal data as explained in the Microsoft Privacy Statement and in compliance with privacy laws around the world. 
  • Data from minors will not be used for training: We will only train our generative AI models on data from consumers who tell us they are 18 years or older. 
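
As a rough illustration, the de-identification step described in the first commitment can be thought of as a filter that replaces identifying tokens with typed placeholders before any text enters a training set. The patterns and function below are a hypothetical sketch, not Microsoft's actual pipeline:

```python
import re

# Illustrative de-identification sketch. These patterns and names are
# hypothetical examples; a production pipeline would cover many more
# identifier types (names, addresses, device and account IDs, etc.).
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact(text: str) -> str:
    """Replace identifying tokens with typed placeholders like [EMAIL]."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text
```

Regex-based redaction like this catches well-structured identifiers; real systems typically combine it with statistical entity recognition to handle names and free-form addresses.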

There are no changes to how Microsoft manages commercial customers’ data. More information about how we manage commercial customer data can be found here.

We are fully committed to earning your trust. We will remain steadfast in protecting your data and will be open about what we do with it, and why. If we plan additional changes to how we use consumer data to train the generative AI models in Copilot, we will share them transparently and ensure consumers remain in control and can choose whether to allow them.  

Once again, we will always ask first, and we’ll always put you in control. 

Learn about Copilot prompts
https://support.microsoft.com/en-us/topic/learn-about-copilot-prompts-f6c3b467-f07c-4db1-ae54-ffac96184dd5?ocid=copilotlab_smc_articlelearnabout
Tue, 19 Sep 2023

Understand Copilot prompts as instructions directing AI behavior, comprising elements like goals, context, expectations, and sources.

The post Learn about Copilot prompts appeared first on Microsoft Copilot Blog.

Copilot prompts are instructions or questions you use to tell Copilot what you want. Prompts can include four parts: the goal, context, expectations, and source, as described in the following image:  

[Image: visual representation of the prompt framework with examples: goal + context + tone + data]

You can put a little or a lot into a prompt, but all that’s required is a clear goal. If you want to be more specific, add the other parts; you’ll often need more than a goal to get the results you want. Here’s an example prompt in Microsoft 365 chat that includes a goal and a source:  

Write a summary based on all emails from Sam in the past two weeks. 

And here’s an example that includes a goal, context, and expectations: 

Draft an outline of a training manual about time management. Our audience is professionals who work in a hybrid environment and constantly need to attend virtual meetings and meet deadlines. The tone of the document will be friendly and suggestive. 

Most likely, you’ll follow up on the results with another prompt. Expect some back-and-forth conversation to get the results you’re looking for. 
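
The four-part framework above can be sketched as a small helper that joins the optional parts onto the required goal. The function name and layout here are illustrative only, not part of any Copilot API:

```python
# Hypothetical helper illustrating the prompt framework
# (goal + context + expectations + source). Only the goal is required;
# the other parts are optional and simply appended when provided.
def build_prompt(goal: str, context: str = "",
                 expectations: str = "", source: str = "") -> str:
    """Join the optional parts onto the required goal, skipping blanks."""
    parts = [goal, context, expectations, source]
    return " ".join(p.strip() for p in parts if p.strip())

# Mirrors the training-manual example from the article:
prompt = build_prompt(
    goal="Draft an outline of a training manual about time management.",
    context="Our audience is professionals in a hybrid work environment.",
    expectations="Keep the tone friendly and suggestive.",
)
```

A goal-only call like `build_prompt("Create a short presentation about time management.")` is also valid, matching the point that a clear goal alone is a complete prompt.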

What can I get done with Copilot prompts? 

Copilot is built upon Large Language Models (LLMs) that are connected to your Microsoft 365 apps and data. It goes beyond other LLM-powered chatbots by drawing on your Microsoft 365 apps and your internal data, such as articles, reports, emails, presentations, and more. With Copilot, you can create or edit content, ask questions, summarize information, and catch up on things. 

Catch up

To catch up on what happened in a meeting, you can ask Copilot in Teams, “What questions were asked during the meeting?” or “What ideas were presented?”  

Create

Want to create a presentation about time management? Try this prompt with Copilot in PowerPoint:  

Create a short presentation about time management. 

Want to draft a response to an email announcing a project launch? Try this prompt with Copilot in Outlook: 

Write an email to congratulate the project lead and team on the launch. 

Ask

Are you planning a trip? You can ask Copilot, “Give me ideas for a 3-day trip in Hawaii.”  

Or, if you’re a team leader who wants to get team members engaged, try asking Copilot, “Give me ideas for a team building activity.” 

Edit

In Word, you can ask Copilot to edit a paragraph for you by selecting the paragraph and choosing the Copilot icon to “Rewrite with Copilot.”  

Or you can polish a PowerPoint slide with a prompt like, “Add an image of a target with arrows.”  

For enterprise-licensed users, Copilot unlocks business value by connecting LLMs to internal business data. Business customers can use prompts like: 

  • Create a training course outline to onboard partners to Project X.
  • What’s the latest from Sam?
  • Generate a project kick-off presentation based on the topics discussed in the chat. 

There are numerous opportunities with Microsoft Copilot. Find more examples in the Copilot Lab, and adjust them to make them your own.  

A few points to keep in mind 

  • Review and verify responses you get from Copilot. Copilot is built upon Large Language Models (LLMs), advanced tools designed to predict and generate text. Occasionally, Copilot responses can include incorrect content due to the vast and diverse nature of LLMs. Evaluate Copilot’s responses and cross-reference them with trusted sources when needed.
  • Using the same prompt multiple times can result in different responses. As LLMs grow and new information is added, you can get different responses to a prompt you’ve used before.
  • Use Copilot in a respectful, ethical, and lawful manner. Avoid using Copilot for any purpose that might cause harm to yourself or others. See our responsible AI principles and standards.
