
Bing EU Digital Services Act Transparency Report

This report is prepared semi-annually, beginning in October 2023, pursuant to the requirements of the EU Digital Services Act (DSA).

In accordance with the requirements of Regulation (EU) 2022/2065 (the EU Digital Services Act or DSA) for Very Large Online Search Engines, Bing provides the following report on content moderation activities engaged in during the period of July 1 – December 31, 2024.

DSA Article 15(1)(a): Government Orders from Member States

  • During the relevant period, Bing received zero orders from EU Member States’ authorities to act against illegal content provided by recipients of the service.
  • During the relevant period, Bing received zero orders from EU Member States’ authorities requesting specific information about individual recipients of the service.


[1] “Recipients of the service” does not include the owners of websites indexed by the online search engine. DSA Recital 77. Please see Microsoft's CSR Reports Hub for additional information on content removals regarding indexed website content.

DSA Article 15(1)(c): Own-Initiative Content Moderation

  • This section describes activities Bing undertakes to detect and address illegal content, or information in violation of Bing’s terms and conditions, that is provided by recipients of the service, as well as the activities Bing’s generative artificial intelligence (AI) features[2] undertake to address recipient accounts in violation of their terms of use and code of conduct.[3]

    Use of AI-based classifiers on search prompts. Traditional web search begins with a user search query: the input (text, voice, or image) a user sends to Bing from the search bar. Similarly, during the relevant period, a user initiated a search in Bing’s AI-enhanced search experiences by submitting text, voice, images, and/or other enabled queries as input to the AI model, known as a “prompt”; the model then performed the relevant searches of the Bing service and generated a response (or, in the case of Image Creator in Bing, generated an image). Bing uses classifiers (machine learning models that help sort data into labeled classes or categories) and content filters on user search queries and prompts to help mitigate harm or prevent misuse. Examples include requests for information that could potentially lead to users being unexpectedly exposed to self-harm, violence, graphic content, hateful content, or misleading information. Flags from these classifiers may lead to mitigations, such as not returning generated content to the user, diverting the user to a different topic, or redirecting the user from AI-enhanced search to traditional web search. Bing tracks accuracy metrics for these measures, such as precision, recall, error rate, under-blocking, and over-blocking, to help monitor the interventions’ effectiveness and help ensure Bing does not unduly limit free access to information. Accuracy metrics are overseen by human reviewers and based on Bing content policies.

    Automated content detection – Bing Visual Search. Bing’s Visual Search feature allows users to upload an image and search for similar images or ask questions about the image. As part of Microsoft’s longstanding commitment to preventing the spread of child sexual exploitation and abuse imagery (CSEAI), Bing uses hash matching technologies to detect matches of previously identified CSEAI. In the context of the immediate search, the use of these technologies furthers Bing’s goal to avoid inadvertently surfacing potentially harmful web content to users. More broadly, images that have been used as queries in Bing Visual Search may contribute to training Bing’s image-matching algorithms; by scanning images that users attempt to upload, Bing helps to ensure that CSEAI is not included in the Visual Search training data. Please see below for additional details about these processes.

    Bing’s generative AI features enforcement actions. With regard to the Copilot in Bing and Image Creator in Bing features, Bing took action on recipient accounts where violations of the Copilot AI Experiences Terms of Use[4] were found, temporarily or permanently limiting their access to that of an unauthenticated user,[5] which has the same functionality but a limited number of conversation turns. Image Creator in Bing took action on recipient accounts where violations of the Image Creator Terms of Use[6] were found, temporarily or permanently suspending their access to image creation.

    Training and assistance. Human reviewers receive extensive training on our policies, including the rationale behind them and how to apply them accurately and consistently. Decisions are periodically checked to ensure the policies are being applied consistently. Ongoing coaching and training are provided to review teams as legal obligations evolve, new types of harms emerge, or policies otherwise need to adapt. For high-consequence harms, like child sexual exploitation and abuse, specialized teams receive additional focused training. Microsoft provides a program to support the mental and emotional wellbeing of Microsoft employees whose work may bring them into contact with objectionable material. This program provides resources such as one-on-one counseling, monthly education sessions, on-demand small group sessions, virtual community-of-practice gatherings, and access to program manager office hours. Microsoft requires our vendors to provide wellness programs for any vendor employees working with objectionable material.

  • During the relevant period, Bing took voluntary actions to detect and block the upload of 162,608 items of suspected CSEAI content provided by recipients of the service. These items were identified through the use of automated content detection in Bing Visual Search as described above.
  • Bing does not provide capabilities for users to share content or interact with other users on the platform. As such, during the relevant period, Bing did not take measures that affected a recipient’s right to share content or interact with other users on the service due to illegal content or violations of terms and conditions. In the case of Bing’s generative AI features, there are some scenarios in which a user may be suspended or blocked from using those features due to violations of the Terms of Use. Note that in Copilot in Bing[7] (previously known as Bing Chat) and in Image Creator in Bing (previously known as Image Creator from Designer) there is still no ability for users to upload, post, or share content on the service. Account restrictions are not related to actions affecting other users; rather, they relate to users attempting to generate content for their own use that violates relevant terms or codes of conduct.

    During the relevant period,[8] Copilot in Bing temporarily restricted a recipient’s ability to interact with the product (thus limiting the number of turns in a conversation) in 24 instances and permanently restricted a recipient’s ability to interact with the product in 8 instances as a result of violations of the Copilot AI Experiences Terms of Use and Code of Conduct, preventing those recipients from further exploitation of generative AI safety systems. In each of these instances, the violation was due to “jailbreaks”, i.e., attempts to bypass safety systems.

    During the relevant period, Image Creator in Bing (which includes Image Creator services in other Microsoft offerings, such as Microsoft Copilot) temporarily or permanently restricted recipient accounts’ access to Image Creator services in 13,687 instances as a result of violations of the Image Creator Terms of Use. In the relevant period, these actions were taken in response to users attempting to bypass Image Creator in Bing safety systems.

  • [2]

    During the relevant period, Bing’s generative AI features included Copilot in Bing and Image Creator in Bing. Please see Copilot AI Experiences Terms of Use and Image Creator Terms of Use for additional information.

    On October 1, 2024, Microsoft launched a separate consumer service known as Microsoft Copilot, which offers conversational experiences powered by generative AI. As such, Copilot in Bing data referenced in this report covers the period July 1 to September 30, 2024, until the product separation.

    Bing continues to offer generative search features for users as an enhanced search experience.

  • [3] Due to their nature, Bing search and generative AI features do not generally conduct “content moderation” as that term is defined in the Digital Services Act, as content is neither provided by recipients of the service nor hosted by Bing. Search queries, like user prompts, trigger systems that ensure the services work as intended, and the outputs of these systems are not provided by a recipient of the service. Nevertheless, we have provided additional descriptions of how these systems operate.
  • [4] Please see Copilot AI Experiences Terms of Use for additional information.
  • [5] An unauthenticated user is defined as a user who is not logged into their Microsoft account when accessing Copilot features.
  • [6] Please see Image Creator Terms of Use for additional information.
  • [7] See Footnote 2.
  • [8] Following October 1, 2024, Bing's generative AI features no longer provided experiences where account-level enforcement would be applicable. Please see Footnote 2 for more information.
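The accuracy metrics tracked for the classifier interventions described above (precision, recall, error rate, and over-/under-blocking) can be illustrated with a short sketch. This is not Bing's implementation; the function and the sample counts below are hypothetical, assuming human-reviewed outcomes labeled as true/false positives and negatives:

```python
def moderation_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Compute classifier-accuracy metrics from human-reviewed outcomes.

    tp: flagged and truly violating (correct interventions)
    fp: flagged but benign (over-blocking)
    fn: violating but not flagged (under-blocking)
    tn: benign and not flagged
    """
    total = tp + fp + fn + tn
    return {
        "precision": tp / (tp + fp),      # share of interventions that were correct
        "recall": tp / (tp + fn),         # share of violations that were caught
        "error_rate": (fp + fn) / total,  # overall mistake rate
        "over_blocking": fp / total,      # benign content wrongly blocked
        "under_blocking": fn / total,     # violations missed
    }

# Hypothetical review sample: 90 correct interventions, 10 over-blocks,
# 5 misses, 895 correct allows.
m = moderation_metrics(tp=90, fp=10, fn=5, tn=895)
```

Monitoring over-blocking and under-blocking separately reflects the balance the report describes: catching violations without unduly limiting free access to information.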

DSA Article 15(1)(d): Appeals

  • During the relevant period:

    • Bing received zero appeals of the types of decisions described above;

    • Copilot in Bing received zero appeals of the types of decisions described above, noting that the separate Microsoft Copilot service launched on October 1, 2024;

    • Image Creator in Bing (which includes Image Creator services in other Microsoft offerings, such as Microsoft Copilot) received 6,591 appeals[9] of the types of decisions described above.
  • [9] The number of appeals for Image Creator in Bing is represented at the global level. A recipient whose account access to Image Creator in Bing features was suspended as a result of violations of the Image Creator Terms of Use may have appealed the suspension multiple times.

DSA Article 15(1)(e): Automated Content Detection

  • Automated content detection – Bing Visual Search. As described above, Bing relies on the hash-matching technologies PhotoDNA and MD5 to detect imagery matching hashes obtained from the Internet Watch Foundation (“IWF”) and the US National Center for Missing and Exploited Children (“NCMEC”), which maintains its own databases and manages a database that enables industry hash sharing. When a user submits a Visual Search query, the image is scanned; if it matches a hash of known CSEAI imagery, the image is prevented from entering Microsoft systems, no search results are returned, and available information is reported to law enforcement. This is one element of Microsoft’s overall commitment to preventing the spread of CSEAI, as described more fully in its Digital Safety Content Report and other public announcements.

    Hash-matching technology works by using a mathematical algorithm to create a unique signature (known as a “hash”) for digital images and videos. The hashing technology then compares the hashes generated from content provided by the recipient of the service with hashes of reported (known) CSEAI, in a process called “hash matching”.

    A layered approach to detection of CSEAI is applied in this context, combining both hash-matching technology and manual review. Microsoft implements its own hash verification process in which Microsoft-trained analysts review and confirm images associated with hashes provided from non-profits and other industry partners. Microsoft also implements an additional manual review process as an ongoing hash quality check. Reversal rates of the initial content moderation decision (for example, on appeal) are tracked, as a reflection of Microsoft’s application of hash-matching technology.
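The hash-matching flow described above can be sketched in outline. This is an illustrative approximation only: PhotoDNA is a proprietary perceptual hash that tolerates minor image alterations, whereas the sketch below uses exact MD5 digests; the known-hash set and function names are hypothetical stand-ins:

```python
import hashlib

# Hypothetical set of hex digests standing in for hashes of verified
# imagery obtained from partners such as IWF and NCMEC. (The digest
# below is the MD5 of the bytes b"hello", used purely as a placeholder.)
KNOWN_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",
}

def md5_digest(image_bytes: bytes) -> str:
    """Create the unique signature ("hash") of an image's raw bytes."""
    return hashlib.md5(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    """Hash matching: compare the query image's hash against known hashes."""
    return md5_digest(image_bytes) in KNOWN_HASHES

def handle_visual_search_upload(image_bytes: bytes) -> dict:
    """If the query image matches a known hash, block ingestion and
    return no results, mirroring the flow described in the report."""
    if is_known_match(image_bytes):
        return {"blocked": True, "results": []}
    return {"blocked": False, "results": ["...run visual search..."]}
```

In practice, exact digests like MD5 only catch byte-identical copies, which is why it is paired with perceptual hashing (PhotoDNA) and the manual verification layers described above.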

DSA Article 42(3): Information on monthly active users

  • Information about the average monthly active users of the Bing service in the European Union is published semi-annually. The most recent information is available on the EU Digital Services Act information page and reports approximately 129 million average monthly users in the EU during the six-month period ending December 31, 2024. The table below details the monthly active users for each EU Member State during this period. Note that these numbers may include an overlap in recipients of the service who accessed Bing from multiple Member States during the relevant time period.

Average monthly active users (MAU) for each EU Member State

EU Member State      Average MAU (million)
Austria              3.4
Belgium              4.5
Bulgaria             1.0
Croatia              0.8
Cyprus               0.3
Czech Republic       3.4
Denmark              2.3
Estonia              0.4
Finland              2.0
France               20.9
Germany              28.5
Greece               1.8
Hungary              2.1
Ireland              2.6
Italy                12.4
Latvia               0.5
Lithuania            0.8
Luxembourg           0.3
Malta                0.2
Netherlands          8.6
Poland               9.7
Portugal             3.1
Romania              2.4
Slovak Republic      1.1
Slovenia             0.6
Spain                12.0
Sweden               4.0

This information was compiled pursuant to the Digital Services Act and thus may differ from other user metrics published by Bing.
