29 November 2024
Welcome to our European Union (EU) transparency page, where we publish EU-specific information required by the Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), the Dutch Media Act (DMA), and the Terrorist Content Online Regulation (TCO). Please note that the most up-to-date version of these Transparency Reports can be found in the en-US locale.
Snap Group Limited has appointed Snap B.V. as its Legal Representative for the purposes of the DSA. You can contact the representative at dsa-enquiries [at] snapchat.com for the DSA, at vsp-enquiries [at] snapchat.com for the AVMSD and the DMA, at tco-enquiries [at] snapchat.com for the TCO, through our Support Site [here], or at:
Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands
If you are a law enforcement agency, please follow the steps outlined here.
Please communicate in English or Dutch when contacting us.
For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For the AVMSD and the DMA, we are regulated by the Dutch Media Authority (CvdM). For the TCO, we are regulated by the Netherlands Authority for the Prevention of Online Terrorist Content and Child Sexual Abuse Material (ATKM).
Last Updated: 25 October 2024
We publish this report regarding our content moderation efforts on Snapchat in accordance with the transparency reporting requirements provided in Articles 15, 24 and 42 of the European Union (EU)’s Digital Services Act (Regulation (EU) 2022/2065) (“DSA”). Except where otherwise noted, the information contained in this report is for the reporting period from 1 January 2024 to 30 June 2024 (H1 2024) and covers content moderation on the features of Snapchat that are regulated by the DSA.
We continually strive to improve our reporting. For this reporting period (H1 2024), we have made changes to the structure of our report with new and more differentiated tables to provide improved insight into our content moderation efforts.
As of 1 October 2024, we have 92.9 million average monthly active recipients (“AMAR”) of our Snapchat app in the EU. This means that, on average over the 6-month period ending 30 September 2024, 92.9 million registered users in the EU opened the Snapchat app at least once during a given month.
This figure breaks down by Member State as follows:
These figures were calculated to meet current DSA requirements and should only be relied on for DSA purposes. We have changed how we calculate this figure over time (including in response to changes in internal policy, regulator guidance, and technology), and figures are not intended to be compared between periods. This may also differ from the calculations used for other active-user figures we publish for other purposes.
2. Member State Authority Requests
During this reporting period (H1 2024), we received zero (0) orders to act against specifically identified pieces of illegal content from EU Member States’ authorities, including those issued in accordance with DSA Article 9.
Because this number is zero (0), no breakdown by type of illegal content concerned or by issuing Member State is available, nor are median times to acknowledge receipt of, or give effect to, such orders.
During this reporting period (H1 2024), we received the following orders to disclose user data from EU Member States’ authorities, including those issued in accordance with DSA Article 10:
The median time to inform relevant authorities of the receipt of these orders to provide information was 0 minutes — we provide an automated response confirming receipt.
The median time to give effect to these orders to provide information was ~7 days. This metric reflects the time period from when Snap received an order to when Snap considered the matter to be fully resolved, which in individual cases may depend in part on the speed with which the relevant Member State authority responds to any requests for clarification from Snap necessary to process the order.
Note that we do not provide a breakdown of the above orders to provide information categorized by type of illegal content concerned, because this information is not generally available to us.
All content on Snapchat must adhere to our Community Guidelines and Terms of Service. Certain content must also adhere to additional guidelines and policies. For example, content submitted for algorithmic recommendation to a wider audience on our public broadcast surfaces must meet the additional, higher standards provided in our Content Guidelines for Recommendation Eligibility, while advertisements must comply with our Advertising Policies.
We enforce these policies using technology and human review. We also provide mechanisms for Snapchatters to report violations, including illegal content and activities, directly in-app or through our website. Proactive detection mechanisms and reports prompt a review, which then leverages a mix of automated tools and human moderators to take appropriate action in accordance with our policies.
We provide further information about our content moderation on our public surfaces in H1 2024 below.
In accordance with DSA Article 16, Snap has put into place mechanisms enabling users and non-users to notify Snap of the presence on Snapchat of specific items of information they consider to be illegal content. They can do so by reporting specific pieces of content or accounts, either directly in the Snapchat app or on our website.
During the reporting period (H1 2024), we received the following notices submitted in accordance with DSA Article 16 in the EU:
Below, we provide a breakdown reflecting how these notices were processed – i.e., through a process including human review or solely via automated means:
In submitting notices in-app or through our website, reporters can select a specific reporting reason from a menu of options reflecting the categories of violations listed in our Community Guidelines (e.g., hate speech, drug use or sales). Our Community Guidelines prohibit content and activities that are illegal in the EU, so our reporting reasons largely reflect specific categories of illegal content in the EU. However, to the extent a reporter in the EU believes the content or account they are reporting is illegal for reasons not specifically referenced in our reporting menu, they are able to report it for “other illegal content” and are given an opportunity to explain why they believe what they are reporting is illegal.
If, upon review, we determine that a reported piece of content or account violates our Community Guidelines (including for reasons of illegality), we may (i) remove the offending content, (ii) warn the relevant account holder and apply a strike against the account, and / or (iii) lock the relevant account, as further explained in our Snapchat Moderation, Enforcement, and Appeals Explainer.
In H1 2024, we took the following enforcement actions upon receipt of notices submitted in accordance with DSA Article 16 in the EU:
In H1 2024, all reports for “other illegal content” that we actioned were ultimately enforced under our Community Guidelines because our Community Guidelines prohibited the relevant content or activity. We thus categorized these enforcements under the relevant category of Community Guideline violation in the table above.
In addition to the above enforcements, we may take action on content notified to us in accordance with other applicable Snap policies and guidelines:
With respect to content on our public broadcast surfaces, if we determine that the reported content does not meet the higher standards of our Content Guidelines for Recommendation Eligibility, we may reject the content for algorithmic recommendation (if the content does not meet our eligibility criteria), or we may limit the distribution of the content to exclude sensitive audiences (if the content meets our eligibility criteria for recommendation but is otherwise sensitive or suggestive).
In H1 2024, we took the following actions regarding content on Snapchat’s public broadcast surfaces reported to us in the EU, consistent with our Content Guidelines for Recommendation Eligibility: