25 April 2024
Welcome to our European Union (EU) transparency page, where we publish EU-specific information required by the Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), the Dutch Media Act (DMA) and the Terrorist Content Online Regulation (TCO). Please note that the most up-to-date version of these Transparency Reports can be found in the en-US locale.
Snap Group Limited has appointed Snap B.V. as its Legal Representative for purposes of the DSA. You can contact the representative at dsa-enquiries [at] snapchat.com for the DSA, at vsp-enquiries [at] snapchat.com for the AVMSD and the DMA, and at tco-enquiries [at] snapchat.com for the TCO, through our Support Site [here], or at:
Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands
If you are a law enforcement agency, please follow the steps outlined here.
Please communicate in English or Dutch when contacting us.
For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For the AVMSD and the DMA, we are regulated by the Dutch Media Authority (CvdM). For the TCO, we are regulated by the Netherlands Authority for the Prevention of Online Terrorist Content and Child Sexual Abuse Material (ATKM).
Snap is required by Articles 15, 24 and 42 of the DSA to publish reports containing prescribed information regarding Snap's content moderation for Snapchat's services that are considered "online platforms", i.e. Spotlight, For You, Public Profiles, Maps, Lenses and Advertising. This report must be published every six months, beginning 25 October 2023.
Snap publishes transparency reports twice a year to provide insight into Snap's safety efforts and the nature and volume of content reported on our platforms. Our latest report for H2 2023 (1 July – 31 December) can be found here (with updates to our Average Monthly Active Recipient figures as of 1 August 2024 – see the bottom of this page). Metrics specific to the Digital Services Act can be found on this page.
As of 31 December 2023, we have 90.9 million average monthly active recipients (“AMAR”) of our Snapchat app in the EU. This means that, on average over the last six months, 90.9 million registered users in the EU have opened the Snapchat app at least once during a given month.
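For illustration only, the sketch below shows one way such an average could be computed from app-open events; the event schema, the truncated country list, and the aggregation logic are assumptions made for this example and do not reflect Snap's actual methodology:

```python
from collections import defaultdict
from datetime import date

# Hypothetical app-open events: (user_id, country, opened_on).
events = [
    ("u1", "NL", date(2023, 7, 3)),
    ("u1", "NL", date(2023, 7, 9)),   # same user and month: counted once
    ("u2", "FR", date(2023, 8, 14)),
    ("u3", "DE", date(2023, 12, 30)),
]

EU_COUNTRIES = {"NL", "FR", "DE"}  # truncated for brevity

def average_monthly_active_recipients(events, months):
    """Average, across the given (year, month) window, of the number of
    distinct EU users who opened the app at least once in that month."""
    active = defaultdict(set)
    for user_id, country, opened_on in events:
        if country in EU_COUNTRIES:
            active[(opened_on.year, opened_on.month)].add(user_id)
    return sum(len(active[m]) for m in months) / len(months)

h2_2023 = [(2023, m) for m in range(7, 13)]  # July through December 2023
print(average_monthly_active_recipients(events, h2_2023))  # 0.5 for this demo
```

Each user is counted at most once per month, so repeated opens within a month do not inflate the figure.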
This figure breaks down by Member State as follows:
These figures were calculated to meet current DSA rules and should only be relied on for DSA purposes. We have changed how we calculate this figure over time, including in response to changing internal policy, regulator guidance and technology, and figures are not intended to be compared between periods. This may also differ from the calculations used for other active user figures we publish for other purposes.
Takedown Requests
During this period, we have received 0 takedown requests from EU member states pursuant to DSA Article 9.
Information Requests
During this period, we have received the following information requests from EU member states:
The median turnaround time to inform authorities of receipt of Information Requests is 0 minutes – we provide an automated response confirming receipt. The median turnaround time to give effect to Information Requests is ~10 days. This metric reflects the time period from when Snap receives an IR to when Snap believes the request is fully resolved. In some cases, the length of this process depends in part on the speed with which law enforcement responds to any requests for clarification from Snap necessary to process their request.
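As a rough sketch of how such a median could be derived from receipt and resolution timestamps (the record structure and values below are illustrative assumptions, not real data):

```python
from datetime import datetime
from statistics import median

# Hypothetical information-request records with receipt and resolution
# timestamps; the fields and values are assumptions for illustration.
requests = [
    {"received": datetime(2023, 7, 1, 9, 0),  "resolved": datetime(2023, 7, 9, 17, 0)},
    {"received": datetime(2023, 8, 2, 8, 30), "resolved": datetime(2023, 8, 14, 12, 0)},
    {"received": datetime(2023, 9, 5, 14, 0), "resolved": datetime(2023, 9, 15, 10, 0)},
]

# Turnaround in days, measured from receipt of the request to full resolution.
turnarounds = [(r["resolved"] - r["received"]).total_seconds() / 86400
               for r in requests]
print(f"median turnaround: {median(turnarounds):.1f} days")  # 9.8 days here
```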
All content on Snapchat must adhere to our Community Guidelines and Terms of Service, as well as supporting terms, guidelines and explainers. Proactive detection mechanisms and reports of illegal or violating content or accounts prompt a review. At that point, our tooling processes the request, gathers relevant metadata, and routes the relevant content to our moderation team via a structured user interface designed to facilitate effective and efficient review operations. When our moderation teams determine, either through human review or automated means, that a user has violated our Terms, we may remove the offending content or account, terminate or limit the visibility of the relevant account, and/or notify law enforcement as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer. Users whose accounts are locked by our safety team for Community Guidelines violations can submit a locked account appeal, and users can appeal certain content enforcements.
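In outline, the routing step described above can be pictured as a small triage function; the report fields, queue names, and enforcement actions in this sketch are assumptions invented for the example, not Snap's internal tooling:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Action(Enum):
    """Possible enforcement outcomes of a review."""
    NO_ACTION = auto()
    REMOVE_CONTENT = auto()
    LIMIT_ACCOUNT_VISIBILITY = auto()
    TERMINATE_ACCOUNT = auto()

@dataclass
class Report:
    """A notice about a piece of content, enriched with metadata."""
    content_id: str
    reason: str                       # e.g. "illegal" or "guideline_violation"
    metadata: dict = field(default_factory=dict)

def route_to_queue(report: Report) -> str:
    """Gather metadata and pick a review queue based on the report reason."""
    report.metadata["routed"] = True
    return "priority_review" if report.reason == "illegal" else "standard_review"

report = Report("c123", "illegal")
print(route_to_queue(report))          # priority_review
decision = Action.REMOVE_CONTENT       # e.g. the outcome of human review
print(decision.name)                   # REMOVE_CONTENT
```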
Content and Account Notices (DSA Article 15.1(b))
Snap has put into place mechanisms to allow users and non-users to notify Snap of content and accounts violating our Community Guidelines and Terms of Service on the platform, including those they consider illegal pursuant to DSA Article 16. These reporting mechanisms are available in the app itself (i.e. directly from the piece of content) and on our website.
During the relevant period, we received the following content and account notices in the EU:
In H2’23, we handled 664,896 notices solely via automated means. All of these were enforced under our Community Guidelines, which encompass illegal content.
In addition to user-generated content and accounts, we moderate advertisements if they violate our platform policies. Below are the total numbers of ads that were reported and removed in the EU.
Trusted Flagger Notices (Article 15.1(b))
For the period of our latest Transparency Report (H2 2023), there were no formally appointed Trusted Flaggers under the DSA. As a result, the number of notices submitted by such Trusted Flaggers was zero (0) in this period.
Proactive Content Moderation (Article 15.1(c))
During the relevant period, Snap enforced against the following content and accounts in the EU as a result of content moderation undertaken at its own initiative:
All of Snap’s own-initiative moderation efforts involve humans, automation, or both. On our public content surfaces, content generally passes through both automated moderation and human review before it is eligible for distribution to a wide audience. Our automated tools include:
Proactive detection of illegal and violating content using machine learning;
Hash-matching tools (such as PhotoDNA and Google's CSAI Match);
Abusive language detection to reject content based on an identified and regularly updated list of abusive keywords, including emojis (a rough sketch of these automated approaches follows this list).
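As a hedged illustration of how the hash-matching and keyword-based mechanisms above can work in principle: in the minimal sketch below, an exact SHA-256 digest stands in for perceptual fingerprints such as PhotoDNA, and the deny-list entries are placeholders; none of this reflects Snap's actual integrations.

```python
import hashlib
import re

# Hash matching: an exact digest stands in for perceptual fingerprints.
# The digest below is SHA-256 of empty input, kept only for the demo.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_known_hash(media: bytes) -> bool:
    """True if the media's digest appears in the known-hash set."""
    return hashlib.sha256(media).hexdigest() in KNOWN_BAD_HASHES

# Abusive-language detection over a deny-list, including emojis.
# The entries are placeholders, not a real moderation list.
ABUSIVE_TERMS = {"badword1", "badword2", "🔪"}
_words = [t for t in ABUSIVE_TERMS if t.isascii()]
_emojis = [t for t in ABUSIVE_TERMS if not t.isascii()]
_word_re = re.compile(r"\b(" + "|".join(map(re.escape, _words)) + r")\b",
                      re.IGNORECASE)

def should_reject(text: str) -> bool:
    """True if the text contains any deny-listed word or emoji."""
    return bool(_word_re.search(text)) or any(e in text for e in _emojis)

print(matches_known_hash(b""))                    # True (demo digest)
print(should_reject("this contains badword1"))    # True
print(should_reject("a friendly message 🙂"))     # False
```

In practice, perceptual hashes are preferred for media matching precisely because they tolerate re-encoding, cropping and resizing, which an exact digest like SHA-256 does not.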
Appeals (Article 15.1(d))
During the relevant period, Snap processed the following content and account appeals in the EU via its internal complaint-handling systems: