25 October 2023
7 February 2024
Welcome to our European Union (EU) transparency page, where we publish EU-specific information required by the EU Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD) and the Dutch Media Act (DMA).
As at 1 August 2023, we have 102 million average monthly active recipients (“AMAR”) of our Snapchat app in the EU. This means that, on average over the last 6 months, 102 million registered users in the EU have opened the Snapchat app at least once during a given month.
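As a purely illustrative reading of that definition, the sketch below averages six hypothetical monthly counts of registered EU users who opened the app at least once in the month. The monthly figures are invented for the example; only the averaging method reflects the definition above.

```python
# Hypothetical illustration of the AMAR calculation described above.
# The monthly figures are invented for this example; only the averaging
# method reflects the definition given in the text.
monthly_active_recipients = [101_000_000, 101_500_000, 102_000_000,
                             102_500_000, 102_000_000, 103_000_000]

amar = sum(monthly_active_recipients) / len(monthly_active_recipients)
print(f"AMAR over the 6-month window: {amar:,.0f}")
```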
This figure breaks down by Member State as follows:
These figures were calculated to meet current DSA rules and should be relied on only for DSA purposes. We may change how we calculate this figure over time, including in response to changing regulator guidance and technology. It may also differ from other active user figures that we publish for other purposes.
Snap Group Limited has appointed Snap B.V. as its Legal Representative. You can contact the representative at dsa-enquiries [at] snapchat.com for the DSA, at vsp-enquiries [at] snapchat.com for the AVMSD and the DMA, through our Support Site [here], or at:
Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands
If you are a law enforcement agency, please follow the steps outlined here.
For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM).
For the AVMSD and the DMA, we are regulated by the Dutch Media Authority (CvdM).
Snap is required by Articles 15, 24 and 42 of the DSA to publish reports containing prescribed information regarding Snap’s content moderation for Snapchat’s services that are considered “online platforms,” i.e., Spotlight, For You, Public Profiles, Maps, Lenses and Advertising. This report must be published every 6 months, from 25 October 2023.
Snap publishes transparency reports twice a year to provide insight into Snap’s safety efforts and the nature and volume of content reported on our platform. Our latest report for H1 2023 (January 1 - June 30) can be found here. That report contains the following information:
Government requests, which includes information and content removal requests;
Content violations, which includes action taken in relation to illegal content and median response time;
Appeals, which are received and handled through our internal complaints handling process.
Those sections are relevant to the information required by Article 15.1(a), (b) and (d) of the DSA. Note that they do not yet contain a complete data set, because the latest report covers H1 2023, which predates the application of the DSA’s reporting requirements.
We provide below some additional information on aspects not covered by our transparency report for H1 2023:
Content Moderation (Article 15.1(c) and (e), Article 42.2)
All content on Snapchat must adhere to our Community Guidelines and Terms of Service, as well as supporting terms, guidelines and explainers. Proactive detection mechanisms and reports of illegal or violating content or accounts prompt a review. At that point, our tooling systems process the request, gather relevant metadata, and route the relevant content to our moderation team via a structured user interface designed to facilitate effective and efficient review operations. When our moderation teams determine, through human review or automated means, that a user has violated our Terms, we may remove the offending content or account, terminate or limit the visibility of the relevant account, and/or notify law enforcement, as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer. Users whose accounts are locked by our safety team for Community Guidelines violations can submit a locked account appeal, and users can appeal certain content enforcements.
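To make that flow concrete, the following is a minimal, purely illustrative sketch of how a reported item might be enriched with metadata, routed to a review queue, and mapped to an enforcement outcome. It is not Snap's actual tooling; all names (ModerationItem, route_for_review, apply_decision) and the enforcement categories shown are hypothetical simplifications of the process described above.

```python
# Illustrative sketch only: a simplified moderation intake-and-routing flow.
# All names and categories are hypothetical and do not represent Snap's systems.
from dataclasses import dataclass, field
from enum import Enum, auto


class Source(Enum):
    PROACTIVE_DETECTION = auto()  # flagged by automated tooling
    USER_REPORT = auto()          # flagged via an in-app report


class Decision(Enum):
    NO_VIOLATION = auto()
    REMOVE_CONTENT = auto()
    LIMIT_ACCOUNT = auto()
    TERMINATE_ACCOUNT = auto()


@dataclass
class ModerationItem:
    content_id: str
    source: Source
    metadata: dict = field(default_factory=dict)  # context gathered at intake


def route_for_review(item: ModerationItem, queues: dict[str, list]) -> None:
    """Place the item on a review queue keyed by the reported policy area."""
    policy_area = item.metadata.get("policy_area", "general")
    queues.setdefault(policy_area, []).append(item)


def apply_decision(item: ModerationItem, decision: Decision) -> str:
    """Translate a reviewer (or automated) decision into an enforcement action."""
    if decision is Decision.NO_VIOLATION:
        return f"{item.content_id}: no action"
    # Certain enforcements can be appealed through a separate flow (not shown).
    return f"{item.content_id}: enforced {decision.name}"


if __name__ == "__main__":
    queues: dict[str, list] = {}
    report = ModerationItem("content-123", Source.USER_REPORT,
                            {"policy_area": "harassment", "reporter_locale": "nl-NL"})
    route_for_review(report, queues)
    print(apply_decision(queues["harassment"].pop(0), Decision.REMOVE_CONTENT))
```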
Automated content moderation tools
On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. These automated tools include the following (a simplified, illustrative sketch appears after this list):
Proactive detection of illegal and violating content using machine learning;
Hash-matching tools (like PhotoDNA and Google’s CSAI Match);
Abusive Language Detection to reject content based on an identified and regularly updated list of abusive keywords, including emojis.
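As a rough illustration of how checks of this kind operate in general, the sketch below pairs a hash lookup with a keyword scan. It is a simplified, hypothetical example rather than Snap's implementation: production tools such as PhotoDNA and CSAI Match rely on perceptual hashing of media, whereas this stand-in uses exact SHA-256 matching, and the keyword list is a placeholder.

```python
# Purely illustrative sketch of two classes of automated checks described above.
# NOT Snap's implementation: real hash-matching tools use perceptual hashing of
# media, while this stand-in uses exact SHA-256 matching; the lists are placeholders.
import hashlib

# Hypothetical, regularly updated blocklists.
KNOWN_ILLEGAL_HASHES = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}
ABUSIVE_KEYWORDS = {"exampleslur", "🔞"}  # keywords and emojis


def matches_known_hash(media_bytes: bytes) -> bool:
    """Exact-hash stand-in for perceptual hash-matching tools."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    return digest in KNOWN_ILLEGAL_HASHES


def contains_abusive_language(text: str) -> bool:
    """Reject content containing any listed abusive keyword or emoji."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in ABUSIVE_KEYWORDS)


def auto_moderate(media_bytes: bytes, caption: str) -> str:
    """Return a simplified auto-moderation outcome for a single post."""
    if matches_known_hash(media_bytes) or contains_abusive_language(caption):
        return "reject"           # blocked before wide distribution
    return "send_to_human_review"  # public content still receives human review


if __name__ == "__main__":
    print(auto_moderate(b"\x89PNG...", "Have a great day! ☀️"))
```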
For the period of our latest Transparency Report (H1 2023), there was no requirement to collate formal indicators or error rates for these automated systems. However, we regularly monitor these systems for issues, and our human moderation decisions are regularly assessed for accuracy.
Human moderation
Our content moderation team operates across the globe, enabling us to help keep Snapchatters safe 24/7. Below, you will find the breakdown of our human moderation resources by the language specialties of moderators (note that some moderators specialize in multiple languages) as of August 2023:
The above numbers fluctuate frequently in line with incoming volume trends and submissions by language and country. In situations where we need additional language support, we use translation services.
Moderators are recruited using a standard job description that includes a language requirement (depending on the need). The language requirement states that the candidate should be able to demonstrate written and spoken fluency in the language and have at least one year of work experience for entry-level positions. Candidates must also meet the educational and background requirements to be considered, and must demonstrate an understanding of current events for the country or region of content moderation they will support.
Our moderation team applies our policies and enforcement measures to help protect our Snapchat community. Training is conducted over a multi-week period, in which new team members are educated on Snap’s policies, tools, and escalation procedures. After the training, each moderator must pass a certification exam before being permitted to review content. Our moderation team regularly participates in refresher training relevant to their workflows, particularly when we encounter policy-borderline and context-dependent cases. We also run upskilling programs, certification sessions, and quizzes to ensure all moderators remain current on and compliant with all updated policies. Finally, when urgent content trends surface based on current events, we quickly disseminate policy clarifications so teams can respond in line with Snap’s policies.
We provide our content moderation team – Snap’s “digital first responders” – with significant support and resources, including on-the-job wellness support and easy access to mental health services.
Content moderation safeguards
We recognise there are risks associated with content moderation, including risks to freedoms of expression and assembly that may be caused by automated and human moderator bias, and by abusive reports, including from governments, political constituencies, or well-organized individuals. Snapchat is generally not a place for political or activist content, particularly in our public spaces.
Nevertheless, to safeguard against these risks, Snap has testing and training in place and has robust, consistent procedures for handling reports of illegal or violating content, including from law enforcement and government authorities. We continually evaluate and evolve our content moderation algorithms. While potential harms to freedom of expression are difficult to detect, we are not aware of any significant issues, and we provide avenues for our users to report mistakes if they occur.
Our policies and systems promote consistent and fair enforcement and, as described above, provide Snapchatters an opportunity to meaningfully dispute enforcement outcomes through Notice and Appeals processes that aim to safeguard the interests of our community while protecting individual Snapchatter rights.
We continually strive to improve our enforcement policies and processes, and we have made great strides in combating potentially harmful and illegal content and activities on Snapchat. This is reflected in the upward trend in our reporting and enforcement figures shown in our latest Transparency Report, and in decreasing prevalence rates for violations on Snapchat overall.
Trusted Flagger Notices (Article 15.1(b))
For the period of our latest Transparency Report (H1 2023), there were no formally appointed Trusted Flaggers under the DSA. As a result, the number of notices submitted by such Trusted Flaggers was zero (0) in this period.
Out-of-court Disputes (Article 24.1(a))
For the period of our latest Transparency Report (H1 2023), there were no formally appointed out-of-court dispute settlement bodies under the DSA. As a result, the number of disputes submitted to such bodies was zero (0) in this period.
Account Suspensions Pursuant to Article 23 (Article 24.1(b))
For the period of our latest Transparency Report (H1 2023), there was no requirement to suspend accounts pursuant to Article 23 of the DSA for the provision of manifestly illegal content, unfounded notices or unfounded complaints. As a result, the number of such suspensions was zero (0). However, Snap takes appropriate enforcement action against accounts, as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer, and information regarding the level of Snap’s account enforcement can be found in our Transparency Report (H1 2023).