European Union
1 July 2023 – 31 December 2023

Released:

25 April 2024

Updated:

25 April 2024

Welcome to our European Union (EU) transparency page, where we publish EU-specific information required by the Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), the Dutch Media Act (DMA) and the Terrorist Content Online Regulation (TCO). Please note that the most up-to-date version of these Transparency Reports can be found in the en-US locale.

Legal Representative 

Snap Group Limited has appointed Snap B.V. as its Legal Representative for the purposes of the DSA. You can contact the representative at dsa-enquiries [at] snapchat.com for the DSA, at vsp-enquiries [at] snapchat.com for the AVMSD and the DMA, and at tco-enquiries [at] snapchat.com for the TCO, through our Support Site [here], or at:

Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands

If you are a law enforcement agency, please follow the steps outlined here.

Please communicate in English or Dutch when contacting us.

Regulatory Authorities

For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For the AVMSD and the DMA, we are regulated by the Dutch Media Authority (CvdM). For the TCO, we are regulated by the Netherlands Authority for the Prevention of Online Terrorist Content and Child Sexual Abuse Material (ATKM).

DSA Transparency Report

Snap is required by Articles 15, 24 and 42 of the DSA to publish reports containing prescribed information regarding Snap's content moderation for Snapchat's services that are considered "online platforms", i.e. Spotlight, For You, Public Profiles, Maps, Lenses and Advertising. These reports must be published every six months, starting from 25 October 2023.

Snap publishes transparency reports twice a year to provide insight into Snap's safety efforts and the nature and volume of content reported on our platforms. Our latest report, for H2 2023 (1 July – 31 December), can be found here (with updates to our Average Monthly Active Recipient figures as of 1 August 2024 – see the bottom of this page). Metrics specific to the Digital Services Act can be found on this page.

Average Monthly Active Recipients 
(DSA Articles 24.2 and 42.3)

As of 31 December 2023, we have 90.9 million average monthly active recipients (“AMAR”) of our Snapchat app in the EU. This means that, on average over the last six months, 90.9 million registered users in the EU have opened the Snapchat app at least once during a given month.

This figure breaks down by Member State as follows:

These figures were calculated to meet current DSA requirements and should be relied on only for DSA purposes. We have changed how we calculate this figure over time, including in response to changes in internal policy, regulator guidance and technology, and the figures are not intended to be compared between periods. They may also differ from the calculations used for other active-user figures we publish for other purposes.


Member State Authority Requests
(DSA Article 15.1(a))

Takedown Requests 

During this period, we received 0 takedown requests from EU Member States pursuant to DSA Article 9.

Information Requests 

During this period, we received the following information requests from EU Member States:

The median turnaround time to inform authorities of receipt of Information Requests is 0 minutes – we provide an automated response confirming receipt. The median turnaround time to give effect to Information Requests is ~10 days. This metric reflects the time period from when Snap receives an IR to when Snap believes the request is fully resolved. In some cases, the length of this process depends in part on the speed with which law enforcement responds to any requests for clarification from Snap necessary to process their request.

Content Moderation 


All content on Snapchat must adhere to our Community Guidelines and Terms of Service, as well as supporting terms, guidelines and explainers. Proactive detection mechanisms and reports of illegal or violating content or accounts prompt a review. At that point, our tooling systems process the request, gather relevant metadata, and route the relevant content to our moderation team via a structured user interface designed to facilitate effective and efficient review operations. When our moderation teams determine, either through human review or automated means, that a user has violated our Terms, we may remove the offending content or account, terminate or limit the visibility of the relevant account, and/or notify law enforcement as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer. Users whose accounts are locked by our safety team for Community Guidelines violations can submit a locked account appeal, and users can appeal certain content enforcements.
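To make the flow described above concrete, the following is a minimal, hypothetical Python sketch of how a notice might be enriched with metadata and routed to a review queue. The class, field and queue names are illustrative assumptions and do not describe Snap's actual tooling.

```python
# Hypothetical sketch of a report-intake flow: a notice (user report or
# proactive detection) is enriched with metadata and routed to a review queue.
# Names and fields are illustrative only; they do not reflect Snap's systems.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ContentReport:
    content_id: str
    reason: str                     # e.g. "harassment", "illegal_content"
    source: str                     # "user_report", "proactive_detection", ...
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    metadata: dict = field(default_factory=dict)

def enrich_with_metadata(report: ContentReport) -> ContentReport:
    # Gather context a reviewer needs (placeholder values here).
    report.metadata.update({
        "surface": "spotlight",     # which public surface the content appeared on
        "prior_strikes": 0,         # prior enforcement history of the account
    })
    return report

def route_to_queue(report: ContentReport) -> str:
    # Route higher-severity reasons to a specialised queue for faster review.
    high_severity = {"illegal_content", "csea", "threats_of_violence"}
    return "priority_review" if report.reason in high_severity else "standard_review"

# Example usage
report = enrich_with_metadata(ContentReport("content-123", "harassment", "user_report"))
print(route_to_queue(report))  # -> "standard_review"
```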

Content and Account Notices (DSA Article 15.1(b))

Snap has put in place mechanisms that allow users and non-users to notify Snap of content and accounts violating our Community Guidelines and Terms of Service on the platform, including those they consider illegal pursuant to DSA Article 16. These reporting mechanisms are available in the app itself (i.e. directly from the piece of content) and on our website.

During the relevant period, we received the following content and account notices in the EU:

In H2 2023, we handled 664,896 notices solely via automated means. All of these were enforced under our Community Guidelines, which encompass illegal content.

In addition to user-generated content and accounts, we moderate advertisements if they violate our platform policies. Below are the totals of ads that were reported and removed in the EU.

Trusted Flagger Notices (Article 15.1(b))

For the period of our latest Transparency Report (H2 2023), there were no formally appointed Trusted Flaggers under the DSA. As a result, the number of notices submitted by such Trusted Flaggers was zero (0) in this period.

Proactive Content Moderation (Article 15.1(c))

During the relevant period, Snap enforced against the following content and accounts in the EU after engaging in content moderation at its own initiative:

All of Snap’s own-initiative moderation efforts leveraged humans or automation. On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. With regard to automated tools, these include:

  • Proactive detection of illegal and violating content using machine learning;

  • Hash-matching tools (such as PhotoDNA and Google's CSAI Match);

  • Abusive Language Detection to reject content based on an identified and regularly updated list of abusive keywords, including emojis.


Appeals (Article 15.1(d))

During the relevant period, Snap processed the following content and account appeals in the EU via its internal complaint-handling systems:


* Stopping child sexual exploitation is a top priority. Snap devotes significant resources toward this and has zero tolerance for such conduct.  Special training is required to review CSE appeals, and there is a limited team of agents who handle these reviews due to the graphic nature of the content.  During the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements, and we have addressed these inconsistencies through agent re-training and rigorous quality assurance. We expect that the next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcements. 

Automated means for content moderation (Article 15.1(e))

On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. With regard to automated tools, these include:

  • Proactive detection of illegal and violating content using machine learning;

  • Hash-matching tools (such as PhotoDNA and Google's CSAI Match);

  • Abusive Language Detection to reject content based on an identified and regularly updated list of abusive keywords, including emojis (see the illustrative sketch after this list).
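
As a simplified illustration of the keyword-based approach in the last bullet, here is a minimal, hypothetical Python sketch that rejects content against a maintained list of abusive terms and emojis. The list contents and matching rules are assumptions for illustration only, not Snap's actual detection system.

```python
# Hypothetical sketch of keyword/emoji-based abusive language detection.
# The blocklist and matching rules are simplified assumptions, not Snap's
# actual (regularly updated) detection list or logic.
import re

ABUSIVE_TERMS = {"exampleslur1", "exampleslur2"}   # placeholder terms
ABUSIVE_EMOJIS = {"\U0001F595"}                    # placeholder emoji

def is_abusive(text: str) -> bool:
    # Tokenise on word boundaries for keyword checks, case-insensitively.
    words = set(re.findall(r"[\w']+", text.lower()))
    if words & ABUSIVE_TERMS:
        return True
    # Emojis are matched as raw characters anywhere in the text.
    return any(emoji in text for emoji in ABUSIVE_EMOJIS)

# Content matching the list would be rejected before wide distribution.
print(is_abusive("hello world"))  # -> False
```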


The accuracy of automated moderation technologies for all harms was approximately 96.61% and the error rate was approximately 3.39%.
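
To illustrate how the two figures above relate, here is a minimal, hypothetical sketch that derives accuracy and error rate from a sample of automated decisions re-reviewed by humans. The sample-based approach and the numbers are assumptions chosen to mirror the reported percentages, not a description of Snap's measurement methodology.

```python
# Hypothetical sketch: accuracy as the share of sampled automated decisions
# confirmed by human re-review, with error rate as its complement.
# The sample numbers below are made up for illustration.
sampled_decisions = 10_000
confirmed_correct = 9_661

accuracy = confirmed_correct / sampled_decisions   # 0.9661 -> ~96.61%
error_rate = 1 - accuracy                          # 0.0339 -> ~3.39%
print(f"accuracy={accuracy:.2%}, error_rate={error_rate:.2%}")
```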


Content Moderation Safeguards (Article 15.1(e))

We recognise there are risks associated with content moderation, including risks to freedoms of expression and assembly that may be caused by automated and human moderator bias and by abusive reports, including from governments, political constituencies, or well-organised individuals. Snapchat is generally not a place for political or activist content, particularly in our public spaces.


Nevertheless, to safeguard against these risks, Snap has testing and training in place and has robust, consistent procedures for handling reports of illegal or violating content, including from law enforcement and government authorities. We continually evaluate and evolve our content moderation algorithms. While potential harms to freedom of expression are difficult to detect, we are not aware of any significant issues, and we provide avenues for our users to report mistakes if they occur.


Our policies and systems promote consistent and fair enforcement and, as described above, provide Snapchatters an opportunity to meaningfully dispute enforcement outcomes through Notice and Appeals processes that aim to safeguard the interests of our community while protecting individual Snapchatter rights.

We continually strive to improve our enforcement policies and processes and have made great strides in combating potentially harmful and illegal content and activities on Snapchat. This is reflected in an upward trend in our reporting and enforcement figures shown in our latest Transparency Report and decreasing prevalence rates for violations on Snapchat overall.


Out-of-Court Settlements (Article 24.1(a))

For the period of our latest Transparency Report (H2 2023), there were no formally appointed out-of-court dispute settlement bodies under the DSA. As a result, the number of disputes submitted to such bodies was zero (0) in this period, and we are unable to provide outcomes, median turnaround times for settlements, or the share of disputes where we implemented the decisions of the body.



Account Suspensions (Article 24.1(b))

During H2 2023, we did not impose any account suspensions pursuant to Article 23. Snap’s Trust & Safety team has procedures in place to limit the possibility of user accounts frequently submitting notices or complaints that are manifestly unfounded. These procedures include restricting duplicative report creation and the use of email filters to prevent users who have frequently submitted manifestly unfounded reports from continuing to do so. Snap takes appropriate enforcement action against accounts as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer, and information regarding the level of Snap’s account enforcement can be found in our Transparency Report (H2 2023). Such measures will continue to be reviewed and iterated on.


Moderator Resources, Expertise and Support (Article 42.2)

Our content moderation team operates across the globe, enabling us to help keep Snapchatters safe 24/7. Below, you will find the breakdown of our human moderation resources by the language specialities of moderators (note that some moderators specialise in multiple languages) as of 31 December 2023:

The above table includes all moderators who support EU member state languages as of 31 December 2023. In situations where we need additional language support, we use translation services.

Moderators are recruited using a standard job description that includes a language requirement (depending on the need). The language requirement states that the candidate should be able to demonstrate written and spoken fluency in the language and have at least one year of work experience for entry-level positions. Candidates must meet the educational and background requirements in order to be considered. Candidates also must demonstrate an understanding of current events for the country or region of content moderation they will support.

Our moderation team applies our policies and enforcement measures to help protect our Snapchat community. Training is conducted over a multi-week period, in which new team members are educated on Snap’s policies, tools, and escalation procedures. After the training, each moderator must pass a certification exam before being permitted to review content. Our moderation team regularly participates in refresher training relevant to their workflows, particularly when we encounter policy-borderline and context-dependent cases. We also run upskilling programmes, certification sessions, and quizzes to ensure all moderators are current and in compliance with all updated policies. Finally, when urgent content trends surface based on current events, we quickly disseminate policy clarifications so teams are able to respond according to Snap’s policies.

We provide our content moderation team – Snap’s “digital first responders” – with significant support and resources, including on-the-job wellness support and easy access to mental health services. 

Child Sexual Exploitation and Abuse (CSEA) Media Scanning Report


Background

The sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse (CSEA) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.


We use PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match to identify known illegal images and videos of child sexual abuse, respectively, and report them to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with domestic or international law enforcement, as required.
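
As a simplified illustration of hash-matching, the sketch below checks a media file's digest against a set of known hashes. PhotoDNA and CSAI Match rely on proprietary perceptual hashing and similarity matching; the exact SHA-256 matching used here is an assumption made only to keep the example short and self-contained.

```python
# Simplified, hypothetical sketch of hash-matching media against a list of
# known hashes. Real systems such as PhotoDNA and CSAI Match use proprietary
# perceptual hashing with similarity matching rather than exact digests.
import hashlib

KNOWN_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",  # placeholder
}

def media_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    # A match would trigger review and, where required, a report to NCMEC.
    return media_hash(data) in KNOWN_HASHES

print(is_known_match(b"example media bytes"))  # -> False
```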


Report

The data below is based on the results of proactive scanning, using PhotoDNA and/or CSAI Match, of media uploaded from a user’s camera roll to Snapchat.

Stopping child sexual exploitation is a top priority. Snap devotes significant resources toward this and has zero tolerance for such conduct.  Special training is required to review CSE appeals, and there is a limited team of agents who handle these reviews due to the graphic nature of the content.  During the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements, and we have addressed these inconsistencies through agent re-training and rigorous quality assurance. We expect that the next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcements.  

Content Moderation Safeguards

The safeguards applied for CSEA Media Scanning are set out in the above “Content Moderation Safeguards” section under our DSA Report.


EU DSA: Average Monthly Active Recipients (August 2024)
(DSA Articles 24.2 and 42.3)

As of 1 August 2024, we have 92.4 million average monthly active recipients (“AMAR”) of our Snapchat app in the EU. This means that, on average over the last six months, 92.4 million registered users in the EU have opened the Snapchat app at least once during a given month.

This figure breaks down by Member State as follows: