Transparency Report
July 1, 2023 – December 31, 2023

Released: April 25, 2024

Updated: April 25, 2024

To provide further insight into Snap’s safety efforts and into the nature and volume of content reported on our platform, we publish this transparency report twice a year. We are committed to making these reports more comprehensive and informative for the safety and well-being of our community, and for the many stakeholders who care deeply about our content moderation and law enforcement practices.

This Transparency Report covers the second half of 2023 (July 1 - December 31). As with our previous reports, we share data about the global number of in-app content and account-level reports we received and enforced across specific categories of policy violations; how we responded to requests from law enforcement and governments; and our enforcement actions broken down by country.

As part of our ongoing commitment to continually improve our transparency reports, we are introducing a few new elements with this release. 

First, we have expanded our main table to include reports and enforcement against content and accounts tied to both Terrorism & Violent Extremism and Child Sexual Exploitation & Abuse (CSEA). In previous reports, we had highlighted account deletions made in response to those violations in separate sections. We will continue to outline our proactive and reactive efforts against CSEA, as well as our reports to NCMEC, in a separate section. 

Second, we have provided expanded information on appeals, outlining total appeals and reinstatements across Community Guidelines enforcement categories.

Finally, we have expanded our European Union section, providing increased insight into Snap’s EU activities. Specifically, we are publishing our most recent DSA Transparency Report and additional metrics regarding our CSEA media scanning.

For more information about our policies for combating online harms, and plans to continue evolving our reporting practices, please read our recent Safety & Impact blog about this Transparency Report. To find additional safety and privacy resources on Snapchat, see our About Transparency Reporting tab at the bottom of the page.

Please note that the most up-to-date version of this Transparency Report can be found in the en-US locale.

Overview of Content and Account Violations

From July 1 - December 31, 2023, Snap enforced against 5,376,714 pieces of content globally that were reported to us and violated our Community Guidelines.

During the reporting period, we saw a Violative View Rate (VVR) of 0.01 percent, which means that out of every 10,000 Snap and Story views on Snapchat, 1 contained content found to violate our policies. The median turnaround time to enforce reported content was ~10 minutes.
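For readers who want the arithmetic behind that figure, the equation below is a sketch of how a violative view rate of this kind is generally expressed; it illustrates the stated definition only and is not Snap’s exact counting methodology.

$$\mathrm{VVR} = \frac{\text{violative Snap and Story views}}{\text{total Snap and Story views}} \times 100\%, \qquad 0.01\% = \frac{1}{10{,}000}$$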

Analysis of Content and Account Violations

Our overall reporting and enforcement rates remained fairly similar to those of the previous six months. This cycle, we saw an approximate 10% increase in total content and account reports.

The Israel-Hamas conflict began during this period, and as a result we saw an uptick in violative content. Total reports related to hate speech increased by ~61%, while total content enforcements of hate speech increased by ~97% and unique account enforcements increased by ~124%. Terrorism & Violent Extremism reports and enforcements also increased, though they comprise <0.1% of the total content enforcements on our platform. Our Trust & Safety teams remain vigilant as global conflicts arise in order to help keep Snapchat safe. We have also expanded our transparency report to include more information at the global and country level regarding the total reports, content enforced, and unique accounts enforced for violations of our Terrorism & Violent Extremism policy.

Combating Child Sexual Exploitation & Abuse

Sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse (CSEA) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.

We use active technology detection tools, such as PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match, to identify known illegal images and videos of child sexual abuse, respectively, and report them to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with domestic or international law enforcement, as required.
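As a rough illustration of how hash-matching detection of known imagery works in general, the minimal Python sketch below compares media digests against a set of previously catalogued hashes. It does not reflect Snap’s systems: PhotoDNA and CSAI Match use proprietary perceptual-hashing algorithms that tolerate re-encoding and minor edits, whereas the exact-match stand-in hash below is used only to keep the example self-contained.

import hashlib
from typing import Iterable

def media_digest(media: bytes) -> str:
    """Stand-in for a robust/perceptual hash of an image or video frame."""
    return hashlib.sha256(media).hexdigest()

def find_known_matches(media_items: Iterable[bytes],
                       known_hashes: set[str]) -> list[int]:
    """Return indices of items whose digest appears in the known-hash set,
    e.g. so they can be queued for human review and mandatory reporting."""
    return [i for i, media in enumerate(media_items)
            if media_digest(media) in known_hashes]

# Example: two items, one of which matches a previously catalogued hash.
known = {media_digest(b"previously-catalogued-item")}
print(find_known_matches([b"benign-item", b"previously-catalogued-item"], known))  # prints [1]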

In the second half of 2023, we proactively detected and actioned 59% of the total child sexual exploitation and abuse violations reported. This represents a 39% decrease from the previous period, driven by enhancements to Snapchatters’ reporting options that increased our visibility into potential CSEA sent on Snapchat.

*Note that each submission to NCMEC can contain multiple pieces of content. The total number of individual pieces of media submitted to NCMEC is equal to our total content enforced. We have also excluded retracted submissions to NCMEC from this number.

Self-harm and Suicide Content

We care deeply about the mental health and well-being of Snapchatters, which continues to inform our decisions to build Snapchat differently. As a platform designed for communications between and among real friends, we believe Snapchat can play a unique role in empowering friends to help each other in difficult times.

When our Trust & Safety team becomes aware of a Snapchatter in distress, they can forward self-harm prevention and support resources, and notify emergency response personnel when appropriate. The resources that we share are available on our global list of safety resources, and are publicly available to all Snapchatters.

Appeals

In our previous report, we introduced metrics on appeals, where we highlighted the number of times users asked us to re-review our initial moderation decision against their account. In this report, we have expanded our appeals metrics to capture the full range of our policy categories for account-level violations.

* Stopping the spread of content or activity related to child sexual exploitation is a top priority. Snap devotes significant resources toward this goal and has zero tolerance for such conduct. Special training is required to review CSE appeals, and there is a limited team of agents that handles these reviews due to the graphic nature of the content. In the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements; we have addressed these inconsistencies through agent re-training and quality assurance. We expect that Snap’s next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcement actions.

Ads Moderation

Snap is committed to ensuring that all ads are fully compliant with our advertising policies. We believe in a responsible and respectful approach to advertising, creating a safe and enjoyable experience for all of our users. Below we have included insight into our moderation for paid advertisements on Snapchat. Note that ads on Snapchat can be removed for a variety of reasons as outlined in Snap’s Advertising Policies, including deceptive content, adult content, violent or disturbing content, hate speech, and intellectual property infringement. Additionally, you can now find Snapchat’s Ads Gallery in the navigation bar of this transparency report.

Regional & Country Overview

This section provides an overview of the enforcement of our Community Guidelines in a sampling of geographic regions. Our Guidelines apply to all content on Snapchat—and all Snapchatters—across the globe, regardless of location.

Information for individual countries, including all EU Member States, is available for download via the attached CSV file.