Transparency Report
January 1, 2023 – June 30, 2023

Released: October 25, 2023
Updated: December 13, 2023

To provide insight into Snap’s safety efforts and the nature and volume of content reported on our platform, we publish transparency reports twice a year. We are committed to making these reports ever more comprehensive and informative for the many stakeholders who care deeply about our content moderation and law enforcement practices, as well as the well-being of our community.

This Transparency Report covers the first half of 2023 (January 1 – June 30). As with our previous reports, we share data about the global number of in-app content and account-level reports we received and enforced against across specific categories of policy violations; how we responded to requests from law enforcement and governments; and our enforcement actions broken down by country.

As part of our ongoing commitment to continually improve our transparency reports, we are introducing a few new elements with this release. We have added additional data points around our advertising practices and moderation, as well as content and account appeals. In alignment with the EU Digital Services Act, we’ve also added new contextual information around our operations in EU Member States, such as the number of content moderators and monthly active users (MAUs) in the region. Much of this information can be found throughout the report, and in our Transparency Center’s dedicated European Union page.

Finally, we have updated our Glossary with links to our Community Guidelines explainers, which provide additional context around our platform policy and operational efforts. 

For more information about our policies for combating online harms, and plans to continue evolving our reporting practices, please read our recent Safety & Impact blog about this Transparency Report. 

To find additional resources for safety and privacy on Snapchat, see our About Transparency Reporting tab at the bottom of the page.

Please note that the most up-to-date version of this Transparency Report can be found in the en-US locale.

Overview of Content and Account Violations

From January 1 - June 30, 2023, Snap enforced against 6,216,118 pieces of content globally that violated our policies.

During the reporting period, we saw a Violative View Rate (VVR) of 0.02 percent, which means that out of every 10,000 Snap and Story views on Snapchat, 2 contained content found to violate our policies.
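For context, VVR is a simple ratio of violating views to total views. The sketch below (in Python) shows the arithmetic; the function name and counts are illustrative, not Snap’s internal tooling.

```python
# Minimal sketch of the Violative View Rate (VVR) calculation described
# above. Variable names and counts are illustrative, not Snap's pipeline.

def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Return the share of views that contained violating content."""
    return violative_views / total_views

# Example: 2 violating views out of every 10,000 Snap and Story views
# corresponds to the 0.02% VVR reported for this period.
vvr = violative_view_rate(2, 10_000)
print(f"VVR: {vvr:.2%}")  # -> VVR: 0.02%
```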

*Correctly and consistently enforcing against false information is a dynamic process that requires up-to-date context and diligence. As we strive to continually improve the precision of our agents’ enforcement in this category, we have chosen, since H1 2022, to report estimates in the “Content Enforced” and “Unique Accounts Enforced” categories. Specifically, we sample a statistically significant portion of false information enforcements in each country, quality-check the enforcement decisions, and use the quality-checked results to derive enforcement rates with a 95% confidence interval (+/- 5% margin of error). Those rates are then used to calculate the false information enforcement figures reported in this Transparency Report.
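As a rough illustration of this kind of sample-based estimation, the sketch below uses a standard normal-approximation confidence interval for a proportion. It is a textbook construction under assumed numbers, not Snap’s internal methodology.

```python
import math

# Illustrative sketch: estimate an enforcement accuracy rate from a
# QA-reviewed sample, with a 95% confidence interval. All inputs are
# hypothetical.

Z_95 = 1.96  # z-score for a 95% confidence level

def required_sample_size(margin_of_error: float, p: float = 0.5) -> int:
    """Worst-case sample size for a proportion at 95% confidence."""
    return math.ceil((Z_95 ** 2) * p * (1 - p) / margin_of_error ** 2)

def enforcement_rate_ci(correct: int, sampled: int) -> tuple[float, float, float]:
    """Point estimate and 95% CI bounds for the share of correct enforcements."""
    p_hat = correct / sampled
    moe = Z_95 * math.sqrt(p_hat * (1 - p_hat) / sampled)
    return p_hat, p_hat - moe, p_hat + moe

print(required_sample_size(0.05))     # -> 385 reviews (worst case, p = 0.5)
print(enforcement_rate_ci(350, 385))  # hypothetical QA outcome
```

At a +/- 5% margin of error and 95% confidence, the worst-case (p = 0.5) sample works out to roughly 385 quality-checked enforcements per country.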

Analysis of Content and Account Violations

Our overall reporting and enforcement rates remained broadly in line with those of the previous six months, with a few exceptions in key categories. Total content and account reports and enforcements decreased by approximately 3% this cycle.

The categories with the most notable fluctuations were Harassment & Bullying, Spam, Weapons, and False Information. Harassment & Bullying saw a ~56% increase in total reports, and a corresponding ~39% increase in content and unique account enforcements. These increases in enforcements were coupled with a ~46% decrease in turnaround time, highlighting the operational efficiencies our teams achieved in enforcing against this type of violative content. Similarly, we saw a ~65% increase in total reports for Spam, with a ~110% increase in content enforcements and ~80% increase in unique accounts enforced, while our teams also reduced turnaround time by ~80%. Our Weapons category saw a ~13% decrease in total reports, a ~51% decrease in content enforcements, and a ~53% reduction in unique accounts enforced. Lastly, our False Information category saw a ~14% increase in total reports, but a ~78% decrease in content enforcements and ~74% decrease in unique accounts enforced. This can be attributed to the continued Quality Assurance (QA) process and resourcing we apply to false information reports, to ensure that our moderation teams are accurately catching and actioning false information on the platform.

Overall, while we saw generally similar figures to the last period, we believe it is important to continue improving the tools our community uses to actively and accurately report potential violations as they appear on the platform.

Combating Child Sexual Exploitation & Abuse

The sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse Imagery (CSEAI) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.

We use active technology detection tools, such as PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match, to identify known illegal images and videos of child sexual abuse, respectively, and report them to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with domestic or international law enforcement as required.
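At a high level, hash-matching compares a fingerprint of uploaded media against a database of fingerprints of known illegal content. The sketch below illustrates only that general pattern; PhotoDNA and CSAI Match are proprietary perceptual-hashing systems that tolerate resizing and re-encoding, whereas the cryptographic hash used here matches byte-identical files only. All names and hash values are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal media.
KNOWN_ILLEGAL_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def media_hash(data: bytes) -> str:
    """Hash the raw media bytes (a stand-in for a robust perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """Flag media whose fingerprint appears in the known-content database.

    In a real pipeline, a match would be queued for review and, where
    confirmed, reported to NCMEC as described above.
    """
    return media_hash(data) in KNOWN_ILLEGAL_HASHES
```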

In the first half of 2023, we proactively detected and actioned 98 percent of the total child sexual exploitation and abuse violations reported here — a 4% increase from the previous period.

**Note that each submission to NCMEC can contain multiple pieces of content. The total number of individual pieces of media submitted to NCMEC equals our total content enforced.

Terrorist and Violent Extremist Content

During the reporting period (January 1, 2023 – June 30, 2023), we removed 18 accounts for violations of our policy prohibiting terrorist and violent extremist content.

At Snap, we remove terrorist and violent extremist content reported through multiple channels. We encourage users to report such content through our in-app reporting menu, and we work closely with law enforcement to address it when it appears on Snap.

Self-harm and Suicide Content

We care deeply about the mental health and well-being of Snapchatters, which has informed – and continues to inform – our decisions to build Snapchat differently. As a platform designed for communications between real friends, we believe Snapchat can play a unique role in empowering friends to help each other through difficult times.

When our Trust & Safety team recognizes a Snapchatter in distress, they can forward self-harm prevention and support resources, and notify emergency response personnel when appropriate. The resources we share are available on our global list of safety resources, which is publicly available to all Snapchatters.

Appeals

Beginning with this report, we are including the number of appeals from users whose accounts were locked for violations of our policies. We only reinstate accounts that our moderators determine were incorrectly locked. For this period, we are reporting appeals relating to drug content; in our next report, we look forward to releasing more data on appeals stemming from other policy violations.

Ads Moderation

Snap is committed to ensuring that all ads comply fully with our platform policies. We believe in a responsible and respectful approach to advertising that creates a safe and enjoyable experience for all of our users. Below we provide insight into our ads moderation. Note that ads on Snapchat can be removed for a variety of reasons outlined in Snap’s Advertising Policies, including deceptive content, adult content, violent or disturbing content, hate speech, and intellectual property infringement. Additionally, you can now find Snapchat’s Ads Gallery in the navigation bar of this transparency report.

Country Overview

This section provides an overview of the enforcement of our Community Guidelines in a sampling of geographic regions. Our Guidelines apply to all content on Snapchat—and all Snapchatters—across the globe, regardless of location.

Information for individual countries is available for download via the attached CSV file:
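As a hypothetical example of working with that per-country data once downloaded, the sketch below reads a CSV with Python’s standard library; the filename and column names are assumptions and may not match the attached file.

```python
import csv

# Hypothetical filename and columns; adjust to match the downloaded file.
with open("country_overview_h1_2023.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        print(row["Country"], row["Content Enforced"])
```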