To provide insight into Snap’s safety efforts and the nature and volume of content reported on our platform, we publish transparency reports twice a year. We are committed to continuing to make these reports more comprehensive and informative to the many stakeholders who care deeply about our content moderation and law enforcement practices, as well as the well-being of our community.
This Transparency Report covers the first half of 2023 (1 January - 30 June). As with our previous reports, we share data about the global number of in-app content and account-level reports we received and enforced against across specific categories of policy violations; how we responded to requests from law enforcement and governments; and our enforcement actions broken down by country.
As part of our ongoing commitment to continually improve our transparency reports, we are introducing a few new elements with this release. We have added additional data points around our advertising practices and moderation, as well as content and account appeals. In alignment with the EU Digital Services Act, we’ve also added new contextual information around our operations in EU Member States, such as the number of content moderators and monthly active users (MAUs) in the region. Much of this information can be found throughout the report, and in our Transparency Centre's dedicated European Union page.
Finally, we have updated our Glossary with links to our Community Guidelines explainers, which provide additional context around our platform policy and operational efforts.
For more information about our policies for combating online harms, and plans to continue evolving our reporting practices, please read our recent Safety & Impact blog about this Transparency Report.
To find additional resources for safety and privacy on Snapchat, see our About Transparency Reporting tab at the bottom of the page.
Please note that the most up-to-date version of this Transparency Report can be found in the en-US locale.
Overview of Content and Account Violations
From 1 January - 30 June 2023, Snap enforced against 6,216,118 pieces of content globally that violated our policies.
During the reporting period, we saw a Violative View Rate (VVR) of 0.02 percent, which means that out of every 10,000 Snap and Story views on Snapchat, 2 contained content found to violate our policies.
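The relationship between a VVR percentage and the per-10,000-views figure is simple arithmetic; the sketch below illustrates the conversion with hypothetical view counts (not Snap's actual data):

```python
def violative_view_rate(violating_views: int, total_views: int) -> float:
    """Violative View Rate: the share of views that contained
    content found to violate policy, expressed as a percentage."""
    return violating_views / total_views * 100

# Hypothetical counts chosen to match the reported 0.02% rate:
# 2 violating views out of every 10,000 total views.
rate = violative_view_rate(2, 10_000)
print(f"VVR: {rate:.2f}%")
```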
*Correctly and consistently enforcing against false information is a dynamic process that requires up-to-date context and diligence. As we strive to continually improve the precision of our agents’ enforcement in this category, we have chosen, since H1 2022, to report figures in the "Content Enforced" and "Unique Accounts Enforced" categories that are estimated based on a rigorous quality-assurance review of a statistically significant portion of false information enforcements. Specifically, we sample a statistically significant portion of false information enforcements across each country and quality-check the enforcement decisions. We then use those quality-checked enforcements to derive enforcement rates with a 95% confidence interval (+/- 5% margin of error), which we use to calculate the false information enforcements reported in the Transparency Report.
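As a minimal sketch of how a sample-based estimate of this kind can work: a quality-checked sample yields an enforcement rate with a confidence interval, and that rate is then applied to the raw counts. The sample sizes below and the normal-approximation interval are our own illustrative assumptions, not Snap's actual methodology or data:

```python
import math

def estimate_enforcement_rate(sample_confirmed: int, sample_size: int,
                              z: float = 1.96) -> tuple[float, tuple[float, float]]:
    """Estimate a true enforcement rate from a quality-checked sample,
    with a normal-approximation 95% confidence interval (z = 1.96)."""
    p = sample_confirmed / sample_size
    margin = z * math.sqrt(p * (1 - p) / sample_size)
    return p, (max(0.0, p - margin), min(1.0, p + margin))

# Hypothetical: 384 sampled enforcements, 330 confirmed correct on review.
# (A sample of ~385 is what a 95% CI with a +/-5% margin of error implies
# in the worst case, via n = (1.96 / 0.05)**2 * 0.25.)
rate, (lo, hi) = estimate_enforcement_rate(330, 384)

# Scale the raw enforcement count by the quality-checked rate
# to produce the estimated figure reported.
raw_enforcements = 10_000
estimated = round(raw_enforcements * rate)
```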
Analysis of Content and Account Violations
Our overall reporting and enforcement rates remained broadly consistent with the previous six months, with a few exceptions in key categories. We saw an approximately 3% decrease in total content and account reports and enforcements this cycle.
The categories with the most notable fluctuations were Harassment & Bullying, Spam, Weapons, and False Information. Harassment & Bullying saw a ~56% increase in total reports and a corresponding ~39% increase in content and unique account enforcements. These enforcement increases were coupled with a ~46% decrease in turnaround time, highlighting the operational efficiencies our teams gained in enforcing against this type of violating content. Similarly, we saw a ~65% increase in total reports for Spam, with a ~110% increase in content enforcements and a ~80% increase in unique accounts enforced, while our teams also reduced turnaround time by ~80%. Our Weapons category saw a ~13% decrease in total reports, a ~51% decrease in content enforcements, and a ~53% reduction in unique accounts enforced. Lastly, our False Information category saw a ~14% increase in total reports but a ~78% decrease in content enforcements and a ~74% decrease in unique accounts enforced. This can be attributed to the continued Quality Assurance (QA) process and resourcing we apply to false information reports, which help ensure that our moderation teams accurately catch and action false information on the platform.
Overall, while the figures were broadly similar to those in the last period, we believe it is important to continue improving the tools our community uses to actively and accurately report potential violations as they appear on the platform.