25 April 2024
To provide further insight into Snap’s safety efforts and the nature and volume of content reported on our platform, we publish this transparency report twice a year. We are committed to continuing to make these reports more comprehensive and informative for the safety and well-being of our community, and for the many stakeholders who care deeply about our content moderation and law enforcement practices.
This Transparency Report covers the second half of 2023 (1 July – 31 December). As with our previous reports, we share data about the global number of in-app content and account-level reports we received and enforced across specific categories of policy violations; how we responded to requests from law enforcement and governments; and our enforcement actions broken down by country.
As part of our ongoing commitment to continually improve our transparency reports, we are introducing a few new elements with this release.
First, we have expanded our main table to include reports and enforcement against content and accounts tied to both Terrorism & Violent Extremism and Child Sexual Exploitation & Abuse (CSEA). In previous reports, we had highlighted account deletions made in response to those violations in separate sections. We will continue to outline our proactive and reactive efforts against CSEA, as well as our reports to NCMEC, in a separate section.
Second, we have provided expanded information on appeals, outlining total appeals and reinstatements by Community Guidelines enforcements.
Finally, we have expanded our European Union section, providing increased insight into Snap’s EU activities. Specifically, we are publishing our most recent DSA Transparency Report and additional metrics regarding our CSEA media scanning.
For more information about our policies for combating online harms, and plans to continue evolving our reporting practices, please read our recent Safety & Impact blog about this Transparency Report. To find additional safety and privacy resources on Snapchat, see our About Transparency Reporting tab at the bottom of the page.
Please note that the most up-to-date version of this Transparency Report can be found in the en-US locale.
Overview of Content and Account Violations
From 1 July – 31 December 2023, Snap enforced against 5,376,714 pieces of content globally that were reported to us and violated our Community Guidelines.
During the reporting period, we saw a Violative View Rate (VVR) of 0.01 percent, which means that out of every 10,000 Snap and Story views on Snapchat, 1 contained content found to violate our policies. The median turnaround time to enforce reported content was ~10 minutes.
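For illustration only, the sketch below shows how a violative view rate of this kind could be computed from raw view counts. The variable names and figures are hypothetical placeholders, not Snap’s internal methodology or data.

```python
# Hypothetical illustration of a Violative View Rate (VVR) calculation.
# The counts below are made-up placeholders, not actual Snap data.

violative_views = 1_000        # views of content later found to violate policy
total_views = 10_000_000       # total Snap and Story views in the period

vvr = violative_views / total_views * 100   # expressed as a percentage
print(f"VVR: {vvr:.2f}%")                   # 0.01% -> roughly 1 in every 10,000 views
```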
Analysis of Content and Account Violations
Our overall reporting and enforcement rates remained broadly consistent with the previous six months, though this cycle we saw an approximate 10% increase in total content and account reports.
The Israel-Hamas conflict began during this period, and as a result we saw an uptick in violative content. Total reports related to hate speech increased by ~61%, while total content enforcements of hate speech increased by ~97% and unique account enforcements increased by ~124%. Terrorism & Violent Extremism reports and enforcements also increased, though they comprise <0.1% of the total content enforcements on our platform. Our Trust & Safety teams remain vigilant as global conflicts arise in order to help keep Snapchat safe. We have also expanded our transparency report to include more information at a global and country level regarding the total reports, content enforced, and unique accounts enforced for violations of our Terrorism & Violent Extremism policy.