Transparency Report
January 1, 2022 – June 30, 2022

Released: November 29, 2022

Updated: November 29, 2022

To provide insight into Snap’s safety efforts and the nature and volume of content reported on our platform, we publish transparency reports twice a year. We are committed to making these reports ever more comprehensive and informative for the many stakeholders who care deeply about our content moderation and law enforcement practices, and the well-being of our community.

This report covers the first half of 2022 (January 1 - June 30). As with our previous reports, we share data about the global number of in-app content and account-level reports we received and enforced against across specific categories of violations; how we responded to requests from law enforcement and governments; and our enforcement actions broken down by country. It also captures recent additions to this report, including the Violative View Rate of Snapchat content, potential trademark violations, and instances of false information on the platform.

As part of our ongoing commitment toward improving our transparency reports, we are introducing several new elements to this report. For this installment and going forward, we are adding a glossary of the terms used throughout the report. Our goal is to provide increased transparency around such terms, clearly denoting what forms of violating content are included and enforced against under each category. For the first time, we are also introducing false information as a stand-alone category at the country level, building on our previous practice of reporting false information globally. 

Additionally, we are providing increased insight into our efforts to combat Child Sexual Exploitation and Abuse Imagery (CSEAI). Moving forward, we will share the total amount of CSEAI content that we enforced against by removing it, as well as the total number of CSEAI reports* (i.e., “CyberTips”) that we made to the U.S. National Center for Missing and Exploited Children (NCMEC).

For more information about our policies for combating online harms, and plans to continue evolving our reporting practices, please read our recent Safety & Impact blog about this transparency report. 

To find additional resources for safety and privacy on Snapchat, see our About Transparency Reporting tab at the bottom of the page.

Overview of Content and Account Violations

From January 1 - June 30, 2022, we enforced against 5,688,970 pieces of content globally that violated our policies. Enforcement actions include removing the offending content or terminating the account in question. 

During the reporting period, we saw a Violative View Rate (VVR) of 0.04 percent, which means that out of every 10,000 Snap and Story views on Snapchat, 4 contained content that violated our policies. 
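Expressed as a formula (a general form implied by the definition above; Snap’s exact calculation methodology is not detailed in this report):

$$\mathrm{VVR} \;=\; \frac{\text{views of violating Snaps and Stories}}{\text{total Snap and Story views}} \;=\; \frac{4}{10{,}000} \;=\; 0.04\%$$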

Expanded Violations

Combating Child Sexual Exploitation & Abuse

The sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse Imagery (CSEAI) on our platform is a top priority for us, and we continuously evolve our capabilities to combat these and other types of crimes.

Our Trust and Safety teams use active technology detection tools, such as PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match, to identify known illegal images and videos of child sexual abuse, respectively, and report them to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with domestic or international law enforcement, as required.
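The specifics of PhotoDNA and CSAI Match are proprietary, but the general hash-matching approach they embody can be sketched as follows. This is a minimal, hypothetical illustration only: the function names are assumptions, and the SHA-256 stand-in is used purely for readability, since real systems rely on robust perceptual hashes that tolerate resizing and re-encoding rather than cryptographic hashes.

```python
import hashlib

# Hypothetical sketch of hash-matching against a known-content hash list.
# Real tools (PhotoDNA, CSAI Match) use proprietary perceptual hashes; the
# names and SHA-256 stand-in below are illustrative assumptions only.

# A set of hashes of known illegal media (e.g., as sourced from industry
# hash-sharing programs; assumed for illustration).
KNOWN_ILLEGAL_HASHES: set[str] = set()


def media_hash(data: bytes) -> str:
    """Stand-in for a robust perceptual hash (SHA-256 for illustration only)."""
    return hashlib.sha256(data).hexdigest()


def matches_known_content(data: bytes) -> bool:
    """Return True if the media's hash appears in the known-content list."""
    return media_hash(data) in KNOWN_ILLEGAL_HASHES


if matches_known_content(b"...uploaded media bytes..."):
    # In a production pipeline, a match would trigger removal and a report
    # (e.g., a CyberTip to NCMEC), per the process described in this report.
    pass
```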

In the first half of 2022, we proactively detected and actioned 94 percent of the total child sexual exploitation and abuse violations reported here, a 6 percent increase from our prior report.

*Note that each submission to NCMEC can contain multiple pieces of content. The total number of individual pieces of media submitted to NCMEC is equal to our total content enforced.

Terrorist and Violent Extremist Content

During the reporting period, we removed 73 accounts for violations of our policy prohibiting terrorist and violent extremist content.

At Snap, we remove terrorist and violent extremist content identified through multiple channels. These include encouraging Snapchatters to report terrorist and violent extremist content through our in-app reporting menu, and working closely with law enforcement to address such content when it appears on Snapchat.

Self-harm and Suicide Content

We care deeply about the mental health and well-being of Snapchatters, which has informed, and continues to inform, our decisions to build Snapchat differently. As a platform designed for communication between real friends, we believe Snapchat can play a unique role in empowering friends to help each other through difficult moments.

When our Trust & Safety team recognizes a Snapchatter in distress, they can forward self-harm prevention and support resources, and notify emergency response personnel where appropriate. The resources we share appear on our global list of safety resources, which is publicly available to all Snapchatters.

Country Overview

This section provides an overview of the enforcement of our Community Guidelines in a sampling of geographic regions. Our Guidelines apply to all content on Snapchat—and all Snapchatters—across the globe, regardless of location.

Information for individual countries is available for download via the attached CSV file: