Our Transparency Report for the first half of 2022

29 November 2022

Today, we are releasing our latest transparency report, which covers the first half of 2022. 
At Snap, the safety and well-being of our community is our top priority, and our bi-annual transparency reports are an essential tool we use to share key information and hold ourselves accountable. 
Since our first transparency report in 2015, we've been on a mission to make each report more informative, digestible and effective than the last. In our latest report, we made various additions and improvements to help our community better understand our reporting, and build on our commitment to making these reports more comprehensive and informative. 
Making false information data available at the country level
For the first time, we are introducing “False Information” as a stand-alone category available at the country level, building on our previous practice of reporting false information globally. We are one of the only platforms to provide this information by country. This past half-year, we enforced against a total of 4,877 pieces of false or misleading content as potentially harmful or malicious. We have always taken a different approach to preventing the spread of false information on Snapchat, starting with the design of our platform. Across Snapchat, we don't allow unvetted content to go viral, and when we find content that violates our Community Guidelines, our policy is to take it down, immediately reducing the risk of it being shared more widely. Our approach to enforcing against content that includes false information is similarly straightforward: We remove it. 
With the recent US mid-term elections and other elections happening globally, we believe detailed, country-specific data about our enforcement against false information is valuable. You can read more about how we prevent the spread of false information on Snapchat here. 
Combating child sexual exploitation & abuse 
The sexual exploitation of any member of our community, especially minors, is illegal, abhorrent and prohibited by our Community Guidelines. Preventing, detecting and eradicating Child Sexual Exploitation and Abuse Imagery (CSEAI) on our platform is a top priority for us, and we are continuously evolving our capabilities to help combat this type of abuse on our platform. In the first half of 2022, we proactively detected and actioned 94% of the total child sexual exploitation and abuse violations in this report – a 6% increase since our previous report.
We are also providing updated language and increased insight into our efforts to combat CSEAI. We are now sharing the total amount of CSEAI content that we removed, as well as the total number of CSEAI reports that our Trust and Safety teams made to the US National Center for Missing and Exploited Children (NCMEC).
Introducing a Policy & Data Definitions Glossary
We have added a Policy & Data Definitions Glossary, which will be included in all reports going forward. Our goal with this glossary is to provide increased transparency around the terms and metrics we use, clearly outlining what forms of violating content are included and enforced against under each category. For example, if readers aren't sure what we mean by “Threats and Violence”, “Hate Speech”, “Other Regulated Goods” or other content categories, they can easily refer to the glossary for a description.
Proactively removing violating content 
When looking at the data in the report, it's also important to note that the figures for total reports and enforcement only count content that is reported to us. They do not count instances where Snap proactively detected and took action against content before it was reported to us. We believe that the improvements we've made to our proactive detection efforts played a large role in the decrease in total reports, enforcement numbers and turnaround times in key categories since our latest report. Because our enhanced, automated-detection tools identified and removed content before it had a chance to reach Snapchatters, we saw a decrease in reactive content enforcements (i.e., enforcements arising from reports by Snapchatters). 
Specifically, since our last report, we saw a 44% decrease in threatening and violent content enforcements on reports from Snapchatters, as well as a 37% decrease in drug content enforcements and a 34% decrease in hate speech content enforcements. Our median turnaround time for removing violating content has improved 33% since the last half year, to just over one minute.
While Snapchat has evolved over the years, our commitment to transparency and prioritising the safety and well-being of our community remains unchanged. We will continue to hold ourselves accountable and communicate updates on our progress.