Transparency Report

July 1, 2022 – December 31, 2022

Released: June 20, 2023

Updated: June 20, 2023

To provide insight into Snap’s safety efforts and the nature and volume of content reported on our platform, we publish transparency reports twice a year. We are committed to continuing to make these reports more comprehensive and informative to the many stakeholders who care deeply about our content moderation and law enforcement practices, as well as the well-being of our community.

This report covers the second half of 2022 (July 1 - December 31). As with our previous reports, we share data about the global number of in-app content and account-level reports we received and enforced against across specific categories of policy violations; how we responded to requests from law enforcement and governments; and our enforcement actions broken down by country. It also captures recent additions to this report, including the Violative View Rate of Snapchat content, potential trademark violations, and instances of false information on the platform.

As part of our ongoing commitment to continually improve our transparency reports, we are introducing a few new elements with this release. We have added a section labeled “Analysis of Content and Account Violations” wherein we assess major data changes relative to our previous reporting period. 

In addition, we have updated how we present data in our content and account violations tables, both on the landing page and our country sub-pages. Previously, we ordered violations from most to least content enforcements. To improve consistency, our ordering now mirrors our Community Guidelines. This came at the suggestion of Snap’s Safety Advisory Board, which independently educates, challenges, raises issues to, and advises Snap on how to help keep the Snapchat community safe.

Finally, we have updated our Glossary with links to our Community Guidelines Explainers, which provide additional context around our platform policy and operational efforts. 

For more information about our policies for combating online harms, and plans to continue evolving our reporting practices, please read our recent Safety & Impact blog about this transparency report. 

To find additional resources for safety and privacy on Snapchat, see our About Transparency Reporting tab at the bottom of the page.

Overview of Content and Account Violations

From July 1 – December 31, 2022, we enforced against 5,688,970 pieces of content globally that violated our policies. Enforcement actions include removing the offending content or terminating the account in question.

During the reporting period, we saw a Violative View Rate (VVR) of 0.04 percent, which means that out of every 10,000 Snap and Story views on Snapchat, 4 contained content that violated our policies.
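For illustration, the VVR arithmetic reduces to a simple ratio of violating views to total views; this minimal sketch uses only the figures stated above:

```python
def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Return the fraction of views that contained policy-violating content."""
    if total_views <= 0:
        raise ValueError("total_views must be positive")
    return violative_views / total_views

# 4 violating views per 10,000 views corresponds to a 0.04% VVR.
rate = violative_view_rate(4, 10_000)
print(f"{rate:.2%}")  # 0.04%
```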

*Correctly and consistently enforcing against false information is a dynamic process that requires up-to-date context and diligence.  As we strive to continually improve the precision of our agents’ enforcement in this category, we have chosen, since H1 2022, to report figures in the "Content Enforced" and "Unique Accounts Enforced" categories that are estimated based on a rigorous quality-assurance review of a statistically significant portion of false information enforcements.  Specifically, we sample a statistically significant portion of false information enforcements across each country and quality-check the enforcement decisions.  We then use those quality-checked enforcements to derive enforcement rates with a 95% confidence interval (+/- 5% margin of error), which we use to calculate the false information enforcements reported in the Transparency Report.  
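The sampling approach described above can be sketched as follows. This is a simplified illustration, not Snap’s actual methodology: the sample figures are hypothetical, and the margin of error uses the standard normal approximation for a proportion at 95% confidence (z ≈ 1.96).

```python
import math

def estimate_enforcements(sample_upheld: int, sample_size: int,
                          total_enforced: int, z: float = 1.96):
    """Estimate how many of `total_enforced` actions were correct,
    based on a quality-assurance review of a random sample.

    Returns (point_estimate, margin) where the true count is expected
    to fall within point_estimate +/- margin at ~95% confidence.
    """
    p = sample_upheld / sample_size  # observed correct-enforcement rate
    # Normal-approximation margin of error for a sample proportion.
    moe = z * math.sqrt(p * (1 - p) / sample_size)
    return total_enforced * p, total_enforced * moe

# Hypothetical numbers: 360 of 400 sampled enforcements upheld on review,
# projected across 10,000 total enforcements.
point, margin = estimate_enforcements(360, 400, 10_000)
```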

Analysis of Content and Account Violations


Combating Child Sexual Exploitation & Abuse

The sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse Imagery (CSEAI) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.

Our Trust & Safety team uses active technology detection tools, such as PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match, to identify known illegal images and videos of child sexual abuse, respectively, and report them to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with domestic or international law enforcement, as required.
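At a high level, hash-matching compares a fingerprint of incoming media against a database of fingerprints of known illegal material. The sketch below is a deliberately simplified stand-in: PhotoDNA and CSAI Match are proprietary *perceptual*-hashing systems that tolerate re-encoding and resizing, whereas this example uses an exact cryptographic hash, and the hash database is hypothetical; only the overall matching flow is illustrated.

```python
import hashlib

# Hypothetical database of fingerprints of known flagged media.
# Real systems use perceptual hashes maintained by clearinghouses,
# not SHA-256 digests computed locally like this.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-file").hexdigest(),
}

def matches_known(media_bytes: bytes) -> bool:
    """Return True if the media's fingerprint appears in the known-hash set."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    return digest in KNOWN_HASHES
```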

In the second half of 2022, we proactively detected and actioned 94 percent of the total child sexual exploitation and abuse violations reported here.

**Note that each submission to NCMEC can contain multiple pieces of content. The total individual pieces of media submitted to NCMEC is equal to our total content enforced.

Terrorist and Violent Extremist Content

During the reporting period, we removed 132 accounts for violations of our policy prohibiting terrorist and violent extremist content.

At Snap, we remove terrorist and violent extremist content reported through multiple channels. We encourage users to report such content through our in-app reporting menu, and we work closely with law enforcement to address terrorist and violent extremist content that may appear on Snap.

Self-harm and Suicide Content

We care deeply about the mental health and well-being of Snapchatters, which has informed, and will continue to inform, our decisions to build Snapchat differently. As a platform designed for communications among real friends, we believe Snapchat can play a unique role in empowering friends to help one another through difficult moments.

When our Trust & Safety team becomes aware of a Snapchatter in distress, they can forward self-harm prevention and support resources, and notify emergency response personnel when appropriate. The resources Snap shares are available on our global list of safety resources, which is publicly accessible to all Snapchatters.

Country Overview

This section provides an overview of the enforcement of our Community Guidelines in a sampling of geographic regions. Our Guidelines apply to all content on Snapchat, and to all Snapchatters, across the globe, regardless of location.

Information for individual countries is available for download via the attached CSV file.