Our Transparency Report for the First Half of 2021

November 22, 2021

Today, we’re releasing our transparency report for the first half of 2021, which covers the period of January 1 - June 30 of this year. As with recent reports, this installment shares data about violations of our Community Guidelines globally during the period; the number of content reports we received and enforced against, broken down by category of violation; how we responded to requests from law enforcement and governments; our enforcements broken down by country; the Violative View Rate of Snapchat content; and the incidence of false information on the platform.

We’re adding several updates to our reporting this period, including reporting our median turnaround time in minutes rather than hours, to provide more detail about our operational practices and efficacy.

Every day, on average, more than five billion Snaps are created using our Snapchat camera. From January 1 - June 30, 2021, we enforced against 6,629,165 pieces of content globally that violated our Guidelines. During this period, our Violative View Rate (VVR) was 0.10 percent, which means that out of every 10,000 views of content on Snapchat, 10 contained content that violated our Guidelines. Additionally, we significantly improved our response times to reports of violations, particularly for sexually explicit content, harassment and bullying, illegal and counterfeit drugs, and other regulated goods.
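For readers who want the arithmetic behind that figure spelled out, here is a minimal sketch; it simply restates the reported 0.10 percent rate and adds no new data.

```python
# Violative View Rate (VVR): the share of content views that contained
# content violating our Community Guidelines.
views_sampled = 10_000           # a sample of content views
vvr = 0.10 / 100                 # 0.10 percent, as reported for H1 2021
violative_views = views_sampled * vvr
print(violative_views)           # 10.0, i.e. about 10 in every 10,000 views
```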

Our Work To Combat Child Sexual Abuse Material 

The safety of our community is a top priority. As a platform built for communicating with real friends, we intentionally designed Snapchat to make it harder for strangers to find young people. For example, Snapchatters cannot see each other’s friend lists and, by default, cannot receive a message from someone who isn’t already a friend.

We have zero tolerance for abuse directed at any member of our community, especially minors; such abuse is illegal, unacceptable, and prohibited by our Community Guidelines. We work diligently to combat these violations by evolving our capabilities to prevent, detect, and eradicate abuse on our platform, including Child Sexual Abuse Material (CSAM) and other types of child sexually exploitative content.

Our Trust and Safety teams use proactive detection tools, such as PhotoDNA and Child Sexual Abuse Imagery (CSAI) Match technology, to identify known illegal images and videos of CSAM and report them to the National Center for Missing and Exploited Children (NCMEC). NCMEC, in turn, coordinates with domestic or international law enforcement.
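PhotoDNA and CSAI Match are proprietary perceptual-hashing systems, so the sketch below illustrates only the general hash-matching workflow: compute a fingerprint of uploaded media, check it against a database of fingerprints of previously identified material, and escalate any match. It substitutes a cryptographic hash for a perceptual one to stay self-contained, and `known_hashes` and `report_to_ncmec` are invented placeholders, not real APIs.

```python
import hashlib

# Minimal sketch of hash-based matching against a database of known material.
# Real systems (e.g. PhotoDNA) use robust perceptual hashes that survive
# re-encoding and resizing; SHA-256 is used here only to keep the example
# self-contained. All names and values below are illustrative placeholders.

known_hashes: set[str] = {
    # Digests of previously identified material, supplied by clearinghouses.
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def report_to_ncmec(digest: str) -> None:
    # Placeholder for filing a report with NCMEC for law-enforcement follow-up.
    print(f"Escalating match {digest[:12]}... for review and NCMEC reporting")

def scan_upload(media_bytes: bytes) -> bool:
    """Return True and escalate if the upload matches known illegal material."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    if digest in known_hashes:
        report_to_ncmec(digest)
        return True
    return False
```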

In the first half of 2021, 5.43 percent of the total number of accounts we enforced against globally contained CSAM. Of these, we proactively detected and actioned 70 percent. This increased proactive detection capability, combined with a rise in coordinated spam attacks spreading CSAM, resulted in a notable increase in this category for this reporting period.

We have continued to expand our partnerships with safety experts as well as our in-app features to help educate Snapchatters about the risks of contact with strangers and how to use in-app reporting to alert our Trust and Safety teams to any type of concern or abuse. Additionally, we have continued to add partners to our trusted flagger program, which provides vetted safety experts with a confidential channel to report emergency escalations, such as an imminent threat to life or a case involving CSAM. We work closely with these partners to provide safety education, wellness resources, and other reporting guidance so they can help support the Snapchat community. 

Our Approach to the Spread of False Information

The period of time this transparency report covers further underscores how critical it is to ensure that the public has access to accurate and credible information. We regularly assess and invest in new means of protecting our community of Snapchatters from the spread of false information related to democratic processes, public health, and COVID-19.

In the first half of 2021, globally, we enforced against a combined total of 2,597 accounts and pieces of content for violations of our false information guidelines, almost half the number from the previous reporting period. Since content on Discover and Spotlight is proactively moderated to prevent the distribution of violating content at scale, the majority of these violations came from private Snaps and Stories, and most were made known to us through our own active moderation efforts as well as reports from Snapchatters.

We have always believed that when it comes to harmful content, it isn’t enough just to think about policies and enforcement; platforms also need to consider their fundamental architecture and product design. From the beginning, Snapchat was built differently from traditional social media platforms to support our primary use case of talking with close friends, rather than an open newsfeed where anyone has the right to distribute anything to anyone. Snapchat’s very design limits virality, which removes incentives for content that appeals to people’s worst instincts, thereby limiting concerns associated with the spread of illegal and harmful content.

This approach also carries into our work to prevent the spread of extremist content. During the reporting period, we removed five accounts for violations of our prohibition on terrorist and extremist content, a slight decrease from the last reporting cycle. At Snap, we regularly monitor developments in this space and seek to mitigate any potential vectors for abuse on our platform. Both our platform architecture and the design of our Group Chat functionality help limit the spread of harmful content and opportunities to organize. We offer Group Chats, but they are limited in size, are not recommended by algorithms, and are not discoverable on our platform by anyone who is not a member of that particular Group.
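To make those design constraints concrete, here is a minimal, hypothetical sketch; the size cap and the members-only visibility rule are illustrative assumptions, not Snapchat’s actual implementation or limits.

```python
from dataclasses import dataclass, field

GROUP_SIZE_LIMIT = 64  # assumed cap for illustration; the real limit is not stated here

@dataclass
class GroupChat:
    """Hypothetical model of a size-limited, members-only group."""
    members: set[str] = field(default_factory=set)

    def add_member(self, user: str) -> bool:
        # Groups are limited in size.
        if len(self.members) >= GROUP_SIZE_LIMIT:
            return False
        self.members.add(user)
        return True

    def is_visible_to(self, user: str) -> bool:
        # Groups are not algorithmically recommended or searchable:
        # only current members can discover or see them.
        return user in self.members
```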

During this period, we continued to proactively promote factual public safety information about COVID-19 to our community, including coverage from our Discover editorial partners; public service announcements (PSAs) and Q&As with public health officials, agencies, and medical experts; and creative tools, such as Augmented Reality Lenses and filters, all designed to remind Snapchatters of expert public health guidance. Earlier this year, as vaccines became available for young people in the U.S., we launched a new initiative with the White House to help Snapchatters answer common questions, and in July, we teamed up with the UK’s National Health Service on a similar effort.

Going forward, we are committed to making our transparency reports more comprehensive and helpful to the many stakeholders who care deeply about online safety, transparency, and multi-sector accountability. We are constantly evaluating how we can strengthen our efforts to combat harmful content and bad actors, and we are grateful to the many security and safety partners and collaborators who regularly help us improve.
