Snap Values
Transparency Report
1 January 2024 – 30 June 2024

Released: 5 December 2024

Updated: 5 December 2024

We publish this transparency report twice a year to provide insight into Snap’s safety efforts. We are committed to these efforts and continually strive to make these reports more comprehensive and informative for the many stakeholders who care deeply about our content moderation, law enforcement practices and the safety and wellbeing of the Snapchat community. 

This Transparency Report covers the first half of 2024 (January 1 – June 30). As with our previous reports, we share data about the global volume of in-app content and account-level reports our Trust & Safety teams received and enforced across specific categories of Community Guidelines violations; how we responded to requests from law enforcement and governments; and how we responded to notices of copyright and trademark infringement. We also provide country-specific insights in the files linked at the bottom of this page.

As part of our ongoing commitment to continually improve our transparency reports, we are also introducing new data highlighting our proactive efforts to detect and enforce against a wider range of violations of our Community Guidelines. We have included this data at both the global and country levels within this report and will continue to do so going forward. We have also corrected a labelling error in our previous reports: where we previously referred to “Total Content Enforced”, we now refer to “Total Enforcements” to reflect the fact that the data provided in the relevant columns includes both content-level and account-level enforcements.

For more information about our policies for combating potential online harms, and plans to continue evolving our reporting practices, please read our recent Safety & Impact blog about this Transparency Report. To find additional safety and privacy resources on Snapchat, see our About Transparency Reporting tab at the bottom of the page.

Please note that the most up-to-date version of this Transparency Report can be found in the EN-US locale.

Overview of Our Trust & Safety Teams' Actions to Enforce Our Community Guidelines

Our Trust & Safety teams enforce our Community Guidelines both proactively (through the use of automated tools) and reactively (in response to reports), as further detailed in the following sections of this report. In this reporting cycle (H1 2024), our Trust & Safety teams took the following enforcement actions: 

Total Enforcements | Total Unique Accounts Enforced
10,032,110 | 5,698,212

Below is a breakdown by type of Community Guidelines violation, including the median turnaround time between when we detected the violation (either proactively or upon receipt of a report) and when we took final action on the relevant content or account:

Policy Reason | Total Enforcements | Total Unique Accounts Enforced | Median Turnaround Time (minutes) From Detection To Final Action
Sexual Content | 3,860,331 | 2,099,512 | 2
Child Sexual Exploitation | 961,359 | 577,682 | 23
Harassment and Bullying | 2,716,966 | 2,019,439 | 7
Threats & Violence | 199,920 | 156,578 | 8
Self-Harm & Suicide | 15,910 | 14,445 | 10
False Information | 6,539 | 6,176 | 1
Impersonation | 8,798 | 8,575 | 2
Spam | 357,999 | 248,090 | 1
Drugs | 1,113,629 | 718,952 | 6
Weapons | 211,860 | 136,953 | 1
Other Regulated Goods | 247,535 | 177,643 | 8
Hate Speech | 324,478 | 272,025 | 27
Terrorism & Violent Extremism | 6,786 | 4,010 | 5

During the reporting period, we saw a Violative View Rate (VVR) of 0.01 percent, which means that out of every 10,000 Snap and Story views on Snapchat, 1 contained content found to violate our Community Guidelines.

Community Guidelines Violations Reported to Our Trust & Safety Teams

From 1 January – 30 June 2024, in response to in-app reports of violations of our Community Guidelines, Snap’s Trust & Safety teams took a total of 6,223,618 enforcement actions globally, including enforcements against 3,842,507 unique accounts. The median turnaround time for our Trust & Safety teams to take enforcement action in response to those reports was ~24 minutes. A breakdown per reporting category is provided below. 

Total Content & Account Reports | Total Enforcements | Total Unique Accounts Enforced
19,379,848 | 6,346,508 | 4,075,838

Policy Reason | Content & Account Reports | Total Enforcements | % of the Total Reports Enforced by Snap | Total Unique Accounts Enforced | Median Turnaround Time (minutes) From Detection To Final Action
Sexual Content | 5,251,375 | 2,042,044 | 32.20% | 1,387,749 | 4
Child Sexual Exploitation | 1,224,502 | 469,389 | 7.40% | 393,384 | 133
Harassment and Bullying | 6,377,555 | 2,702,024 | 42.60% | 2,009,573 | 7
Threats & Violence | 1,000,713 | 156,295 | 2.50% | 129,077 | 8
Self-Harm & Suicide | 307,660 | 15,149 | 0.20% | 13,885 | 10
False Information | 536,886 | 6,454 | 0.10% | 6,095 | 1
Impersonation | 678,717 | 8,790 | 0.10% | 8,569 | 2
Spam | 1,770,216 | 180,849 | 2.80% | 140,267 | 1
Drugs | 418,431 | 244,451 | 3.90% | 159,452 | 23
Weapons | 240,767 | 6,473 | 0.10% | 5,252 | 1
Other Regulated Goods | 606,882 | 199,255 | 3.10% | 143,560 | 8
Hate Speech | 768,705 | 314,134 | 4.90% | 263,923 | 27
Terrorism & Violent Extremism | 197,439 | 1,201 | <0.1% | 1,093 | 4

Our overall reporting volumes remained fairly stable in H1 2024, as compared to the previous six months. This cycle, we saw an increase in total enforcements and total unique accounts enforced by approximately 16%.

Over the last 12 months, Snap introduced new reporting mechanisms for users, which account for changes to our reported and enforced volumes and for increases in turnaround times in this reporting period (H1 2024). Specifically:

  • Group Chat Reporting: We introduced Group Chat Reporting on 13 October 2023, which enables users to report abuse occurring in a multi-person chat. This change impacted the makeup of our metrics across reporting categories (because some potential harms are more likely to occur in a chat context) and increased report actionability. 

  • Account Reporting Enhancements: We also evolved our Account Reporting feature to give reporting users an option to submit chat evidence when reporting an account suspected of being operated by a bad actor. This change, which provides us with greater evidence and context to assess account reports, launched on 29 February 2024. 


Chat Reports, and especially Group Chat Reports, are among the most complex and time-consuming to review, which drove up turnaround times across the board. 

Reports of suspected Child Sexual Exploitation & Abuse (CSEA), Harassment & Bullying and Hate Speech were particularly impacted by the two changes described above, and by shifts in the broader ecosystem. Specifically:

  • CSEA: We observed an increase in CSEA-related reports and enforcements in H1 2024. Specifically, we saw a 64% increase in total in-app reports by users, an 82% increase in total enforcements and a 108% increase in total unique accounts enforced. These increases are largely driven by the introduction of Group Chat and Account Reporting functionalities. Given the sensitive nature of this moderation queue, a select team of highly trained agents is assigned to review reports of potential CSEA-related violations. The influx of additional reports combined with our teams adapting to new trainings has resulted in an increase in turnaround times. Moving forward, we have increased the size of our global vendor teams significantly to reduce turnaround times and accurately enforce on reports of potential CSEA. We expect that our H2 2024 Transparency Report will reflect the fruits of this effort with a materially improved turnaround time. 

  • Harassment & Bullying: Based on reports, we have observed that Harassment & Bullying disproportionately occurs in chats, and particularly group chats. The improvements we introduced to Group Chat Reporting and Account Reporting help us take more comprehensive action when assessing reports in this reporting category. Additionally, as of this period, we require users to input a comment when submitting a harassment and bullying report. We review this comment to contextualise each report. Together, these changes led to material increases in total enforcements (+91%), total unique accounts enforced (+82%) and turnaround time (+245 mins) for corresponding reports. 

  • Hate Speech: In H1 2024, we observed increases in reported content, total enforcements and turnaround time for Hate Speech. Specifically, we saw a 61% increase in in-app reports, a 127% increase in total enforcements and a 125% increase in total unique accounts enforced. This was due, in part, to improvements in our chat reporting mechanisms (as previously discussed), and was further exacerbated by the geopolitical environment, particularly the continuation of the Israel-Hamas conflict. 

This reporting period, we saw a ~65% decrease in total enforcements and a ~60% decrease in total unique accounts enforced in response to reports of suspected Spam & Abuse, reflecting improvements in our proactive detection and enforcement tools. We saw a similar decline (~80%) in total enforcements in response to reports of content relating to Self-Harm & Suicide, reflecting our updated victim-centric approach, under which our Trust & Safety teams will, in appropriate cases, send relevant users self-help resources rather than take enforcement action against them. This approach was informed by members of our Safety Advisory Board, including a paediatric professor and medical doctor who specialises in interactive media and internet disorders.

Our Efforts to Proactively Detect and Enforce Against Violations of Our Community Guidelines

We deploy automated tools to proactively detect and, in some cases, enforce against violations of our Community Guidelines. These tools include hash-matching tools (including PhotoDNA and Google Child Sexual Abuse Imagery (CSAI) Match), abusive language detection tools (which detect and enforce based on an identified and regularly updated list of abusive keywords and emojis) and multi-modal artificial intelligence / machine learning technology. 

In H1 2024, we took the following enforcement actions after proactively detecting, through the use of automated tools, violations of our Community Guidelines:

Total Enforcements | Total Unique Accounts Enforced
3,685,602 | 1,845,125

Policy Reason | Total Enforcements | Total Unique Accounts Enforced | Median Turnaround Time (minutes) From Detection To Final Action
Sexual Content | 1,818,287 | 828,590 | <1
Child Sexual Exploitation | 491,970 | 188,877 | 1
Harassment and Bullying | 14,942 | 11,234 | 8
Threats & Violence | 43,625 | 29,599 | 9
Self-Harm & Suicide | 761 | 624 | 9
False Information | 85 | 81 | 10
Impersonation | 8 | 6 | 19
Spam | 177,150 | 110,551 | <1
Drugs | 869,178 | 590,658 | 5
Weapons | 205,387 | 133,079 | <1
Other Regulated Goods | 48,280 | 37,028 | 9
Hate Speech | 10,344 | 8,683 | 10
Terrorism & Violent Extremism | 5,585 | 2,951 | 21

Combating Child Sexual Exploitation & Abuse

Sexual exploitation of any member of our community, especially minors, is illegal, abhorrent and prohibited by our Community Guidelines. Preventing, detecting and eradicating Child Sexual Exploitation and Abuse (CSEA) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.

We use active technology detection tools, such as PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match, to identify known illegal images and videos of CSEA, respectively. In addition, in some cases, we use behavioural signals to enforce against other potentially illegal CSEA activity. We report CSEA-related content to the U.S. National Center for Missing & Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with domestic or international law enforcement, as required.

In the first half of 2024, we took the following actions upon detecting CSEA on Snapchat (either proactively or upon receiving a report):

Total Content Enforced | Total Accounts Disabled | Total Submissions to NCMEC*
1,228,929 | 242,306 | 417,842

*Note that each submission to NCMEC can contain multiple pieces of content. The total number of individual pieces of media submitted to NCMEC is equal to our total content enforced.

Our Efforts to Provide Resources and Support to Snapchatters in Need

We care deeply about the mental health and well-being of Snapchatters, which continues to inform our decisions to build Snapchat differently. As a platform designed for communications between and among real friends, we believe Snapchat can play a unique role in empowering friends to help each other in difficult times. This is why we have developed resources and support for Snapchatters in need. 

Our Here For You search tool shows resources from expert local partners when users search for certain topics related to mental health, anxiety, depression, stress, suicidal thoughts, grief and bullying. We have also developed a page dedicated to financial sextortion and other sexual risks and harms, in an effort to support those in distress. Our global list of safety resources is publicly available to all Snapchatters, in our Privacy, Safety & Policy Hub. 

When our Trust & Safety teams become aware of a Snapchatter in distress, they can forward self-harm prevention and support resources, and notify emergency response personnel when appropriate. The resources that we share are available on our global list of safety resources, and are publicly available to all Snapchatters.

Total Times Suicide Resources Shared
64,094

Appeals

Below we provide information about appeals we received from users requesting a review of our decision to lock their account:

Policy Reason | Total Appeals | Total Reinstatements | Total Decisions Upheld | Median Turnaround Time (days) to Process Appeals
TOTAL | 493,782 | 35,243 | 458,539 | 5
Sexual Content | 162,363 | 6,257 | 156,106 | 4
Child Sexual Exploitation | 102,585 | 15,318 | 87,267 | 6
Harassment and Bullying | 53,200 | 442 | 52,758 | 7
Threats & Violence | 4,238 | 83 | 4,155 | 5
Self-Harm & Suicide | 31 | 1 | 30 | 5
False Information | 3 | 0 | 3 | <1
Impersonation | 847 | 33 | 814 | 7
Spam | 19,533 | 5,090 | 14,443 | 9
Drugs | 133,478 | 7,598 | 125,880 | 4
Weapons | 4,678 | 136 | 4,542 | 6
Other Regulated Goods | 9,153 | 168 | 8,985 | 6
Hate Speech | 3,541 | 114 | 3,427 | 7
Terrorism & Violent Extremism | 132 | 3 | 129 | 9

Regional & Country Overview

This section provides an overview of our Trust & Safety teams’ actions to enforce our Community Guidelines, both proactively and in response to in-app reports of violations, in a sampling of geographic regions. Our Community Guidelines apply to all content on Snapchat – and all Snapchatters – across the globe, regardless of location.

Information for individual countries, including all EU Member States, is available for download via the attached CSV file.

Overview of our Trust & Safety Teams’ Actions to Enforce our Community Guidelines 

Region | Total Enforcements | Total Unique Accounts Enforced
North America | 3,828,389 | 2,117,048
Europe | 2,807,070 | 1,735,054
Rest of World | 3,396,651 | 1,846,110
Total | 10,032,110 | 5,698,212

Community Guidelines Violations Reported to our Trust & Safety Teams

Region | Content & Account Reports | Total Enforcements | Total Unique Accounts Enforced
North America | 5,916,815 | 2,229,465 | 1,391,304
Europe | 5,781,317 | 2,085,109 | 1,378,883
Rest of World | 7,681,716 | 2,031,934 | 1,319,934
Total | 19,379,848 | 6,346,508 | 4,090,121

Proactive Detection and Enforcement of our Community Guidelines

Region | Total Enforcements | Total Unique Accounts Enforced
North America | 1,598,924 | 837,012
Europe | 721,961 | 417,218
Rest of World | 1,364,717 | 613,969
Total | 3,685,602 | 1,868,199

Ads Moderation

Snap is committed to ensuring that all ads are fully compliant with our advertising policies. We believe in a responsible and respectful approach to advertising, creating a safe and enjoyable experience for all of our users. All advertisements are subject to our review and approval. In addition, we reserve the right to remove ads, including in response to user feedback, which we take seriously. 


Below we have included insight into our moderation for paid advertisements that are reported to us following their publication on Snapchat. Note that ads on Snapchat can be removed for a variety of reasons as outlined in Snap’s Advertising Policies, including deceptive content, adult content, violent or disturbing content, hate speech and intellectual property infringement. Additionally, you can now find Snapchat’s Ads Gallery in the navigation bar of this transparency report.

Total Ads Reported | Total Ads Removed
43,098 | 17,833