June 20, 2025
We publish this transparency report twice a year to provide insights into Snap’s safety efforts. As part of our commitment to safety and transparency, we continually strive to make these reports more comprehensive and informative for the many stakeholders who care deeply about our content moderation, law enforcement practices, and the safety and well-being of the Snapchat community.
This Transparency Report covers the second half of 2024 (July 1 - December 31). We share global data about reports by users and proactive detection by Snap; enforcements by our Safety teams across specific categories of Community Guidelines violations; how we responded to requests from law enforcement and governments; and how we responded to notices of copyright and trademark infringement. We also provide country-specific insights in a series of linked pages.
To find additional safety and privacy resources on Snapchat, see our About Transparency Reporting tab at the bottom of the page.
Please note that the most up-to-date version of this Transparency Report is the English version.
Overview of Our Trust & Safety Teams' Actions to Enforce Our Community Guidelines
Our Safety teams enforce our Community Guidelines both proactively (through the use of automated detection tools) and reactively (in response to reports), as further detailed in the following sections of this report. In this reporting cycle (H2 2024), our Safety teams took the following enforcement actions:
| Total Enforcements | Total Unique Accounts Enforced |
| --- | --- |
| 10,032,110 | 5,698,212 |
Below is a breakdown by type of Community Guidelines violation, including the median "turnaround time" between when we detected the violation (either proactively or upon receipt of a report) and when we took final action on the relevant content or account (an illustrative sketch of this computation follows the table):
| Policy Reason | Total Enforcements | Total Unique Accounts Enforced | Median Turnaround Time (minutes) From Detection To Final Action |
| --- | --- | --- | --- |
| Sexual Content | 3,860,331 | 2,099,512 | 2 |
| Child Sexual Exploitation | 961,359 | 577,682 | 23 |
| Harassment and Bullying | 2,716,966 | 2,019,439 | 7 |
| Threats & Violence | 199,920 | 156,578 | 8 |
| Self-Harm & Suicide | 15,910 | 14,445 | 10 |
| False Information | 6,539 | 6,176 | 1 |
| Impersonation | 8,798 | 8,575 | 2 |
| Spam | 357,999 | 248,090 | 1 |
| Drugs | 1,113,629 | 718,952 | 6 |
| Weapons | 211,860 | 136,953 | 1 |
| Other Regulated Goods | 247,535 | 177,643 | 8 |
| Hate Speech | 324,478 | 272,025 | 27 |
| Terrorism & Violent Extremism | 6,786 | 4,010 | 5 |
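The "median turnaround time" above is the middle value of the per-item gaps between detection and final action. A minimal sketch of that computation, using hypothetical timestamps rather than Snap's internal data:

```python
from datetime import datetime
from statistics import median

# Hypothetical (detected, final_action) timestamp pairs for three enforced items.
events = [
    (datetime(2024, 7, 1, 12, 0), datetime(2024, 7, 1, 12, 2)),
    (datetime(2024, 7, 1, 13, 0), datetime(2024, 7, 1, 13, 23)),
    (datetime(2024, 7, 2, 9, 30), datetime(2024, 7, 2, 9, 37)),
]

# Turnaround per item, in minutes, from detection to final action.
turnarounds = [(acted - detected).total_seconds() / 60 for detected, acted in events]

# The median (here, 7 minutes) is robust to a few slow outliers, unlike the mean.
print(f"median turnaround: {median(turnarounds):.0f} minutes")
```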
During the reporting period, we saw a Violative View Rate (VVR) of 0.01 percent, which means that out of every 10,000 Snap and Story views on Snapchat, 1 contained content found to violate our Community Guidelines.
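Expressed as a formula (an illustrative restatement of the definition above, not notation used by Snap):

$$\mathrm{VVR} = \frac{\text{views containing violating content}}{\text{total Snap and Story views}} \times 100\%, \qquad 0.01\% = \frac{1}{10{,}000}$$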
Community Guidelines Violations Reported to Our Trust & Safety Teams
From July 1 to December 31, 2024, in response to in-app reports of violations of our Community Guidelines, Snap’s Safety teams took a total of 6,346,508 enforcement actions globally, including enforcements against 4,075,838 unique accounts. The median turnaround time for our Safety teams to take enforcement action in response to those reports was ~6 minutes. A breakdown by reporting category is provided below.
| Total Content & Account Reports | Total Enforcements | Total Unique Accounts Enforced |
| --- | --- | --- |
| 19,379,848 | 6,346,508 | 4,075,838 |
| Policy Reason | Content & Account Reports | Total Enforcements | % of the Total Reports Enforced by Snap | Total Unique Accounts Enforced | Median Turnaround Time (minutes) From Detection To Final Action |
| --- | --- | --- | --- | --- | --- |
| Sexual Content | 5,251,375 | 2,042,044 | 32.20% | 1,387,749 | 4 |
| Child Sexual Exploitation | 1,224,502 | 469,389 | 7.40% | 393,384 | 133 |
| Harassment and Bullying | 6,377,555 | 2,702,024 | 42.60% | 2,009,573 | 7 |
| Threats & Violence | 1,000,713 | 156,295 | 2.50% | 129,077 | 8 |
| Self-Harm & Suicide | 307,660 | 15,149 | 0.20% | 13,885 | 10 |
| False Information | 536,886 | 6,454 | 0.10% | 6,095 | 1 |
| Impersonation | 678,717 | 8,790 | 0.10% | 8,569 | 2 |
| Spam | 1,770,216 | 180,849 | 2.80% | 140,267 | 1 |
| Drugs | 418,431 | 244,451 | 3.90% | 159,452 | 23 |
| Weapons | 240,767 | 6,473 | 0.10% | 5,252 | 1 |
| Other Regulated Goods | 606,882 | 199,255 | 3.10% | 143,560 | 8 |
| Hate Speech | 768,705 | 314,134 | 4.90% | 263,923 | 27 |
| Terrorism & Violent Extremism | 197,439 | 1,201 | <0.1% | 1,093 | 4 |
Compared to the prior reporting period, we reduced median turnaround times across all policy categories by an average of 90%. This reduction was due in large part to a concerted effort to expand our review capacity and to improve how we prioritize reports based on severity of harm. We also made several targeted changes to our safety efforts during the reporting period that affected the data reported here, including:
- expanding our enforcement against accounts whose usernames and display names violate our Community Guidelines;
- introducing increased reporting options and protections for Communities on Snapchat; and
- adding in-app reporting options for additional types of media, such as voice notes.
These changes, along with other safety efforts and external forces, particularly affected certain policy areas compared to the prior reporting period: content related to suspected Child Sexual Exploitation & Abuse (CSEA), Harmful False Information, and Spam. Specifically:
CSEA: In the second half of 2024, we observed a 12% decrease in CSEA-related reports and reduced our median turnaround time for responding to reported CSEA by 99%. These trends were driven largely by continued advancements in our proactive detection efforts, which enabled us to remove CSEA content before it could be reported to us, and by improvements in our processes for reviewing and acting on CSEA reports more efficiently. Even with these improvements, our CSEA turnaround time remains higher than in other policy areas because this content is subject to a specialized process that includes double review by a select team of specially trained agents.
Harmful False Information: We observed a 26% increase in the volume of reports related to Harmful False Information, driven primarily by political events, including the November 2024 US election.
Spam: This reporting period, we saw a ~50% decrease in total enforcements and a ~46% decrease in total unique accounts enforced in response to reports of suspected Spam, reflecting improvements in our proactive detection and enforcement tools. This continues our effort to target spam through account-level signals and to remove spam actors earlier in their activity on the platform. That effort was already underway during the prior reporting period, when total enforcements and total unique accounts enforced for Spam decreased by ~65% and ~60%, respectively.
Our Efforts to Proactively Detect and Enforce Against Violations of Our Community Guidelines
We use automated tools to proactively detect and, in some cases, enforce against violations of our Community Guidelines. These tools include hash-matching technology (including PhotoDNA and Google’s Child Sexual Abuse Imagery (CSAI) Match), Google’s Content Safety API, and other custom technology designed to detect abusive text and media, sometimes leveraging artificial intelligence and machine learning.
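As a conceptual illustration of how hash-matching works in general (this is not Snap's implementation, and production systems such as PhotoDNA use perceptual hashes that survive re-encoding and minor edits), the core lookup can be sketched with an exact cryptographic hash:

```python
import hashlib

# Fingerprints of known violating media, e.g. from shared industry hash lists.
# Real deployments use perceptual hashes (PhotoDNA, CSAI Match); SHA-256 is
# used here only to keep the sketch self-contained, and it matches exact bytes.
KNOWN_VIOLATING_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(media_bytes: bytes) -> str:
    """Compute a stable fingerprint for uploaded media."""
    return hashlib.sha256(media_bytes).hexdigest()

def is_known_violation(media_bytes: bytes) -> bool:
    """True if the media's fingerprint matches a known violating item."""
    return fingerprint(media_bytes) in KNOWN_VIOLATING_HASHES

upload = b"test"  # stand-in for uploaded media bytes; hashes to the entry above
print(is_known_violation(upload))  # True -> route to enforcement review
```

A match against a curated hash list is high-precision by construction, which helps explain why several proactively detected categories below show sub-minute median turnaround times.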
In the second half of 2024, we took the following enforcement actions after proactively detecting violations of our Community Guidelines using automated detection tools:
| Total Enforcements | Total Unique Accounts Enforced |
| --- | --- |
| 3,685,602 | 1,845,125 |
| Policy Reason | Total Enforcements | Total Unique Accounts Enforced | Median Turnaround Time (minutes) From Detection To Final Action |
| --- | --- | --- | --- |
| Sexual Content | 1,818,287 | 828,590 | <1 |
| Child Sexual Exploitation | 491,970 | 188,877 | 1 |
| Harassment and Bullying | 14,942 | 11,234 | 8 |
| Threats & Violence | 43,625 | 29,599 | 9 |
| Self-Harm & Suicide | 761 | 624 | 9 |
| False Information | 85 | 81 | 10 |
| Impersonation | 8 | 6 | 19 |
| Spam | 177,150 | 110,551 | <1 |
| Drugs | 869,178 | 590,658 | 5 |
| Weapons | 205,387 | 133,079 | <1 |
| Other Regulated Goods | 48,280 | 37,028 | 9 |
| Hate Speech | 10,344 | 8,683 | 10 |
| Terrorism & Violent Extremism | 5,585 | 2,951 | 21 |
Combatting Child Sexual Exploitation & Abuse
Sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating CSEA on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.
We use proactive detection technology to help identify CSEA-related content. These tools include hash-matching tools (PhotoDNA and Google’s CSAI Match, which identify known illegal images and videos of CSEA, respectively) and Google’s Content Safety API (which identifies novel, "never-before-hashed" illegal imagery). In addition, in some cases, we use behavioral signals to enforce against other suspected CSEA activity. We report CSEA-related content to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC then coordinates with domestic or international law enforcement, as necessary.
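The paragraph above describes two complementary tracks: hash matching for known material, and classifier-based detection for novel imagery that is escalated to specialist review. A simplified sketch of that routing logic, with all names and thresholds hypothetical:

```python
from dataclasses import dataclass

REVIEW_THRESHOLD = 0.8  # hypothetical classifier score that triggers review

@dataclass
class MediaItem:
    media_id: str
    media_hash: str          # stand-in for a PhotoDNA/CSAI Match fingerprint
    classifier_score: float  # stand-in for a Content Safety API-style score

KNOWN_CSEA_HASHES = {"deadbeef"}  # hypothetical known-content hash list

def triage(item: MediaItem) -> str:
    """Route media to a queue, mirroring the two-track model described above."""
    if item.media_hash in KNOWN_CSEA_HASHES:
        # Known illegal content: enforce and prepare a report to NCMEC.
        return "enforce_and_report"
    if item.classifier_score >= REVIEW_THRESHOLD:
        # Novel imagery flagged by the classifier: escalate to trained specialists.
        return "specialist_review"
    return "no_action"

print(triage(MediaItem("m1", "deadbeef", 0.10)))  # enforce_and_report
print(triage(MediaItem("m2", "cafef00d", 0.95)))  # specialist_review
```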
In the second half of 2024, we took the following actions upon identifying CSEA on Snapchat (either proactively or upon receiving a report):
| Total Content Enforced | Total Accounts Disabled | Total Submissions to NCMEC* |
| --- | --- | --- |
| 1,228,929 | 242,306 | 417,842 |
*Note that each submission to NCMEC can contain multiple pieces of content. The total number of individual pieces of media submitted to NCMEC equals our total content enforced (on average, roughly three pieces of content per submission).
Our Efforts to Provide Resources and Support to Snapchatters in Need
Snapchat empowers friends to help each other in difficult times by providing resources and support for Snapchatters in need.
Our Here For You search tool provides resources from experts when users search for certain topics related to mental health, anxiety, depression, stress, suicidal thoughts, grief, and bullying. We have also developed a page dedicated to financial sextortion and other sexual risks and harms, in an effort to support those in distress. Our global list of safety resources is publicly available to all Snapchatters in our Privacy, Safety & Policy Hub.
When our Safety teams become aware of a Snapchatter in distress, they are equipped to provide self-harm prevention and support resources, and to notify emergency services, as needed. The resources we share are available on our global list of safety resources, which is available to all Snapchatters.
Total Times Suicide Resources Shared: 64,094
Appeals
Below we provide information about appeals we received from users requesting a review of our decision to lock their account in the second half of 2024:
| Policy Reason | Total Appeals | Total Reinstatements | Total Decisions Upheld | Median Turnaround Time (days) to Process Appeals |
| --- | --- | --- | --- | --- |
| TOTAL | 493,782 | 35,243 | 458,539 | 5 |
| Sexual Content | 162,363 | 6,257 | 156,106 | 4 |
| Child Sexual Exploitation | 102,585 | 15,318 | 87,267 | 6 |
| Harassment and Bullying | 53,200 | 442 | 52,758 | 7 |
| Threats & Violence | 4,238 | 83 | 4,155 | 5 |
| Self-Harm & Suicide | 31 | 1 | 30 | 5 |
| False Information | 3 | 0 | 3 | <1 |
| Impersonation | 847 | 33 | 814 | 7 |
| Spam | 19,533 | 5,090 | 14,443 | 9 |
| Drugs | 133,478 | 7,598 | 125,880 | 4 |
| Weapons | 4,678 | 136 | 4,542 | 6 |
| Other Regulated Goods | 9,153 | 168 | 8,985 | 6 |
| Hate Speech | 3,541 | 114 | 3,427 | 7 |
| Terrorism & Violent Extremism | 132 | 3 | 129 | 9 |
Regional & Country Overview
This section provides an overview of our Safety teams’ actions to enforce our Community Guidelines, both proactively and in response to in-app reports of violations, in a sampling of geographic regions. Our Community Guidelines apply to all content on Snapchat—and to all Snapchatters—across the globe, regardless of location.
Information for individual countries, including all EU Member States, is available for download via the attached CSV file.
Combined enforcements (proactive and reactive) by region:

| Region | Total Enforcements | Total Unique Accounts Enforced |
| --- | --- | --- |
| North America | 3,828,389 | 2,117,048 |
| Europe | 2,807,070 | 1,735,054 |
| Rest of World | 3,396,651 | 1,846,110 |
| Total | 10,032,110 | 5,698,212 |
Enforcements in response to in-app reports, by region:

| Region | Content & Account Reports | Total Enforcements | Total Unique Accounts Enforced |
| --- | --- | --- | --- |
| North America | 5,916,815 | 2,229,465 | 1,391,304 |
| Europe | 5,781,317 | 2,085,109 | 1,378,883 |
| Rest of World | 7,681,716 | 2,031,934 | 1,319,934 |
| Total | 19,379,848 | 6,346,508 | 4,090,121 |
Enforcements following proactive detection, by region:

| Region | Total Enforcements | Total Unique Accounts Enforced |
| --- | --- | --- |
| North America | 1,598,924 | 837,012 |
| Europe | 721,961 | 417,218 |
| Rest of World | 1,364,717 | 613,969 |
| Total | 3,685,602 | 1,868,199 |
Ads Moderation
Snap is committed to ensuring that all ads fully comply with our advertising policies. We believe in a responsible approach to advertising that creates a safe experience for all Snapchatters. All ads are subject to our review and approval, and we reserve the right to remove ads, including in response to user feedback, which we take seriously.
Below we provide insight into our moderation of paid advertisements reported to us following their publication on Snapchat. Note that ads on Snapchat can be removed for a variety of reasons outlined in Snap’s Advertising Policies, including deceptive content, adult content, violent or disturbing content, hate speech, and intellectual property infringement. Additionally, you can now find Snapchat’s Ads Gallery in Snap’s transparency hub, accessible directly via the navigation bar.
| Total Ads Reported | Total Ads Removed |
| --- | --- |
| 43,098 | 17,833 |