Transparency Report
January 1, 2025 – June 30, 2025

Released: December 1, 2025
Updated: December 1, 2025

We publish this transparency report twice a year to provide insights into Snap’s safety efforts. As part of our commitment to safety and transparency, we continually strive to make these reports more comprehensive and informative for the many stakeholders who care deeply about our content moderation, our approach to law enforcement, and the safety and well-being of the Snapchat community. 

This Transparency Report covers the first half of 2025 (January 1 - June 30). We share global data about reports by users and proactive detection by Snap; enforcements by our Safety teams across specific categories of Community Guidelines violations; how we responded to requests from law enforcement and governments; and how we responded to notices of copyright and trademark infringement. We also provide country-specific insights in a series of linked pages.

To find additional safety and privacy resources on Snapchat, see our About Transparency Reporting tab at the bottom of the page.

Please note that the most up-to-date version of this Transparency Report is the English version.

Overview of Our Trust & Safety Teams' Actions to Enforce Our Community Guidelines

Our Safety teams enforce our Community Guidelines both proactively (through the use of automated detection tools) and reactively (in response to reports), as further detailed in the following sections of this report. In this reporting cycle (H1 2025), our Safety teams took the following numbers of enforcements:

| Total Enforcements | Total Unique Accounts Enforced |
| --- | --- |
| 9,674,414 | 5,794,201 |

Below is a breakdown by type of Community Guidelines violation, including the median “turnaround time” between when we detected the violation (either proactively or upon receipt of a report) and when we took final action on the relevant content or account:

| Policy Reason | Total Enforcements | Total Unique Accounts Enforced | Median Turnaround Time (minutes) From Detection To Final Action |
| --- | --- | --- | --- |
| Sexual Content | 5,461,419 | 3,233,077 | 1 |
| Child Sexual Exploitation & Abuse | 1,095,424 | 733,106 | 5 |
| Harassment and Bullying | 713,448 | 594,302 | 3 |
| Threats & Violence | 187,653 | 146,564 | 3 |
| Self-Harm & Suicide | 47,643 | 41,216 | 5 |
| False Information | 2,088 | 2,004 | 1 |
| Impersonation | 7,138 | 6,881 | <1 |
| Spam | 267,299 | 189,344 | 1 |
| Drugs | 1,095,765 | 726,251 | 7 |
| Weapons | 251,243 | 173,381 | 1 |
| Other Regulated Goods | 183,236 | 126,952 | 4 |
| Hate Speech | 343,051 | 284,817 | 6 |
| Terrorism & Violent Extremism | 10,970 | 6,783 | 2 |

The Total Enforcements data includes enforcements taken by Snap after review of in-app reports submitted through Snapchat.  This represents the vast majority of enforcements made by Snap’s Safety teams. This number excludes most enforcements made as a result of investigations based on reports made to Snap via our Support Site or other mechanisms (e.g., via email), or as a result of some proactive investigations undertaken by our Safety teams. These excluded enforcements represented less than 0.5% of enforcement volume in the first half of 2025.

During the reporting period, we saw a Violative View Rate (VVR) of 0.01 percent, which means that out of every 10,000 Snap and Story views on Snapchat, 1 contained content found to violate our Community Guidelines. Among enforcements for what we consider to be “Severe Harms,” we saw a VVR of 0.0003 percent. A breakdown of VVR by policy reason is provided in the table below, followed by a worked example of the calculation.

| Policy Reason | VVR |
| --- | --- |
| Sexual Content | 0.00482% |
| Child Sexual Exploitation & Abuse | 0.00096% |
| Harassment and Bullying | 0.00099% |
| Threats & Violence | 0.00176% |
| Self-Harm & Suicide | 0.00009% |
| False Information | 0.00002% |
| Impersonation | 0.00009% |
| Spam | 0.00060% |
| Drugs | 0.00047% |
| Weapons | 0.00083% |
| Other Regulated Goods | 0.00104% |
| Hate Speech | 0.00025% |
| Terrorism & Violent Extremism | 0.00002% |
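As a worked example of the VVR calculation, the minimal sketch below shows how a violative view rate translates into violating views per 10,000 (or per 1,000,000) views. The function name and inputs are illustrative only, not Snap’s internal tooling.

```python
def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Share of views that contained violating content, expressed as a percentage."""
    return 100.0 * violative_views / total_views

# A VVR of 0.01% corresponds to 1 violating view per 10,000 views:
print(violative_view_rate(violative_views=1, total_views=10_000))     # 0.01
# The "Severe Harms" VVR of 0.0003% corresponds to 3 violating views per 1,000,000 views:
print(violative_view_rate(violative_views=3, total_views=1_000_000))  # 0.0003
```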

Community Guidelines Violations Reported to Our Trust & Safety Teams

From January 1 - June 30, 2025, in response to 19,766,324 in-app reports of violations of our Community Guidelines, Snap’s Safety teams took a total of 6,278,446 enforcement actions globally, including enforcements against 4,104,624 unique accounts.  This in-app reporting volume excludes support site and email reports, which comprise less than 1% of total reporting volume.  The median turnaround time for our Safety teams to take enforcement action in response to those reports was ~2 minutes. A breakdown per policy reason is provided below. (Note: In prior reports, we sometimes referred to this as the "reporting category."  Going forward, we are using the term "policy reason,"  which we feel more accurately reflects the nature of the data – because our Safety teams strive to enforce according to the appropriate policy reason, regardless of the reporting category identified by the person submitting the report.)


| | Total Content & Account Reports | Total Enforcements | Total Unique Accounts Enforced |
| --- | --- | --- | --- |
| Total | 19,766,324 | 6,278,446 | 4,104,624 |

| Policy Reason | Total Content & Account Reports | Total Enforcements | Percentage of Total Enforcements | Total Unique Accounts Enforced | Median Turnaround Time (minutes) From Detection To Final Action |
| --- | --- | --- | --- | --- | --- |
| Sexual Content | 7,315,730 | 3,778,370 | 60.2% | 2,463,464 | 1 |
| Child Sexual Exploitation & Abuse | 1,627,097 | 695,679 | 11.1% | 577,736 | 10 |
| Harassment & Bullying | 4,103,797 | 700,731 | 11.2% | 584,762 | 3 |
| Threats & Violence | 997,346 | 147,162 | 2.3% | 120,397 | 2 |
| Self-Harm & Suicide | 350,775 | 41,150 | 0.7% | 36,657 | 3 |
| False Information | 606,979 | 2,027 | 0.0% | 1,960 | 1 |
| Impersonation | 745,874 | 7,086 | 0.1% | 6,837 | <1 |
| Spam | 1,709,559 | 122,499 | 2.0% | 94,837 | 1 |
| Drugs | 481,830 | 262,962 | 4.2% | 176,799 | 5 |
| Weapons | 271,586 | 39,366 | 0.6% | 32,316 | 1 |
| Other Regulated Goods | 530,449 | 143,098 | 2.3% | 98,023 | 3 |
| Hate Speech | 817,262 | 337,263 | 5.4% | 280,682 | 6 |
| Terrorism & Violent Extremism | 208,040 | 1,053 | 0.0% | 912 | 2 |

In the first half of 2025, we continued to reduce median turnaround times across all policy categories, cutting them by an average of more than 75% compared to the prior reporting period, to 2 minutes. This reduction was due in large part to a continued, concerted effort to prioritize reports for review based on severity of harm, together with expanded automated review.
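For clarity on how a median turnaround figure of this kind is derived, here is a minimal sketch assuming per-case detection and final-action timestamps; the data and field names are hypothetical, not drawn from Snap’s systems.

```python
from datetime import datetime
from statistics import median

# Hypothetical (detected_at, final_action_at) pairs for three enforced cases.
cases = [
    (datetime(2025, 3, 1, 12, 0), datetime(2025, 3, 1, 12, 1)),   # 1 minute
    (datetime(2025, 3, 1, 14, 0), datetime(2025, 3, 1, 14, 2)),   # 2 minutes
    (datetime(2025, 3, 2, 9, 30), datetime(2025, 3, 2, 9, 39)),   # 9 minutes
]

# Turnaround in minutes per case; the median is robust to the slow outlier.
turnaround_minutes = [(done - found).total_seconds() / 60 for found, done in cases]
print(median(turnaround_minutes))  # 2.0
```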

We also made several targeted changes to our safety efforts in the reporting period, which had an impact on the data reported here, including strengthening our policies on illicit activity involving weapons. We observed an increase in reports and enforcements in the Child Sexual Exploitation category, driven primarily by an increase in sexualized or sensitive content involving minors that violates our policies but is not illegal in the U.S. or subject to reporting to the U.S. National Center for Missing and Exploited Children (NCMEC). The increase in volume related to Sexual Content (and the decrease in volume related to Harassment) was driven by our reclassification of content associated with sexual harassment from Harassment to Sexual Content.

Our Efforts to Proactively Detect and Enforce Against Violations of Our Community Guidelines

Proactive Detection & Enforcement of Our Community Guidelines


We use automated tools to proactively detect and, in some cases, enforce against violations of our Community Guidelines. These tools include hash-matching technology (including PhotoDNA and Google’s Child Sexual Abuse Imagery (CSAI) Match), Google’s Content Safety API, and other proprietary technology designed to detect illegal and violative text and media, sometimes leveraging artificial intelligence and machine learning. Our proactive detection numbers routinely fluctuate as the result of user behavior changes, improvements to our detection capabilities, and changes to our policies.
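As a rough sketch of how hash matching works in general (this is not PhotoDNA, CSAI Match, or the Content Safety API, whose implementations are proprietary), media is fingerprinted and the fingerprint is compared against those of previously identified violating content:

```python
import hashlib

def fingerprint(media: bytes) -> str:
    # Stand-in fingerprint. Production systems use perceptual hashes that are
    # robust to re-encoding and small edits; SHA-256 is used here only to keep
    # this sketch self-contained and runnable.
    return hashlib.sha256(media).hexdigest()

def matches_known_violating_content(media: bytes, known_hashes: set[str]) -> bool:
    """True if the media's fingerprint matches a known violating fingerprint."""
    return fingerprint(media) in known_hashes

# Hypothetical usage: build the known-hash set from previously enforced media,
# then screen a new upload against it.
known = {fingerprint(b"previously identified violating media")}
print(matches_known_violating_content(b"new upload bytes", known))  # False
```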

In the first half of 2025, we took the following enforcement actions after proactively detecting violations of our Community Guidelines using automated detection tools:


| | Total Enforcements | Total Unique Accounts Enforced |
| --- | --- | --- |
| Total | 3,395,968 | 1,709,224 |

| Policy Reason | Total Enforcements | Total Unique Accounts Enforced | Median Turnaround Time (minutes) From Detection To Final Action |
| --- | --- | --- | --- |
| Sexual Content | 1,683,045 | 887,059 | 0 |
| Child Sexual Exploitation & Abuse | 399,756 | 162,017 | 2 |
| Harassment and Bullying | 12,716 | 10,412 | 8 |
| Threats & Violence | 40,489 | 27,662 | 6 |
| Self-Harm & Suicide | 6,493 | 4,638 | 7 |
| False Information | 61 | 44 | 20 |
| Impersonation | 52 | 44 | 34 |
| Spam | 144,800 | 96,500 | 0 |
| Drugs | 832,803 | 578,738 | 7 |
| Weapons | 211,877 | 144,455 | 0 |
| Other Regulated Goods | 40,139 | 31,408 | 8 |
| Hate Speech | 5,788 | 4,518 | 6 |
| Terrorism & Violent Extremism | 9,917 | 5,899 | 5 |

Combating Child Sexual Exploitation & Abuse

Sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating child sexual exploitation and abuse (CSEA) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.

We use automated detection tools to help identify CSEA-related content. These tools include hash-matching tools (including PhotoDNA and Google’s CSAI Match, to identify known illegal images and videos of CSEA, respectively) and Google’s Content Safety API (to identify novel, “never-before-hashed” illegal imagery). In addition, in some cases, we use behavioral signals to enforce against other suspected CSEA activity. We report CSEA-related content to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC then coordinates with domestic or international law enforcement, as necessary.
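Schematically, the paragraph above describes a layered approach: hash matching for known content, classification for novel imagery, and behavioral signals for other suspected activity. The sketch below mirrors that ordering; every function is an illustrative placeholder, not Snap’s or Google’s actual interface.

```python
import hashlib

def fingerprint(media: bytes) -> str:
    # Placeholder fingerprint (real hash-matching tools use perceptual hashes).
    return hashlib.sha256(media).hexdigest()

def novel_imagery_score(media: bytes) -> float:
    # Placeholder for a classifier of never-before-hashed imagery.
    return 0.0

def triage_csea(media: bytes, known_hashes: set[str]) -> str:
    """Illustrative layered triage: known-content match first, then novel-content
    classification; confirmed content is enforced and reported to NCMEC."""
    if fingerprint(media) in known_hashes:
        return "enforce content, disable account, submit report to NCMEC"
    if novel_imagery_score(media) >= 0.99:
        return "escalate for expert human review"
    return "no match; behavioral signals may still trigger investigation"

print(triage_csea(b"example upload", known_hashes=set()))
```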

In the first half of 2025, we took the following actions upon identifying CSEA on Snapchat (either proactively or upon receiving a report):

| Total Content Enforced | Total Accounts Disabled | Total Submissions to NCMEC* |
| --- | --- | --- |
| 994,337 | 187,387 | 321,587 |

*Note that each submission to NCMEC can contain multiple pieces of content. The total number of individual pieces of media submitted to NCMEC equals our total content enforced (994,337), an average of roughly three pieces of media per submission.

Our Efforts to Provide Resources and Support to Snapchatters in Need

Snapchat empowers friends to help each other in difficult times by providing resources and support for Snapchatters in need. 

Our Here For You search tool provides resources from experts when users search for certain topics related to mental health, anxiety, depression, stress, suicidal thoughts, grief and bullying. We have also developed a page dedicated to combating financially motivated sexual extortion and other sexual risks and harms, in an effort to support those in distress.

When our Safety teams become aware of a Snapchatter in distress, they are equipped to provide self-harm prevention and support resources, and to notify emergency services, as needed. The resources we share are available on our global list of safety resources, which is publicly available to all Snapchatters at our Privacy, Safety & Policy Hub.

| Total Times Suicide Resources Shared |
| --- |
| 36,162 |

Appeals

Below we provide information about appeals we received from users requesting a review of our decision to lock their account for Community Guidelines violations in the first half of 2025:

| Policy Reason | Total Appeals | Total Reinstatements | Total Decisions Upheld | Median Turnaround Time (days) to Process Appeals |
| --- | --- | --- | --- | --- |
| Total | 437,855 | 22,142 | 415,494 | 1 |
| Sexual Content | 134,358 | 6,175 | 128,035 | 1 |
| Child Sexual Exploitation & Abuse* | 89,493 | 4,179 | 85,314 | <1 |
| Harassment and Bullying | 42,779 | 281 | 42,496 | 1 |
| Threats & Violence | 3,987 | 77 | 3,909 | 1 |
| Self-Harm & Suicide | 145 | 2 | 143 | 1 |
| False Information | 4 | 0 | 4 | 1 |
| Impersonation | 1,063 | 33 | 1,030 | <1 |
| Spam | 13,730 | 3,140 | 10,590 | 1 |
| Drugs | 128,222 | 7,749 | 120,409 | 1 |
| Weapons | 10,941 | 314 | 10,626 | 1 |
| Other Regulated Goods | 9,719 | 124 | 9,593 | 1 |
| Hate Speech | 3,310 | 67 | 3,242 | 1 |
| Terrorism & Violent Extremism | 104 | 1 | 103 | 1 |

Regional & Country Overview

This section provides an overview of our Safety teams’ actions to enforce our Community Guidelines, both proactively and in response to in-app reports of violations, in a sampling of geographic regions. Our Community Guidelines apply to all content on Snapchat—and to all Snapchatters—across the globe, regardless of location.

Information for individual countries, including all EU Member States, is available for download via the attached CSV file.



Overview of Our Safety Teams’ Actions to Enforce Our Community Guidelines 

| Region | Total Enforcements | Total Unique Accounts Enforced |
| --- | --- | --- |
| North America | 3,468,315 | 2,046,888 |
| Europe | 2,815,474 | 1,810,223 |
| Rest of World | 3,390,625 | 1,937,090 |
| Total | 9,674,414 | 5,794,201 |

Community Guidelines Violations Reported to Our Safety Teams

| Region | Content & Account Reports | Total Enforcements | Total Unique Accounts Enforced |
| --- | --- | --- | --- |
| North America | 5,762,412 | 2,125,819 | 1,359,763 |
| Europe | 5,961,962 | 2,144,828 | 1,440,907 |
| Rest of World | 8,041,950 | 2,007,799 | 1,316,070 |
| Total | 19,766,324 | 6,278,446 | 4,116,740 |

Proactive Detection and Enforcement of Our Community Guidelines

| Region | Total Enforcements | Total Unique Accounts Enforced |
| --- | --- | --- |
| North America | 1,342,496 | 785,067 |
| Europe | 670,646 | 422,012 |
| Rest of World | 1,382,826 | 696,364 |
| Total | 3,395,968 | 1,709,224 |

Ads Moderation

Snap is committed to ensuring that all ads are fully compliant with our advertising policies. We believe in a responsible approach to advertising, creating a safe experience for all Snapchatters. All ads are subject to our review and approval. In addition, we reserve the right to remove ads, including in response to user feedback, which we take seriously. 


Below we provide insight into our moderation of paid advertisements that are reported to us after they are published on Snapchat. Note that ads on Snapchat can be removed for a variety of reasons outlined in Snap’s Advertising Policies, including deceptive content, adult content, violent or disturbing content, hate speech, and intellectual property infringement. Additionally, you can find Snapchat’s Ads Gallery on values.snap.com under the “Transparency” tab.

| Total Ads Reported | Total Ads Removed |
| --- | --- |
| 67,789 | 16,410 |