Snapchat Moderation, Enforcement, and Appeals

Community Guidelines Explainer Series

Updated: May 2024

Across Snapchat, we’re committed to advancing safety while respecting the privacy interests of our community. We take a balanced, risk-based approach to combating harms — combining transparent content moderation practices, consistent and equitable enforcement, and clear communication to hold ourselves accountable for applying our policies fairly.


Content Moderation


We’ve designed Snapchat with safety in mind, and this design plays a key role in preventing the spread of harmful content. Snapchat does not offer an open news feed where unvetted publishers or individuals have an opportunity to broadcast hate, misinformation, or violent content.

In addition to these design safeguards, we moderate our public content surfaces (such as Spotlight, Public Stories, and Maps) using a combination of automated tools, including machine learning, and dedicated teams of human reviewers who evaluate potentially inappropriate content in public posts.

On Spotlight, for example, where creators can submit creative and entertaining videos to share with the broader Snapchat community, all content is first reviewed automatically by artificial intelligence before gaining any distribution. Once a piece of content gains more viewership, it is then reviewed by human moderators before it is given the opportunity to reach a large audience. This layered approach to moderating content on Spotlight reduces the risk of spreading misinformation, hate speech, or other potentially harmful content, while promoting a fun, positive, and safe experience for everyone.
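For illustration, a layered pipeline of this kind might be structured roughly as follows. This is a minimal sketch only: the viewership threshold, the Post fields, and the keyword-based stand-in for the automated classifier are all hypothetical, not Snap's actual systems.

    from dataclasses import dataclass

    # Hypothetical values for illustration only; Snap's real thresholds,
    # classifiers, and review flows are not public.
    HUMAN_REVIEW_THRESHOLD = 1_000
    BLOCKED_TERMS = {"exampleslur", "examplescam"}  # placeholder list

    @dataclass
    class Post:
        caption: str
        view_count: int = 0
        ai_approved: bool = False
        human_approved: bool = False

    def automated_review(post: Post) -> bool:
        """Stage 1: every submission is screened automatically before it
        receives any distribution (a trivial stand-in for an ML classifier)."""
        return not any(term in post.caption.lower() for term in BLOCKED_TERMS)

    def may_reach_large_audience(post: Post) -> bool:
        """Stage 2: once a post gains enough viewership, it must also pass
        human moderation before being distributed to a large audience."""
        if not post.ai_approved:
            return False
        if post.view_count < HUMAN_REVIEW_THRESHOLD:
            return True  # limited distribution needs only the automated pass
        return post.human_approved

The key property of the layered design is that the expensive check (human review) is applied exactly where the risk is greatest: content that is about to reach a large audience.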

Similarly, editorial content produced by media companies, such as Publisher Stories or Shows, is subject to a set of content guidelines that prohibit the spread of misinformation, hate speech, conspiracy theories, violence, and many other categories of harmful content, holding these partners to elevated standards for safety and integrity. Additionally, we use proactive harm-detection technology on other public or high-visibility surfaces, such as Stories, to help identify harmful content, and we use keyword filtering to help prevent harmful content (such as accounts trying to advertise illicit drugs or other illegal content) from being returned in search results.

Across all of our product surfaces, users can report accounts and content for potential violations of our Community Guidelines. We make it easy for Snapchatters to submit a confidential report directly to our Trust & Safety team, who are trained to evaluate the report, take appropriate action according to our policies, and notify the reporting party of the outcome, typically within a matter of hours. For more information about reporting harmful content or behavior, visit this resource on our Support Site. You can also learn more about our efforts to identify and take down harmful content, and to promote wellness and safety on Snapchat, here.

Please do not abuse Snap’s reporting systems by making repeated, unfounded reports against others’ content or accounts, or by repeatedly reporting content or accounts that are permissible under our Community Guidelines. If you engage in this behavior, we will first give you a warning; if it continues, we will deprioritize reviewing reports from you for 90 days.
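In code, a safeguard like this could be sketched as a small state machine per reporter. The 90-day window comes from the policy above; the class name, fields, and structure are illustrative assumptions, not Snap's implementation.

    from datetime import datetime, timedelta

    DEPRIORITIZE_DAYS = 90  # matches the 90-day period described above

    class ReporterRecord:
        """Tracks unfounded reports from one account (illustrative only)."""

        def __init__(self) -> None:
            self.warned = False
            self.deprioritized_until: datetime | None = None

        def handle_unfounded_report(self, now: datetime) -> str:
            """The first offense draws a warning; continued offenses
            deprioritize the reporter's future reports for 90 days."""
            if not self.warned:
                self.warned = True
                return "warning"
            self.deprioritized_until = now + timedelta(days=DEPRIORITIZE_DAYS)
            return "deprioritized"

        def is_deprioritized(self, now: datetime) -> bool:
            return (self.deprioritized_until is not None
                    and now < self.deprioritized_until)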

Policy Enforcement at Snap

It’s important to us at Snap that our policies promote consistent and fair enforcement. For this reason, we consider a combination of factors to determine the appropriate penalties for violations of the Community Guidelines. The most important of these factors are the severity of the harm and any relevant history of previous violations by the Snapchatter.

We apply a risk-based approach to distinguish the most severe harms from other types of violations that may not rise to the same level of seriousness. For information about our enforcement of severe harms, and the types of violations that fall into that category, we’ve developed this resource.

Accounts that we determine are used primarily to violate our Community Guidelines or to perpetrate serious harms will be disabled immediately. Examples include accounts engaged in serious bullying or harassment, impersonation, fraud, or the promotion of extremist or terrorist activity, as well as accounts otherwise using Snap to engage in illegal activity.

For other violations of our Community Guidelines, Snap generally applies a three-part enforcement process:

  • Step one: the violating content is removed.

  • Step two: the Snapchatter receives a notification, indicating that they have violated our Community Guidelines, that their content has been removed, and that repeated violations will result in additional enforcement actions, including their account being disabled.

  • Step three: our team records a strike against the Snapchatter’s account.

A strike creates a record of violations by a particular Snapchatter. Every strike is accompanied by a notice to the Snapchatter; if a Snapchatter accrues too many strikes over a defined period of time, their account will be disabled.
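A strike ledger of this kind might be modeled roughly as follows. The strike limit and window length below are hypothetical placeholders; Snap does not publish these values, and the rest of the structure is likewise illustrative.

    from datetime import datetime, timedelta

    # Hypothetical values; Snap does not publish its exact strike
    # threshold or the length of the accrual window.
    STRIKE_LIMIT = 5
    STRIKE_WINDOW = timedelta(days=90)

    class StrikeLedger:
        """Records strikes against one account and checks whether the
        strikes accrued within the window warrant disabling it."""

        def __init__(self) -> None:
            self.strikes: list[datetime] = []

        def add_strike(self, when: datetime) -> None:
            self.strikes.append(when)
            # Every strike is accompanied by a notice to the Snapchatter.
            print("Notice: a strike has been recorded against your account.")

        def should_disable(self, now: datetime) -> bool:
            recent = [s for s in self.strikes if now - s <= STRIKE_WINDOW]
            return len(recent) >= STRIKE_LIMIT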

This strike system ensures that Snap applies its policies consistently and in a way that provides warning and education to users who violate our Community Guidelines. The primary goal of our policies is to ensure that everyone can enjoy using Snapchat in ways that reflect our values and mission; we have developed this enforcement framework to help support that goal at scale.


Notice and Appeals Processes

To ensure that Snapchatters have a clear understanding of why an action has been taken against their account, and to provide an opportunity to meaningfully dispute the enforcement outcome, we have established Notice and Appeals processes that safeguard the interests of our community while protecting Snapchatters’ rights.

To better understand why an enforcement action has been taken, please note that we apply our Community Guidelines and Terms of Service when evaluating whether to enforce penalties against an account, and we apply our Community Guidelines, Terms of Service, and Content Guidelines for Recommendation Eligibility when moderating Snaps posted to Discover and Spotlight.

For information about how our appeals processes work, we have developed support articles on account appeals and content appeals.

When Snapchat grants an appeal of an account lock, access to the Snapchatter’s account will be restored. Whether or not the appeal is successful, we will notify the appealing party of our decision in a timely manner.