Snapchat Moderation, Enforcement, and Appeals
Community Guidelines Explainer Series
Updated: March 2025
Across Snapchat, we’re committed to advancing safety while respecting the privacy interests of our community. We take a balanced, risk-based approach to combating potential harms — combining transparent content moderation practices, consistent and equitable enforcement, and clear communication to hold ourselves accountable for applying our policies fairly.
Content Moderation
We’ve designed Snapchat with safety in mind, and that design plays a key role in preventing the spread of potentially harmful content. For example, Snapchat does not offer an open news feed where creators can broadcast potentially harmful or violating content, and friends lists are private.
In addition to these design safeguards, we use a combination of automated tools and human review to moderate our public content surfaces (such as Spotlight, Public Stories, and Maps). Content that is recommended on public surfaces is also held to a higher standard and must meet additional guidelines. On Spotlight, for example, where creators can submit creative and entertaining videos to share with the broader Snapchat community, all content is first reviewed automatically by artificial intelligence and other technology before it receives any distribution. Once content gains more viewership, it is then reviewed by human moderators before it becomes eligible for recommendation to a large audience. This layered approach to moderating Spotlight content reduces the risk that potentially harmful content will spread, and helps promote a fun and positive experience for everyone.
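To make the layered approach concrete, here is a minimal sketch of how such a review gate could be structured. It is an illustration only: the class names, the placeholder screening functions, and the HUMAN_REVIEW_THRESHOLD viewership trigger are all assumptions made for the example, not details of Snap’s actual systems.

```python
# Hypothetical sketch of a layered moderation gate: automated screening
# first, human review before wide recommendation. All names and numbers
# are illustrative assumptions, not Snap's actual systems or parameters.
from dataclasses import dataclass
from enum import Enum, auto

class Distribution(Enum):
    NONE = auto()         # rejected before any distribution
    LIMITED = auto()      # passed automated screening only
    RECOMMENDED = auto()  # passed human review; eligible for wide reach

HUMAN_REVIEW_THRESHOLD = 10_000  # assumed viewership level that triggers human review

@dataclass
class Submission:
    content_id: str
    views: int = 0

def automated_screen(sub: Submission) -> bool:
    """Stand-in for the AI/keyword screening applied to every submission."""
    return True  # a real system would call a trained classifier here

def human_review(sub: Submission) -> bool:
    """Stand-in for a human moderator's decision on higher-reach content."""
    return True  # a real system would consult a moderation queue here

def distribution_tier(sub: Submission) -> Distribution:
    # Gate 1: nothing is distributed before automated screening.
    if not automated_screen(sub):
        return Distribution.NONE
    # Gate 2: content stays in limited distribution until it gains
    # enough viewership to warrant a human check.
    if sub.views < HUMAN_REVIEW_THRESHOLD:
        return Distribution.LIMITED
    # Gate 3: only human-approved content becomes eligible for
    # recommendation to a large audience.
    return Distribution.RECOMMENDED if human_review(sub) else Distribution.LIMITED
```

The point of the structure is that each jump in reach requires passing another, stricter check, so an item that slips past automated screening still faces human review before it can reach a large audience.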
Similarly, editorial content produced by media companies, such as Publisher Stories or Shows, is held to higher standards for safety and integrity. Additionally, we use proactive harm-detection technology on other public or high-visibility surfaces, such as Stories, to help identify potentially harmful content, and we use keyword filtering to help prevent such content and accounts (for instance, accounts trying to advertise illicit drugs or other illegal items) from appearing in search results.
Across all of our product surfaces, users can report accounts and content for potential violations of our policies. We make it easy for Snapchatters to submit a confidential report directly to our safety teams, who are trained to evaluate the report, take appropriate action according to our policies, and notify the reporting party of the outcome, typically within a matter of hours. For more information about reporting potentially harmful content or behavior, visit this resource on our Support Site. You can also learn more about our efforts to identify and take down violating content, and to promote safety and well-being on Snapchat, here. If you have a question or concern about the outcome of a report you’ve submitted, you can follow up via our Support Site.
When you submit a report, you are attesting that it is complete and accurate to the best of your knowledge. Please do not abuse Snap’s reporting systems, including by repeatedly sending duplicative or otherwise “spammy” reports. If you engage in this behavior, we reserve the right to deprioritize review of your reports. If you frequently submit unfounded reports against others’ content or accounts, we may, after sending you a warning, suspend review of your reports for up to one year and, in egregious situations, disable your account.
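The escalation path described here (deprioritize, warn, suspend, and in egregious situations disable) can be read as a small state machine. The sketch below is purely illustrative: the statuses and transition rules are assumptions made for the example, not Snap’s internal logic.

```python
# Hypothetical escalation ladder for reporting-system abuse, mirroring
# the policy above. Statuses and transitions are illustrative assumptions.
from enum import Enum, auto

class ReporterStatus(Enum):
    NORMAL = auto()
    DEPRIORITIZED = auto()  # duplicative or "spammy" reports
    WARNED = auto()         # frequent unfounded reports: warning sent
    SUSPENDED = auto()      # report review suspended, up to one year
    DISABLED = auto()       # reserved for egregious situations

def escalate(status: ReporterStatus, *, spammy: bool,
             unfounded: bool, egregious: bool) -> ReporterStatus:
    if egregious:
        return ReporterStatus.DISABLED
    if unfounded:
        # Per the policy, a warning precedes any suspension of review.
        if status in (ReporterStatus.WARNED, ReporterStatus.SUSPENDED):
            return ReporterStatus.SUSPENDED
        return ReporterStatus.WARNED
    if spammy:
        return ReporterStatus.DEPRIORITIZED
    return status
```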
Policy Enforcement at Snap
It’s important to us at Snap that our policies promote consistent and fair enforcement. We consider context, the severity of the harm, and the history of the account to determine the appropriate penalties for violations of our Community Guidelines.
We promptly disable accounts that we determine have engaged in severe harms. Examples of severe harms include sexual exploitation or abuse of children, attempted distribution of illicit drugs, and promotion of violent extremist or terrorist activity.
We also disable accounts created or used primarily to violate our Community Guidelines, even for less severe harms. For example, an account that posts violating content and has a violating username or display name may be promptly disabled.
For other violations of our Community Guidelines, Snap generally follows a three-part enforcement process:
Step one: the violating content is removed.
Step two: the Snapchatter receives a notification indicating that they have violated our Community Guidelines, that their content has been removed, and that repeated violations will result in additional enforcement actions, including their account being disabled.
Step three: our team records a “strike” against the Snapchatter’s account.
A strike creates a record of violations by a particular Snapchatter. Strikes are accompanied by a notice to the Snapchatter. If a Snapchatter accrues too many strikes over a defined period of time, their account will be disabled. This strike system helps ensure that we apply the Community Guidelines consistently, and in a way that provides warning and education to users.
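As a rough illustration, the strike system can be modeled as a rolling window of recent violations. The sketch below assumes a hypothetical 90-day window and five-strike limit; Snap does not publish the actual period or threshold, so treat every number and name here as invented for the example.

```python
# Minimal sketch of a rolling-window strike record. The window length
# and strike limit are illustrative assumptions, not published values.
from datetime import datetime, timedelta

STRIKE_WINDOW = timedelta(days=90)  # assumed "defined period of time"
STRIKE_LIMIT = 5                    # assumed "too many strikes"

class AccountRecord:
    def __init__(self) -> None:
        self.strikes: list[datetime] = []
        self.disabled = False

    def record_strike(self, when: datetime) -> None:
        # Step three of the enforcement process: log the strike, then
        # check whether the rolling window has reached the limit.
        self.strikes.append(when)
        recent = [t for t in self.strikes if when - t <= STRIKE_WINDOW]
        if len(recent) >= STRIKE_LIMIT:
            self.disabled = True  # too many strikes within the window

# Example: five strikes within a few days disable the account.
acct = AccountRecord()
for day in range(5):
    acct.record_strike(datetime(2025, 3, 1) + timedelta(days=day))
assert acct.disabled
```

A rolling window like this means that old strikes eventually stop counting toward the limit, which fits the guidelines’ emphasis on warning and education before the most severe penalty.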
Notice and Appeals Processes
To help Snapchatters clearly understand why an enforcement action was taken against them, and to provide an opportunity to appeal, we have established Notice and Appeals processes that aim to safeguard the interests of our community while protecting Snapchatters’ rights.
We apply our Community Guidelines and Terms of Service when we evaluate whether to enforce penalties against an account, and we apply our Community Guidelines, Terms of Service, and Content Guidelines for Recommendation Eligibility when we moderate content that is broadcast or recommended. For information about how our appeals processes work, we have developed support articles on account appeals and content appeals. When Snapchat grants an appeal of an account lock, access to the Snapchatter’s account will be restored. Whether or not the appeal is successful, we notify the appealing party of our decision in a timely manner.
Please do not abuse Snap’s appeals mechanism by repeatedly submitting requests about your appeal. If you engage in this behavior, we reserve the right to deprioritize review of your requests. If you frequently submit unfounded appeals, we may, after sending you a warning, suspend review of your appeals (including related requests) for up to one year.