Snapchat Moderation, Enforcement, and Appeals
Community Guidelines Explainer Series
Updated: March 2026
Across Snapchat, we’re committed to advancing safety while respecting the privacy interests of our community. We take a balanced, risk-based approach to combating potential harms — combining transparent content moderation practices, consistent and equitable enforcement, and clear communication to hold ourselves accountable for applying our policies fairly.
Content Moderation
We’ve designed Snapchat with safety in mind, and this design is key in helping to prevent the spread of potentially harmful content. For example, Snapchat does not offer an open news feed where creators have an opportunity to broadcast potentially harmful or violating content, and friends lists are private.
In addition to these design safeguards, we use a combination of automated tools and human review to moderate our public content surfaces (such as Spotlight, Public Stories, and Maps). Content that is recommended on public surfaces is also held to a higher standard and must meet additional guidelines. On Spotlight, for example, where creators can submit creative and entertaining videos to share with the broader Snapchat community, all content is first reviewed automatically by artificial intelligence and other technology before receiving any distribution. Before a Snap is given the opportunity to be recommended for distribution to a large audience, it is reviewed by human moderators. This layered approach to moderating content on Spotlight reduces the risk that potentially harmful content will spread, and helps promote a fun and positive experience for everyone.
Similarly, editorial content that has been produced by media companies, such as Publisher Stories or Shows, is held to higher standards for safety and integrity. Additionally, we use proactive harm-detection technology on other public or high-visibility surfaces, such as Stories, to help identify potentially harmful content, and we use keyword filtering to help prevent such content (for instance, content from accounts attempting to advertise illicit drugs or other illegal items) from appearing in search results.
Reporting to Snapchat
Across all of our product surfaces, users can report accounts and content for potential violations of our policies. We make it easy for Snapchatters to submit a confidential report directly to our safety teams, who are trained to evaluate the report, take appropriate action according to our policies, and notify the reporting party of the outcome, typically within a matter of hours. (In some instances, if we are able to confidently make a high-precision moderation decision on a report using artificial intelligence technology, we may do that.)
If you see content that violates our Community Guidelines, please report it as soon as possible. When you report content in the app, we’re able to review and retain it, even if it would otherwise be deleted by default. For more information about reporting potentially harmful content or behavior, visit this resource on our Support Site. You can also learn more about our efforts to identify and take down violating content, and to promote safety and well-being on Snapchat, here. If you have a question or concern about the outcome of a report you’ve submitted, you can follow up via our Support Site.
When you submit a report, you are attesting that it is complete and accurate to the best of your knowledge. Please do not abuse Snap’s reporting systems, including by repeatedly sending duplicative or otherwise “spammy” reports. If you engage in this behavior, we reserve the right to deprioritize review of your reports. If you frequently submit unfounded reports against others’ content or accounts, we may, after sending you a warning, suspend review of your reports for up to one year and, in egregious situations, disable your account.
Please note that we have a dedicated reporting channel for principals and similar administrators of K-12 schools located in California and mental health professionals providing services to California-based minors, as required by California Business and Professions Code, Division 8, Chapter 22.2.8. Information about these reporting procedures can be found here.
Policy Enforcement at Snap
It’s important to us at Snap that our policies promote consistent and fair enforcement. We consider context, the severity of the harm, and the history of the account to determine the appropriate penalties for violations of our Community Guidelines.
Our approach is to promptly disable accounts that we determine have engaged in severe harms. Examples of severe harms include sexual exploitation or abuse of children, attempted distribution of illicit drugs, and promotion of violent extremist or terrorist activity.
Our policy is also to disable accounts created or used primarily to violate our Community Guidelines, even for less severe harms. For example, an account may be promptly disabled if it has a violating username or display name, or if it has posted multiple pieces of violating content.
For other violations of our Community Guidelines, Snap generally follows a three-part enforcement process:
Step one: the violating content is removed.
Step two: the Snapchatter receives a notification indicating that they have violated our Community Guidelines, that their content has been removed, and that repeated violations will result in additional enforcement actions, including their account being disabled.
Step three: our team records a “strike” against the Snapchatter’s account.
A strike creates a record of violations by a particular Snapchatter. Strikes are accompanied by a notice to the Snapchatter. If a Snapchatter accrues too many strikes over a defined period of time, their account will be disabled. In addition, when a Snapchatter accrues one or more strikes, we may limit access to certain features on Snapchat or limit the public distribution of their content. This strike system helps ensure that we apply the Community Guidelines consistently, and in a way that provides warning and education to users.
When we receive reports alleging that content or accounts violate local laws, we first assess the content against Community Guidelines. If no violation is found, we then assess the content under the laws applicable to the posting user’s jurisdiction. Where content is determined to be illegal under the posting user’s local law, we will remove the content or take appropriate enforcement action, consistent with Snap’s commitment to respecting human rights. We may also employ a strike system, similar to what is described above. In cases where content is legal in the posting user’s jurisdiction but illegal or restricted in the reporting user’s jurisdiction, we may take localized enforcement measures, such as reducing the visibility of the content or account in the affected jurisdiction, even if the content is not removed globally.
Notice and Appeals Processes
To help Snapchatters have a clear understanding of why an enforcement action was taken against them and provide an opportunity to appeal, we have established Notice and Appeals processes that aim to safeguard the interests of our community while protecting Snapchatters’ rights.
We apply our Community Guidelines and Terms of Service when we evaluate whether to enforce penalties against an account, and apply our Community Guidelines, Terms of Service, and Content Guidelines for Recommendation Eligibility to moderate content that is broadcast or recommended. For information about how our appeals processes work, we have developed support articles on account appeals and content appeals. Please note that all Snapchat appeals related to violations of the Community Guidelines are conducted by a human reviewer. When Snapchat grants an appeal of an account lock, access to the Snapchatter’s account will be restored. Whether or not the appeal is successful, we notify the appealing party of our decision in a timely manner.
Please do not abuse Snap’s appeals mechanism by repeatedly submitting requests about your appeal. If you engage in this behavior, we reserve the right to deprioritize review of your requests. If you frequently submit unfounded appeals, we may, after sending you a warning, suspend review of your appeals (including related requests) for up to one year.