Sexual Content
Community Guidelines Explainer Series
Updated: February 2025
Overview
We strive to protect Snapchatters from unsolicited sexual content or abuse. Our policies prohibit sexual exploitation of any kind – including the sexual exploitation of children. We also prohibit sexual harassment and the sharing, promotion, or distribution of sexually explicit content or conduct, including pornography, sexual nudity, or offers of sexual services.
What you should expect
We prohibit the following sexual harms:
Any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming for sexual purposes, sexual extortion (sextortion) or the sexualisation of children. Never post, save, send, forward, distribute or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself). We report any child sexual exploitation that we’ve identified, including attempts to engage in such conduct, to the appropriate authorities, including the US National Center for Missing and Exploited Children (NCMEC), in line with legal requirements.
Any communication or behaviour that attempts to persuade, trick or coerce a minor with the intent of sexual abuse or exploitation, or which leverages fear or shame to keep a minor silent.
All other forms of sexual exploitation, including sex trafficking, sextortion and deceptive sexual practices, such as efforts to coerce or entice users to provide nudes.
Producing, sharing or threatening to share non-consensual intimate imagery (NCII) – including sexual photos or videos taken or shared without permission, as well as “revenge porn” or behaviour that threatens to share, exploit or expose individuals’ intimate images or videos without their consent.
All forms of sexual harassment. This may include making unwanted advances, sharing graphic and unsolicited content, or sending obscene requests or sexual invitations to other users.
Promoting, distributing, or sharing pornographic content, including photos, videos or even highly realistic animation, drawings or other renderings of explicit sex acts, or nudity where the primary intention is sexual arousal.
Offers of sexual services, including both offline services (such as erotic massage) and online experiences (such as sexual chat or video services).
We do allow for non-sexual nudity in certain contexts, such as breastfeeding, medical procedures and other similar depictions.
Our efforts to protect users
We aim to consider both safety and privacy in our approach to protecting users. We use a combination of automated tools and human review designed to prevent users from being exposed to pornographic content or other sexual harms or exploitation on public surfaces (such as Spotlight, Public Stories, and Maps).
We use automated tools to help identify and remove certain known illegal child sexual exploitation photos and videos, including:
Microsoft’s PhotoDNA (to detect duplicates of known illegal images)
Google’s CSAI Match (to detect duplicates of known illegal videos)
Google’s Content Safety API (to aid in detecting novel, “never-before-hashed” imagery)
We use similar technology to help identify and remove certain non-consensual intimate imagery. We participate in the Take It Down program run by the National Center for Missing and Exploited Children (NCMEC), receiving and leveraging hashes of nude or partially nude imagery supplied by minors that they want to prevent from spreading online. We also participate in StopNCII, a similar program for people who are 18 or older, run by South West Grid for Learning (SWGfL) based in the UK. In addition, in some cases, we use behavioral “signals” to identify potentially illegal or harmful activity so that we can proactively remove bad actors and report them to authorities as appropriate.
When we become aware of sexually explicit or exploitative content, we act swiftly to remove the offending content, enforce against the violating account, and where appropriate, escalate to NCMEC and/or law enforcement. We work closely with NCMEC and law enforcement to maintain a feedback loop on the effectiveness of our policies.
Takeaway
Our goal is to foster a safe community where Snapchatters can express themselves, and we do not tolerate sexually explicit or exploitative content. If you ever feel uncomfortable, do not hesitate to reach out to a trusted person in your life, report violating content, and block any offending users.
Up Next: Threats, Violence, & Harm