
Our Work To Help Keep Snapchatters Safe

October 4, 2024

At Snap, we continually evolve our safety mechanisms and platform policies to limit the ability of bad actors to misuse our platform. We leverage advanced technology to detect and block activity that violates our rules, apply design principles that create friction in the friending process, support law enforcement, collaborate with government authorities, and work to raise awareness and provide education about the most serious harms impacting teens and, indeed, all members of our community.

We take our responsibility to protect teens very seriously. Our work is substantial, including the following: 

I. Making Snapchat a hostile environment for bad actors

Earlier this year, we announced new features to help further safeguard our community, specifically teenage users, and to reinforce the real-world relationships that make Snapchat unique. Those updates include expanded in-app warnings for suspicious contacts, enhanced friending protections built specifically for teens, and improved tools for blocking unwanted contact.

These changes, focused on combating online sextortion, build upon our ongoing investments to fight back against all forms of child sexual exploitation and abuse. For example: 

We use signals to identify sextortion behavior so that we can actively remove bad actors before they have the opportunity to target and victimize others. This is in addition to leveraging and deploying technology designed to prevent the spread of known child sexual exploitation and abuse imagery (CSEAI) on Snapchat, including PhotoDNA (to detect duplicates of known illegal images), CSAI Match (to detect duplicates of known illegal videos), and Content Safety API (to aid in detecting novel, “never-before-hashed” imagery).
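
To illustrate the general principle behind these hash-matching technologies, the sketch below shows how an upload can be reduced to a compact fingerprint and compared against a database of fingerprints of known illegal material, so matches can be blocked without the content being viewed. It is a minimal illustration only: PhotoDNA, CSAI Match, and Content Safety API are proprietary systems, and the names and the use of a simple cryptographic hash here are assumptions made for readability.

```python
import hashlib

# Hypothetical stand-in for a database of fingerprints of known illegal imagery.
# Real systems (e.g. PhotoDNA, CSAI Match) use proprietary *perceptual* hashes
# that survive resizing and re-encoding; the cryptographic hash here only
# catches byte-for-byte duplicates and is used purely to illustrate the flow.
KNOWN_BAD_HASHES: set[str] = set()

def fingerprint(media_bytes: bytes) -> str:
    """Reduce an uploaded file to a compact, comparable fingerprint."""
    return hashlib.sha256(media_bytes).hexdigest()

def is_known_match(media_bytes: bytes) -> bool:
    """True if the upload matches previously hashed known illegal material."""
    return fingerprint(media_bytes) in KNOWN_BAD_HASHES

# A matched upload would be blocked and escalated for review and reporting,
# without the content itself ever needing to be viewed by a person.
if is_known_match(b"...uploaded media bytes..."):
    print("match: block upload and escalate")
```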

While we have long offered simple in-app reporting of content and accounts that violate our rules, in 2023 we made improvements to bolster our fight against sextortion-related harms. Last year we launched in-app chat text reporting, which enables Snapchatters to report individual messages directly from the conversation itself. We also expanded our in-app reporting tools to include a specific, tailored reporting reason for sextortion and, on advice and guidance from the CSEA-fighting NGO Thorn, presented that reporting option in relatable language for teens and young people (“They leaked / are threatening to leak my nudes”). In turn, those reports inform our enforcement efforts, including signal-based detection. We analyze trends, patterns, and sextortionists’ techniques and, if an account exhibits certain characteristics, it is locked for suspected sextortion.
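
Snap does not publish the specific signals or thresholds behind this detection. As a purely hypothetical sketch of how report-informed, signal-based enforcement can work, the example below accumulates weighted risk signals on an account and locks it for review once a threshold is crossed; the signal names, weights, and threshold are illustrative assumptions, not Snap's actual criteria.

```python
from dataclasses import dataclass, field

# Illustrative signal names and weights: hypothetical, not Snap's criteria.
SIGNAL_WEIGHTS = {
    "sextortion_report": 5,        # report filed with the tailored sextortion reason
    "mass_friend_requests": 2,     # rapid friending of many unconnected accounts
    "threatening_chat_report": 3,  # chat message reported as threatening
}
LOCK_THRESHOLD = 8  # assumed cutoff for automatically locking pending review

@dataclass
class AccountRisk:
    account_id: str
    signals: list[str] = field(default_factory=list)

    def score(self) -> int:
        # Aggregate the weighted signals observed on the account.
        return sum(SIGNAL_WEIGHTS.get(s, 0) for s in self.signals)

    def should_lock(self) -> bool:
        # Accounts crossing the threshold are locked and routed to human review.
        return self.score() >= LOCK_THRESHOLD

acct = AccountRisk("acct_123", ["sextortion_report", "mass_friend_requests", "threatening_chat_report"])
print(acct.score(), acct.should_lock())  # -> 10 True
```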

We continue to improve and add to our Family Center suite of tools, released in 2022, where parents can see who their teen is friends with on Snapchat and who they’ve chatted with recently, and easily report accounts that may be of concern to them. Our goal for Family Center has always been to prompt open and constructive dialogue between parents/caregivers and teens about staying safe online.

We collaborate closely with law enforcement, and invest heavily in our safety and law enforcement operations teams that operate around the world 24/7 to help keep our community safe. For example, our Trust and Safety teams have more than doubled in size since 2020, and our Law Enforcement Operations team has more than tripled in that time. We host annual summits for law enforcement in the U.S. to ensure officers and agencies know how to take appropriate action against any illegal activity that may be taking place on our platform. 

We also are working directly with law enforcement in Nigeria – where many sextortion cases originate – to build capacity and knowledge to investigate and prosecute perpetrators, and we plan to continue our engagement with the Nigerian government to further collaborate in this area. And we have worked with the International Justice Mission, NCMEC, other members of industry, and NGOs to provide training to law enforcement on investigating CyberTips in countries outside the U.S. where sextortion activity is prevalent.

For many years, we have also engaged with a robust cadre of “trusted flaggers”: non-profits, NGOs, and select government agencies that escalate abuse cases, imminent threats to life, and other emergency issues to us via high-priority channels on behalf of Snapchatters in need. The majority of participants in our Trusted Flagger Program report content and accounts relating to sexual harms against minors, including sextortion.

II. Engaging industry experts and coalitions 

On top of our own investments, we also engage with experts across the globe, because we know that no one entity or organization alone can meaningfully advance these issues. For example, Snap represents industry on the international Policy Board of the WeProtect Global Alliance, is a member of INHOPE’s Advisory Council, and sits on the UK Internet Watch Foundation’s (IWF) Board of Trustees. All of these organizations have the eradication of online CSEA at the heart of their missions.

We remain active members of the Tech Coalition, a global industry alliance focused on combating online child sexual exploitation and abuse, and recently concluded a two-year term on its Board of Directors’ Executive Committee. We were also founding members of the Tech Coalition’s Lantern initiative, the first cross-platform signal-sharing program for companies to strengthen how they enforce their child safety policies. Through active participation in this program, companies can help each other cross-check for bad actors, including sextortionists.

Additionally, we leverage the National Center for Missing and Exploited Children’s (NCMEC) Take It Down database, which allows minors to generate a digital fingerprint, called a “hash,” of selected images or videos directly on their devices. Participating companies, including Snap, can then use those hashes to look for and remove duplicate imagery that violates our Community Guidelines. We participate in a similar program in the UK called Report Remove, and last year we joined SWGfL’s StopNCII collaboration to help prevent the spread of non-consensual intimate imagery (NCII) on Snapchat by leveraging that group’s hash database. StopNCII helps stop the spread of intimate imagery of people aged 18 and over, and offers victims the ability to reclaim their privacy over their most private photos and videos.
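
At a high level, the key property of these programs is that the hash is generated on the victim's own device, so the image itself never leaves it; only the fingerprint is shared with participating platforms for matching. The sketch below illustrates that flow with hypothetical function names; the actual Take It Down and StopNCII integrations and hash formats are not public, and their hashes are more robust than the simple cryptographic digest used here for illustration.

```python
import hashlib

def hash_on_device(image_bytes: bytes) -> str:
    # The image itself never leaves the device; only this fingerprint is submitted.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical shared hash list distributed to participating platforms.
submitted_hashes: set[str] = set()

def submit_hash(image_bytes: bytes) -> None:
    """Victim-side step: add the on-device fingerprint to the shared list."""
    submitted_hashes.add(hash_on_device(image_bytes))

def platform_should_remove(upload_bytes: bytes) -> bool:
    """Platform-side step: compare an upload against the shared hash list."""
    return hash_on_device(upload_bytes) in submitted_hashes
```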

This year we also launched our inaugural Snap Council for Digital Well-Being, a group of 18 teens from across the U.S. selected to participate in a year-long pilot program championing safer online habits and practices in their schools and communities. In July, this group, alongside at least one parent or chaperone, gathered at Snap headquarters in Santa Monica, California for discussions that yielded insights into topics like online pitfalls, social dynamics, and parental tools. This is in addition to Snap’s Safety Advisory Board, consisting of 16 professionals and three youth advocates, who provide direct guidance and direction to Snap on safety matters. We look forward to 2025, when we hope to create more opportunities for members of both groups to come together and unearth additional insights.

III. Driving awareness 

Beyond our internal investments and the work we do with experts and across industry, a critical component of fighting online sexual abuse and sextortion schemes is raising awareness among the public and Snapchatters.

In 2022, we launched the Digital Well-Being Index, industry-leading research offering insight into how teens and young adults are faring online across all platforms. As part of this research, for the past two years, we have done deep-dives into sextortion. Because this study covers experiences online generally, not just on Snapchat, it not only helps to inform our work, but we hope it offers insights to others across the tech ecosystem, as well. Later this month, we will share results of our Year Two sextortion deep-dive, in conjunction with the Technology Coalition’s upcoming virtual Multi-Stakeholder Forum on the financial sextortion of minors. 

Earlier this year, we were honored to be the first entity to support Know2Protect, a first-of-its-kind public awareness campaign launched by the U.S. Department of Homeland Security (DHS). The campaign educates and empowers young people, parents, trusted adults, and policymakers to help prevent and combat crimes such as financial sextortion. In addition to donating advertising space on Snapchat for K2P educational resources, we are conducting additional research with teens and young adults in the U.S. to further inform the campaign. We also co-launched an augmented reality Snapchat Lens to help educate Snapchatters through an interactive Know2Protect quiz. And in the UK, we supported Gurls Out Loud, the IWF’s most broad-reaching public awareness campaign on these issues to date, designed to inform 11-to-13-year-old girls about online sexual grooming, sexting, and sending nudes. Additionally, we recently launched an Educator’s Guide to Snapchat, partnering with Safe and Sound Schools to develop a comprehensive toolkit for educators that includes information and guidance on how to combat sextortion.

Based on our years of research, we know that awareness-raising and community education are powerful tools in helping to prevent online harms, and we continue to develop and promote in-app resources to reach teens and young people directly on Snapchat. In September 2023, we released four new in-app “Safety Snapshot” episodes focusing on sexual risks and harms, including financial sextortion. We also offer episodes on sexting and the consequences of creating and sharing nudes, child online grooming for sexual purposes, and child sex trafficking. All of these resources were reviewed by experts at the National Center for Missing and Exploited Children (NCMEC) and created in collaboration with relevant hotlines and helplines in key geographies.

We continue to fight back against online child sexual exploitation and abuse, but there is still more to do. In 2025, we will continue to raise awareness about potential online harms and continue to reiterate that anyone can be a potential target. We want to disrupt and thwart bad actors early and often, and we want to produce even more-actionable CyberTips for law enforcement.

It is important to note that we employ many of these same strategies as we fight back against other egregious harms, like illicit drug activity, including the sale of counterfeit pills, threats of violence, and suicide and self-harm content. We realize that our work in this space may never be done, but we care deeply about the safety of Snapchatters and will continue to work collaboratively across the industry, government, and law enforcement to exchange information and strengthen our defenses.
