Our Continued Efforts to Thwart Online Child Sexual Exploitation 


Snap is deeply committed to the safety of the Snapchat community. Our aim is to help protect users from a range of online risks and potential harms that span the digital ecosystem, including abhorrent crimes involving child sexual exploitation and abuse (CSEA). Snap has been fighting back against this illegal content and vile criminal behavior for years, employing both proactive-detection and reactive-response measures across the Snapchat app. Over the past year, we’ve made additional changes to our related policies and processes with the intent of helping to bring perpetrators to justice. We’d like to share more about that work here.

What is CSEA?

Child sexual exploitation and abuse covers a range of illegal activity involving sexual harm to minors, including: the viewing, production, and distribution of child sexual exploitation and abuse imagery (CSEAI), including the newer threat of AI-generated photos and videos; the online grooming, or enticement, of minors for sexual purposes; 1 sexual extortion; 2 child sex trafficking; and the live-streaming of children being sexually abused and tortured. 3 It’s difficult to comprehend that these repulsive realities actually exist. But, sadly, such abuse has always existed – and the digital world now provides an additional avenue for it, making offending faster and more far-reaching.

“Technology has fundamentally changed the way those with sexual interest in children fantasize about them, access them, and prey upon them,” said Ernie Allen, Chair of the London-based WeProtect Global Alliance. “The New York Times wrote, ‘The crisis of online child sexual exploitation is at a breaking point.’  It noted that reports are escalating, offenders are sharing images of younger victims and more heinous crimes, and law enforcement is struggling to keep up. If you add new variations to that challenge, like sextortion, which the FBI reported is up 7,200%, live-streaming abuse and others, it is clear this problem should be at the top of policy agendas worldwide,” he added.  

Tech companies’ legal obligations

When U.S. technology companies become aware of CSEA on their platforms or services, they have a legal duty to report it to the National Center for Missing & Exploited Children (NCMEC) via its CyberTipline. NCMEC then reviews these reports, known as CyberTips, and liaises with both domestic and international law enforcement to help safeguard child victims and bring offenders to justice.

Until last year, the annual totals of CyberTips to NCMEC had largely been rising year over year, topping out at a record 36.2 million reports 4 in 2023, up from just over 32 million in 2022 and from 4,560 in 1998, the year NCMEC’s CyberTipline was created. It was this record rise in CyberTips — in addition to other factors, including a continued spike in sexual extortion globally and feedback we received from law enforcement about the actionability of our own CyberTips — that prompted Snap to first approach NCMEC in 2024 to recalibrate our CyberTip reporting.

I’ve monitored CyberTip reporting over several years, and I firmly believe that the practice of some technology companies has trended toward reporting everything to NCMEC — even over and above their legal obligation — for fear of being accused of missing something. 5 Companies that fail to report as required by law may face stiff fines. But this approach to reporting, coupled with the low actionability of many reports (either because CyberTips lack sufficient context for law enforcement to take action or because the conduct reported is, in some instances, not a priority for them), has resulted in mounting CyberTips; multiple, extensive reviews; and increased frustration among law enforcement.

Goal: High-value CyberTips

The goal of our recalibration with NCMEC was to further increase the actionability and value of Snap’s CyberTips by refining our own platform policies and the protocols we have in place for reporting to NCMEC. That initial recalibration took place in May of 2024, and our policy changes became fully effective in the latter part of last year. 

While we should avoid placing too much emphasis on the raw number of reports in isolation, Snap is pleased to share that, since implementing our policy refinements, we’ve seen a significant drop in our overall reporting volume to NCMEC and a similar decline in escalations for online enticement – all while still fulfilling our legal obligations to report CSEA in CyberTips. The lower totals have yet to show up in NCMEC’s annual reporting volumes, however, because the changes took full effect later in 2024, leaving only a few full months of the year to reflect them.

In parallel, our team has been working to improve the data and labeling in our CyberTips. These improvements can help better identify the alleged crimes and individuals involved and, in turn, allow NCMEC and law enforcement to triage reports more quickly and accurately. Even when reporting to NCMEC may not be warranted, Snap will continue to address imagery and behaviors that contravene our platform policies by taking action – which can include removing the violating content, disabling the account, and seeking to block the creation of new accounts.

Recalibration checkpoint 

Last month, we held a second recalibration with NCMEC, where the Snap team recapped last year’s internal policy changes; updated NCMEC on our progress; confirmed some additional operating guidelines; and helped to set expectations for the months to come. We also previewed some further planned policy and reporting additions that we hope will continue to enhance the value of our reports to both NCMEC and law enforcement. Naturally, this is an iterative process, but we are encouraged by the progress over the past year.     

Looking ahead

Over time, we expect NCMEC and law enforcement to continue to see reduced volumes of CyberTips from Snap compared to previous years, while a greater share of the reports they do receive should contain higher-quality information and be more actionable.

We are committed to continuous improvement in all aspects of our safety work. We don’t pretend to have all the answers with respect to fighting CSEA or any other potential online harm; no one should, because no one does. Day in and day out, we are battling whole-of-society issues, and we bring a whole-of-company approach to bear – with the ultimate twin objectives of mitigating risk and reducing harm.

With respect to our CyberTips, we have a distinct goal for the not-too-distant future: Whether it be a state Internet Crimes Against Children (ICAC) Task Force commander, a federal investigator, or a member of law enforcement halfway around the world, we want state, federal, and international authorities to confidently pursue CyberTips from Snap because they know our reports can be consistently relied upon as actionable, valuable, and thorough. And our hope is that those efforts will help prompt a corresponding uptick in related arrests 6 and convictions the world over.

A huge “thank you” to NCMEC, WeProtect, the 61 ICAC Task Forces in the U.S., law enforcement around the world, and all of our colleagues and collaborators across the CSEA-fighting landscape – and an additional note that we remain open to, and hungry for, your feedback to further improve our strategies and approaches. We are all in this together. 

- Jacqueline Beauchere, Global Head of Platform Safety