Welcome to our European Union (EU) transparency page, where we publish EU-specific information required by the Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), the Dutch Media Act (DMA), and the Terrorist Content Online Regulation (TCO). Please note that the most up-to-date version of these Transparency Reports can be found in the en-US locale.
Snap Group Limited has appointed Snap B.V. as its Legal Representative for the purposes of the DSA. You can contact the representative at dsa-enquiries [at] snapchat.com for the DSA, at vsp-enquiries [at] snapchat.com for the AVMSD and the DMA, at tco-enquiries [at] snapchat.com for the TCO, through our Support Site, or at:
Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands
If you are a law enforcement agency, please follow the steps outlined here.
Please communicate in English or Dutch when contacting us.
For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For the AVMSD and the DMA, we are regulated by the Dutch Media Authority (CvdM). For the TCO, we are regulated by the Netherlands Authority for the Prevention of Online Terrorist Content and Child Sexual Abuse Material (ATKM).
Released: 30 January 2026
Last Updated: 30 January 2026
This report covers the period starting on 1 January 2025 and ending on 31 December 2025.
Background
The sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse (CSEA) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.
The technologies we use include (a) PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match to identify known illegal images and videos of CSEA, respectively; and (b) Google’s Content Safety API to identify novel, "never-before-hashed" CSEA imagery. We report CSEA imagery to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with law enforcement in the U.S. and other countries, as needed.
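As a rough illustration of the hash-matching pattern described above, the sketch below shows how media fingerprints can be compared against a set of known hashes. This is a minimal, hypothetical example only: PhotoDNA and CSAI Match are proprietary perceptual-matching technologies whose APIs are not shown here, and the names find_known_matches and compute_fingerprint are invented for illustration, not Snap's implementation.

```python
# Hypothetical sketch of matching media against a database of known hashes.
# Real systems (e.g. PhotoDNA) use proprietary perceptual hashing that is
# robust to resizing and re-encoding; compute_fingerprint() is a placeholder.

from typing import Callable, Iterable, Set


def find_known_matches(
    media_items: Iterable[bytes],
    known_hashes: Set[str],
    compute_fingerprint: Callable[[bytes], str],
) -> list[int]:
    """Return indices of items whose fingerprint appears in the known-hash set."""
    matches = []
    for index, item in enumerate(media_items):
        if compute_fingerprint(item) in known_hashes:
            # A match would be escalated for human review and, where required,
            # reported to NCMEC.
            matches.append(index)
    return matches
```

Matching against known hashes cannot surface novel, "never-before-hashed" imagery, which is why a separate classifier such as Google's Content Safety API is used for that case.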
Report
The table below includes data on proactive detection and the resulting enforcements against EU users for CSEA imagery during 2025. (Note: some enforcements resulting from proactive scanning in 2025 may still be subject to appeal at the time this report was compiled, and therefore would not be reflected in the appeals and reinstatements data below.)
EU Data | Totals
Total content identified as potential CSEA imagery through proactive scanning | 189,057
Total enforcements for violation of Snap’s policies against CSEA | 61,774
Total appealed | 12,246
Total appeals reviewed | 12,246
Total enforcements overturned following appeal | 1,012
Rate of enforcements overturned following appeal (Total enforcements overturned / Total appeals reviewed) | 8.26%
Detected CSEA retention period | 730 days
1. This category reports enforcements for violations of Snap’s Community Guidelines prohibiting child sexual exploitation. Proactively detected content that is enforced for other violations of Snap’s Community Guidelines is not reported here.
2. An enforcement can be overturned if we determine it was erroneous based on our policies in effect at the time of enforcement, or if we determine it was originally enforced correctly but the applicable policy had changed by the time the appeal was reviewed.
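For reference, the overturn rate reported in the table above is computed directly from the figures it contains:

\[
\text{Rate of enforcements overturned} = \frac{\text{Total enforcements overturned}}{\text{Total appeals reviewed}} = \frac{1{,}012}{12{,}246} \approx 0.0826 = 8.26\%
\]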
Content Moderation Safeguards
The safeguards applied for CSEA Media Scanning are set out in the EU DSA Transparency Report above.
These reports have been prepared to meet Snap’s obligations under Articles 34 and 35 of Regulation (EU) 2022/2065 and to provide the results of our annual assessments of the systemic risks arising from the design, functioning, and use of Snapchat’s online platform, together with the methodology used to assess those risks and the mitigation measures put in place to address them.
These reports have been prepared to meet Snap’s obligations under Article 37 of Regulation (EU) 2022/2065 and provide: (i) the results of the independent audit of Snap’s compliance with the obligations set out in Chapter III of Regulation (EU) 2022/2065, and (ii) the measures taken to implement the operational recommendations of that independent audit.
Snap is a provider of a “video-sharing platform service” (“VSP”) within the meaning of Article 1(1)(aa) of the AVMSD. This Code of Conduct (“Code”) has been prepared to describe how Snap complies with its obligations as a VSP under the Dutch Media Act (“DMA”) and Directive (EU) 2010/13 (as amended by Directive (EU) 2018/1808, the “Audiovisual Media Services Directive” or “AVMSD”). The Code is applicable throughout the European Union as well as the European Economic Area.
EU VSP Code of Conduct | Snapchat | March 2025 (PDF)
Bulgarian | Croatian | Czech | Danish | Dutch | Estonian | Finnish | French | German | Greek | Hungarian | Irish | Italian | Latvian | Lithuanian | Maltese | Polish | Portuguese | Romanian | Slovak | Slovenian | Spanish | Swedish