Released: 30 January 2026
Last Updated: 31 January 2026
This Transparency Report is published in accordance with Articles 7(2) and 7(3) of Regulation (EU) 2021/784 of the European Parliament and of the Council on addressing the dissemination of terrorist content online (the "Regulation"). It covers the reporting period starting on 1 January 2025 and ending on 31 December 2025.
General Information
Article 7(3)(a): information about the hosting service provider’s measures in relation to the identification and removal of or disabling of access to terrorist content
Article 7(3)(b): information about the hosting service provider’s measures to address the reappearance online of material which has previously been removed or to which access has been disabled because it was considered to be terrorist content, in particular where automated tools have been used
Snapchat experiences a very low incidence of terrorist content, and did not receive a removal order under the Regulation in 2025.
Terrorists, terrorist organizations, and violent extremists are prohibited from using Snapchat. Content that advocates, promotes, glorifies, or advances terrorism or other violent, criminal acts is prohibited under our Community Guidelines. Users are able to report content that violates our Community Guidelines via our in-app reporting menu and our Support Site. We also use proactive detection to attempt to identify violative content on public surfaces like Spotlight and Discover.
Irrespective of how we may become aware of violating content, our safety teams, through a combination of automation and human moderation, promptly review identified content and make enforcement decisions. Enforcements may include removing the content, warning or disabling the violating account, and, if warranted, reporting the account to law enforcement. To prevent the reappearance of terrorist or other violent extremist content on Snapchat, in addition to working with law enforcement, we take steps to block the device associated with the violating account and prevent the user from creating another Snapchat account.
Additional details regarding our measures for identifying and removing terrorist content can be found in our Explainer on Hateful Content, Terrorism, and Violent Extremism and our Explainer on Moderation, Enforcement, and Appeals.
Reports & Enforcements
Article 7(3)(c): the number of items of terrorist content removed or to which access has been disabled following removal orders or specific measures, and the number of removal orders where the content has not been removed or access to which has not been disabled pursuant to the first subparagraph of Article 3(7) and the first subparagraph of Article 3(8), together with the grounds therefor
During the reporting period, Snap did not receive any removal orders, nor were we required to implement any specific measures pursuant to Article 5 of the Regulation. Accordingly, we were not required to take enforcement action under the Regulation.
The following table describes enforcement actions taken based on user reports and proactive detection against content and accounts, both in the EU and elsewhere around the world, that violated our Community Guidelines relating to terrorism and violent extremism content.
Policy Reason                 | Total Content & Account Reports | Total Enforcements | Total Unique Accounts Enforced
Terrorism & Violent Extremism | 511,176                         | 32,948             | 21,895
Article 7(3)(d): the number and the outcome of complaints handled by the hosting service provider in accordance with Article 10
Article 7(3)(g): the number of cases in which the hosting service provider reinstated content or access thereto following a complaint by the content provider
As noted above, because no enforcement actions were required under the Regulation during the reporting period, we handled no complaints pursuant to Article 10 of the Regulation and had no associated reinstatements.
The following table contains information relating to appeals and reinstatements, both in the EU and elsewhere around the world, involving terrorist and violent extremist content enforced under our Community Guidelines.
Policy Reason                 | Total Appeals | Total Reinstatements | Total Decisions Upheld
Terrorism & Violent Extremism | 558           | 22                   | 536
Article 7(3)(e): the number and the outcome of administrative or judicial review proceedings brought by the hosting service provider
Article 7(3)(f): the number of cases in which the hosting service provider was required to reinstate content or access thereto as a result of administrative or judicial review proceedings
As noted above, because no enforcement actions were required under the Regulation during the reporting period, there were no associated administrative or judicial review proceedings, and we were not required to reinstate content as a result of any such proceedings.