European Union
July 1, 2023 – December 31, 2023


25 April, 2024

Welcome to our European Union (EU) transparency page, where we publish EU-specific information required by the Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), and the Dutch Media Act (DMA). Please note that the most up-to-date version of these Transparency Reports can be found in the en-US locale.

Legal Representative 

Snap Group Limited has appointed Snap B.V. as its Legal Representative for purposes of the DSA. You can contact the representative at dsa-enquiries [at] for DSA matters, at vsp-enquiries [at] for AVMSD and DMA matters, through our Support Site [here], or at:

Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands

If you are a law enforcement agency, please follow the steps outlined here.

Regulatory Authorities

For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For the AVMSD and the DMA, we are regulated by the Dutch Media Authority (CvdM).

DSA Transparency Report

Snap is required by Articles 15, 24, and 42 of the DSA to publish reports containing prescribed information regarding Snap’s content moderation for Snapchat’s services that are considered “online platforms,” i.e., Spotlight, For You, Public Profiles, Maps, Lenses, and Advertising. This report must be published every six months, beginning 25 October 2023.

Snap publishes transparency reports twice a year to provide insight into Snap’s safety efforts and the nature and volume of content reported on our platform. Our latest report, covering H2 2023 (July 1 – December 31), can be found here. Metrics specific to the Digital Services Act can be found on this page.

Average Monthly Active Recipients 
(DSA Articles 24.2 and 42.3)

As of 31 December 2023, we have 90.9 million average monthly active recipients (“AMAR”) of our Snapchat app in the EU. This means that, on average over the last 6 months, 90.9 million registered users in the EU have opened the Snapchat app at least once during a given month.

This figure breaks down by Member State as follows:

These figures were calculated to meet current DSA rules and should be relied on only for DSA purposes. We have changed how we calculate this figure over time, including in response to changes in internal policy, regulator guidance, and technology, and figures are not intended to be compared between periods. This may also differ from the calculations used for other active user figures that we publish for other purposes.

Member State Authority Requests
(DSA Article 15.1(a))

Takedown Requests 

During this period, we received 0 takedown requests from EU member states pursuant to DSA Article 9.

Information Requests 

During this period, we received the following information requests from EU member states pursuant to DSA Article 10:

The median turnaround time to inform authorities of receipt of Information Requests is 0 minutes — we provide an automated response confirming receipt. The median turnaround time to give effect to Information Requests is ~10 days. This metric reflects the time period from when Snap receives an IR to when Snap believes the request is fully resolved. In some cases, the length of this process depends in part on the speed with which law enforcement responds to any requests for clarification from Snap necessary to process their request.
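
For illustration only, here is a minimal Python sketch of how a median turnaround metric like the one described above could be computed; the IRRecord type and its field names are hypothetical and do not describe Snap’s actual systems.

```python
# Hypothetical sketch: computing a median turnaround time for Information
# Requests (IRs), measured from receipt to full resolution.
from dataclasses import dataclass
from datetime import datetime
from statistics import median

@dataclass
class IRRecord:                 # hypothetical record type
    received_at: datetime       # when the IR was received
    resolved_at: datetime       # when the IR was considered fully resolved

def median_turnaround_days(requests: list[IRRecord]) -> float:
    """Median elapsed time, in days, from receipt to full resolution."""
    durations = [
        (r.resolved_at - r.received_at).total_seconds() / 86400
        for r in requests
    ]
    return median(durations)

# Example: two requests resolved in 11 and 8 days have a median of 9.5 days.
requests = [
    IRRecord(datetime(2023, 7, 1), datetime(2023, 7, 12)),
    IRRecord(datetime(2023, 8, 1), datetime(2023, 8, 9)),
]
print(median_turnaround_days(requests))  # 9.5
```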

Content Moderation 

All content on Snapchat must adhere to our Community Guidelines and Terms of Service, as well as supporting terms, guidelines, and explainers. Proactive detection mechanisms and reports of illegal or violating content or accounts prompt a review. At that point, our tooling systems process the request, gather relevant metadata, and route the relevant content to our moderation team via a structured user interface designed to facilitate effective and efficient review operations. When our moderation teams determine, either through human review or automated means, that a user has violated our Terms, we may remove the offending content or account, terminate or limit the visibility of the relevant account, and/or notify law enforcement, as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer. Users whose accounts are locked by our safety team for Community Guidelines violations can submit a locked account appeal, and users can appeal certain content enforcements.
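
To illustrate the general shape of the review flow described above, the following Python sketch shows a report triggering metadata gathering and routing to a review queue. All names here (Report, gather_metadata, QUEUES, route) are hypothetical and do not describe Snap’s actual tooling.

```python
# Hypothetical sketch: a report is enriched with metadata and routed to a
# moderation queue via a structured pipeline.
from dataclasses import dataclass, field

@dataclass
class Report:
    content_id: str
    reason: str                       # e.g. "harassment", "illegal_content"
    metadata: dict = field(default_factory=dict)

QUEUES: dict[str, list[Report]] = {"high_priority": [], "standard": []}

def gather_metadata(report: Report) -> None:
    # In a real system, this would collect the context a reviewer needs.
    report.metadata["language"] = "en"

def route(report: Report) -> None:
    """Enrich a report and place it on the appropriate review queue."""
    gather_metadata(report)
    queue = "high_priority" if report.reason == "illegal_content" else "standard"
    QUEUES[queue].append(report)

route(Report(content_id="abc123", reason="illegal_content"))
```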

Content and Account Notices (DSA Article 15.1(b))

Snap has put in place mechanisms that allow users and non-users to notify Snap of content and accounts violating our Community Guidelines and Terms of Service on the platform, including content and accounts they consider illegal pursuant to DSA Article 16. These reporting mechanisms are available in the app itself (i.e., directly from the piece of content) and on our website.

During the relevant period, we received the following content and account notices in the EU:

In H2 2023, we handled 664,896 notices solely via automated means. All of these were enforced under our Community Guidelines, which encompass illegal content.

In addition to user-generated content and accounts, we moderate advertisements if they violate our platform policies. Below are the totals of ads that were reported and removed in the EU.

Trusted Flagger Notices (Article 15.1(b))

For the period of our latest Transparency Report (H2 2023), there were no formally appointed Trusted Flaggers under the DSA. As a result, the number of notices submitted by such Trusted Flaggers was zero (0) in this period.

Proactive Content Moderation (Article 15.1(c))

During the relevant period, Snap enforced against the following content and accounts in the EU after engaging in content moderation at its own initiative:

All of Snap’s own-initiative moderation efforts leveraged humans or automation. On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. With regard to automated tools, these include the following (an illustrative sketch follows this list):

  • Proactive detection of illegal and violating content using machine learning;

  • Hash-matching tools (such as PhotoDNA and Google's CSAI Match);

  • Abusive Language Detection to reject content based on an identified and regularly updated list of abusive keywords, including emojis.
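
For illustration only, the following Python sketch shows how hash matching against known illegal-image digests and keyword-based screening could be combined. It is not Snap’s implementation: real deployments use perceptual hashes such as PhotoDNA rather than the exact SHA-256 digests shown here, and the hash set and keyword list are hypothetical placeholders.

```python
# Hypothetical sketch: exact-hash matching as a stand-in for perceptual
# hash-matching tools, plus keyword-based abusive language detection.
import hashlib

KNOWN_ILLEGAL_HASHES = {"0123abcd..."}       # placeholder digests
ABUSIVE_KEYWORDS = {"example_slur", "🔞"}    # placeholder keyword list

def matches_known_hash(image_bytes: bytes) -> bool:
    """True if the image's digest appears on the known-illegal list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_ILLEGAL_HASHES

def contains_abusive_language(text: str) -> bool:
    """True if the text contains any listed keyword, including emojis."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in ABUSIVE_KEYWORDS)
```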

Appeals (Article 15.1(d))

During the relevant period, Snap processed the following content and account appeals in the EU via its internal complaint-handling systems:

*Stopping child sexual exploitation is a top priority. Snap devotes significant resources toward this and has zero tolerance for such conduct. Special training is required to review CSE appeals, and there is a limited team of agents who handle these reviews due to the graphic nature of the content. During the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements, and we have addressed these inconsistencies through agent re-training and rigorous quality assurance. We expect that the next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcements.

Automated Means of Content Moderation (Article 15.1(e))


  • Proactive detection of illegal and violating content using machine learning;

  • Hash-matching tools (such as PhotoDNA and Google’s CSAI Match);

  • Abusive Language Detection to reject content based on an identified and regularly updated list of abusive keywords, including emojis.

Across all harms, the automated moderation technologies had an accuracy rate of approximately 96.61% and an error rate of approximately 3.39%.
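
Since the error rate is the complement of the accuracy rate (96.61% + 3.39% = 100%), both can be derived from the same re-reviewed sample of automated decisions. A minimal Python sketch, with hypothetical inputs:

```python
# Hypothetical sketch: accuracy and error rate from a re-reviewed sample of
# automated decisions; the error rate is simply 1 - accuracy.
def accuracy_and_error(correct: int, total: int) -> tuple[float, float]:
    accuracy = correct / total
    return accuracy, 1.0 - accuracy

acc, err = accuracy_and_error(correct=9661, total=10000)
print(f"accuracy ≈ {acc:.2%}, error rate ≈ {err:.2%}")  # 96.61%, 3.39%
```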

Content Moderation Safeguards (Article 15.1(e))

We recognize that content moderation carries risks, including risks to freedom of expression and assembly that may arise from bias on the part of automated and human moderators and from abusive reports, including by governments, political constituencies, or individuals. Snapchat is generally not a place for political or activist content, particularly in our public spaces.

Nevertheless, to safeguard against these risks, Snap has testing and training in place and maintains robust, consistent procedures for handling reports of illegal or violating content, including reports from law enforcement and government authorities. We continually evaluate and evolve our content moderation algorithms. While potential harms to freedom of expression are difficult to detect, we are not aware of any significant issues, and we provide avenues for our users to report problems.

Our policies and systems promote consistent and fair enforcement and, as described above, give Snapchatters an opportunity to meaningfully dispute enforcement outcomes through notice and appeals processes that aim to safeguard the interests of our community while protecting individual Snapchatters’ rights.

We continually strive to improve our enforcement policies and processes, and we have made great strides in combating potentially harmful and illegal content and activities on Snapchat. This is reflected in the upward trend in our total reporting and enforcement figures shown in our latest Transparency Report and in declining prevalence rates for violations on Snapchat.

Out-of-Court Dispute Settlements (Article 24.1(a))

During the period of our latest Transparency Report (H2 2023), there were no formally appointed out-of-court dispute settlement bodies under the DSA. As a result, the number of disputes submitted to such bodies was zero in this period, and we are unable to provide outcomes, median turnaround times for settlements, or the share of disputes in which we implemented those bodies’ decisions.

Account Suspensions (Article 24.1(b))

In H2 2023, we did not impose any account suspensions pursuant to Article 23. Snap’s Trust & Safety team has procedures in place to limit the possibility of user accounts frequently submitting notices or complaints that are manifestly unfounded. These procedures include restricting the creation of duplicate reports and using email filters to prevent users who frequently submit manifestly unfounded reports from continuing to do so. Snap takes appropriate enforcement action against accounts as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer, and further information regarding Snap’s account-level enforcement can be found in our Transparency Report (H2 2023). Such measures will continue to be reviewed and refined.

Moderator Resources, Expertise, and Support (Article 42.2)

Our content moderation team operates across the globe, enabling us to help keep Snapchatters safe around the clock. Below is the breakdown of our human moderation resources by the language specialties of our moderators as of December 31, 2023 (note that some moderators specialize in multiple languages):

The above table includes all moderators who support EU member state languages as of December 31, 2023. In situations where we need additional language support, we use translation services.

Moderators are recruited using a standard job description that includes a language requirement (depending on the need). The language requirement states that the candidate should be able to demonstrate written and spoken fluency in the language and have at least one year of work experience for entry-level positions. Candidates must meet the educational and background requirements in order to be considered, and must also demonstrate an understanding of current events for the country or region of content moderation they will support.

Our moderation team applies our policies and enforcement measures to help protect our Snapchat community. Training is conducted over a multi-week period, in which new team members are educated on Snap’s policies, tools, and escalation procedures. After the training, each moderator must pass a certification exam before being permitted to review content. Our moderation team regularly participates in refresher training relevant to their workflows, particularly when we encounter policy-borderline and context-dependent cases. We also run upskilling programs, certification sessions, and quizzes to ensure all moderators are current and in compliance with all updated policies. Finally, when urgent content trends surface based on current events, we quickly disseminate policy clarifications so teams are able to respond according to Snap’s policies.

We provide our content moderation team – Snap’s “digital first responders” – with significant support and resources, including on-the-job wellness support and easy access to mental health services. 

Child Sexual Exploitation and Abuse (CSEA) Media Scanning Report


The sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse (CSEA) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.

We use PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match to identify known illegal images and videos of child sexual abuse, respectively, and report them to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with domestic or international law enforcement, as required.


The data below are based on the results of proactive scanning, using PhotoDNA and/or CSAI Match, of media uploaded from a user’s camera roll to Snapchat.
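
The following Python sketch illustrates only the high-level flow described above; the three helper functions are hypothetical placeholders, not the real PhotoDNA, CSAI Match, or NCMEC reporting interfaces.

```python
# Hypothetical sketch: scan uploaded media against known-CSAM hash lists and
# report confirmed matches to NCMEC.
def photodna_match(media: bytes) -> bool:
    """Placeholder: perceptual-hash match against known illegal images."""
    return False

def csai_match(media: bytes) -> bool:
    """Placeholder: fingerprint match against known illegal videos."""
    return False

def file_ncmec_report(media: bytes) -> None:
    """Placeholder: file a CyberTipline report with NCMEC, as required by law."""

def scan_upload(media: bytes) -> None:
    """Scan media uploaded from a camera roll and report confirmed matches."""
    if photodna_match(media) or csai_match(media):
        file_ncmec_report(media)
```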

Stopping child sexual exploitation is a top priority. Snap devotes significant resources toward this and has zero tolerance for such conduct. Special training is required to review CSE appeals, and there is a limited team of agents who handle these reviews due to the graphic nature of the content. During the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements, and we have addressed these inconsistencies through agent re-training and rigorous quality assurance. We expect that the next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcements.

Content Moderation Safeguards

The safeguards applied for CSEA Media Scanning are set out in the above “Content Moderation Safeguards” section under our DSA Report.