European Union
July 1, 2023 – December 31, 2023

Published: April 25, 2024

Last Updated: April 25, 2024

Welcome to our European Union (EU) transparency page, where we publish EU-specific information required by the Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD) and the Dutch Media Act (DMA). Please note that the most up-to-date version of these Transparency Reports can be found in the en-US locale.

Legal Representative 

Snap Group Limited has appointed Snap B.V. as its Legal Representative for purposes of the DSA. You can contact the representative at dsa-enquiries [at] snapchat.com for the DSA, at vsp-enquiries [at] snapchat.com for AVMSD and DMA, through our Support Site [here], or at:

Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands

If you are a law enforcement agency, please follow the steps outlined here.

Please communicate in Dutch or English when contacting us.

Regulatory Authorities

For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For the AVMSD and the DMA, we are regulated by the Dutch Media Authority (CvdM).

DSA Transparency Report

Snap is required by Articles 15, 24 and 42 of the DSA to publish reports containing prescribed information regarding Snap’s content moderation for Snapchat’s services that are considered “online platforms,” i.e., Spotlight, For You, Public Profiles, Maps, Lenses and Advertising. This report must be published every 6 months, from 25 October 2023.

Snap publishes transparency reports twice a year to provide insight into Snap’s safety efforts and the nature and volume of content reported on our platform. Our latest report for H2 2023 (July 1 - December 31) can be found here. Metrics specific to the Digital Services Act can be found on this page.

Average Monthly Active Recipients 
(DSA Articles 24.2 and 42.3)

As of 31 December 2023, we have 90.9 million average monthly active recipients (“AMAR”) of our Snapchat app in the EU. This means that, on average over the last 6 months, 90.9 million registered users in the EU have opened the Snapchat app at least once during a given month.

This figure breaks down by Member State as follows:

These figures were calculated to meet current DSA rules and should only be relied on for DSA purposes. We have changed how we calculate this figure over time, including in response to changing internal policy, regulator guidance and technology, and figures are not intended to be compared between periods. This may also differ from the calculations used for other active user figures we publish for other purposes.


Member States Authority Requests
(DSA Article 15.1(a))

Takedown Requests 

During this period, we have received 0 takedown requests from EU member states pursuant to DSA Article 9. 

Information Requests 

During this period, we have received the following information requests from EU member states pursuant to DSA Article 10:

The median turnaround time to inform authorities of receipt of Information Requests is 0 minutes — we provide an automated response confirming receipt. The median turnaround time to give effect to Information Requests is ~10 days. This metric reflects the time period from when Snap receives an IR to when Snap believes the request is fully resolved. In some cases, the length of this process depends in part on the speed with which law enforcement responds to any requests for clarification from Snap necessary to process their request.
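
As a rough illustration of how a median turnaround metric of this kind can be computed (the timestamps and structure below are hypothetical examples, not Snap's internal data or systems), one takes the elapsed time between receipt and resolution of each request and reports the median:

```python
from datetime import datetime
from statistics import median

# Hypothetical (received, resolved) timestamps for Information Requests.
# These values are illustrative only; they are not data from this report.
requests = [
    (datetime(2023, 7, 3, 9, 0), datetime(2023, 7, 12, 15, 30)),
    (datetime(2023, 8, 14, 11, 15), datetime(2023, 8, 25, 10, 0)),
    (datetime(2023, 10, 2, 8, 45), datetime(2023, 10, 11, 17, 20)),
]

# Turnaround is measured from receipt of the request to full resolution, in days.
turnaround_days = [
    (resolved - received).total_seconds() / 86400 for received, resolved in requests
]

print(f"Median turnaround: {median(turnaround_days):.1f} days")
```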

Content Moderation 


All content on Snapchat must adhere to our Community Guidelines and Terms of Service, as well as supporting terms, guidelines and explainers. Proactive detection mechanisms and reports of illegal or violating content or accounts prompt a review, at which point our tooling systems process the request, gather relevant metadata, and route the relevant content to our moderation team via a structured user interface that is designed to facilitate effective and efficient review operations. When our moderation teams determine, either through human review or automated means, that a user has violated our Terms, we may remove the offending content or account, terminate or limit the visibility of the relevant account, and/or notify law enforcement as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer. Users whose accounts are locked by our safety team for Community Guidelines violations can submit a locked account appeal, and users can appeal certain content enforcements.
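
As a rough, hypothetical sketch of the kind of workflow described above (the data structures, queue names, and reasons are illustrative assumptions, not Snap's actual tooling), a report or proactive detection could be enriched with metadata and routed to a review queue like this:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ReviewTask:
    """A hypothetical unit of review work routed to moderators."""
    content_id: str
    source: str              # "user_report" or "proactive_detection"
    reason: str
    metadata: dict = field(default_factory=dict)
    queue: str = "general_review"


def route_report(content_id: str, source: str, reason: str) -> ReviewTask:
    # Gather relevant metadata before handing the task to the moderation team.
    metadata = {"received_at": datetime.now(timezone.utc).isoformat()}
    task = ReviewTask(content_id, source, reason, metadata)
    # Route higher-risk categories to a specialist queue (illustrative rule).
    if reason in {"child_sexual_exploitation", "terrorism"}:
        task.queue = "specialist_review"
    return task


print(route_report("content-123", "user_report", "harassment").queue)  # general_review
```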

Content and Account Notices (DSA Article 15.1(b))

Snap has put into place mechanisms to allow users and non-users to notify Snap of content and accounts violating our Community Guidelines and Terms of Service on the platform, including those they consider illegal pursuant to DSA Article 16.  These reporting mechanisms are available in the app itself (i.e. directly from the piece of content) and on our website.

During the relevant period, we received the following content and account notices in the EU:

In H2’23, we handled 664,896 notices solely via automated means. All of these were enforced against our Community Guidelines because our Community Guidelines encapsulate illegal content. 

In addition to user-generated content and accounts, we moderate advertisements if they violate our platform policies. Below are the total ads that were reported and removed in the EU. 

Trusted Flaggers Notices (Article 15.1(b))

For the period of our latest Transparency Report (H2 2023), there were no formally appointed Trusted Flaggers under the DSA. As a result, the number of notices submitted by such Trusted Flaggers was zero (0) in this period.

Proactive Content Moderation (Article 15.1(c))

During the relevant period, Snap enforced the following content and accounts in the EU after engaging content moderation at its own initiative:

All of Snap’s own-initiative moderation efforts leveraged humans or automation. On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. With regards to automated tools, these include the following (a brief illustrative sketch of the last mechanism follows this list):

  • Proactive detection of illegal and violating content using machine learning;

  • Hash-matching tools (such as PhotoDNA and Google's CSAI Match);

  • Abusive Language Detection to reject content based on an identified and regularly updated list of abusive key words, including emojis.
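
As a minimal sketch of the last of these mechanisms (the terms, function names, and matching logic below are assumptions made for illustration, not Snap's actual detection system), keyword-based abusive language detection can be thought of as checking content against a regularly updated blocklist that may include emojis:

```python
# Hypothetical blocklist; the real list is identified and regularly updated,
# and these placeholder entries are not Snap's actual terms.
ABUSIVE_TERMS = {"examplebadword", "anotherbadword", "🔞"}


def violates_blocklist(text: str) -> bool:
    lowered = text.lower()
    tokens = set(lowered.split())
    # Match whole tokens, and also scan the raw text so emojis attached to
    # words are still detected.
    return bool(tokens & ABUSIVE_TERMS) or any(term in lowered for term in ABUSIVE_TERMS)


print(violates_blocklist("this contains examplebadword"))  # True
print(violates_blocklist("a harmless message"))            # False
```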


Appeals (Article 15.1(d))

During the relevant period, Snap processed the following content and account appeals in the EU via its internal complaint-handling systems:


* Stopping child sexual exploitation is a top priority. Snap devotes significant resources toward this and has zero tolerance for such conduct. Special training is required to review CSE appeals, and there is a limited team of agents who handle these reviews due to the graphic nature of the content. During the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements, and we have addressed these inconsistencies through agent re-training and rigorous quality assurance. We expect that the next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcements.

Automated Means for Content Moderation (Article 15.1(e))

On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. With regards to automated tools, these include:

  • Proactive detection of illegal and violating content using machine learning;

  • Hash-matching tools (such as PhotoDNA and Google's CSAI Match);

  • Abusive Language Detection to reject content based on an identified and regularly updated list of abusive key words, including emojis.


The accuracy of the automated moderation technologies across all harms was approximately 96.61%, and the error rate was approximately 3.39%.


Content Moderation Safeguards (Article 15.1(e))

We recognize there are risks associated with content moderation, including risks to freedoms of expression and assembly that may be caused by automated and human moderator bias and by abusive reports, including from governments, political constituencies, or well-organized individuals. Snapchat is generally not a place for political or activist content, particularly in our public spaces.


Nevertheless, to safeguard against these risks, Snap has testing and training in place and has robust, consistent procedures for handling reports of illegal or violating content, including from law enforcement and government authorities. We continually evaluate and evolve our content moderation algorithms. While potential harms to freedom of expression are difficult to detect, we are not aware of any significant issues, and we provide avenues for our users to report mistakes if they occur.


Our policies and systems promote consistent and fair enforcement and, as described above, provide Snapchat users an opportunity to meaningfully dispute enforcement outcomes through notice and appeals processes that aim to safeguard the interests of our community while protecting individual Snapchat users' rights.

We continually strive to improve our enforcement policies and processes, and we have made great strides in combating potentially harmful and illegal content and activity on Snapchat. This is reflected in an upward trend in our reporting and enforcement figures shown in our latest Transparency Report and in decreasing prevalence rates for violations on Snapchat overall.


Out-of-Court Settlements (Article 24.1(a))

For the period of our latest Transparency Report (H2 2023), there were no formally appointed out-of-court dispute settlement bodies under the DSA. As a result, the number of disputes submitted to such bodies was zero (0) in this period, and we are unable to provide outcomes, median turnaround times for settlements, or the share of disputes in which we implemented the body's decisions.



Account Suspensions (Article 24.1(b))

In H2 2023, we did not impose any account suspensions pursuant to Article 23. Snap's Trust & Safety team has procedures in place to limit the possibility of user accounts frequently submitting notices or complaints that are manifestly unfounded. These procedures include restricting duplicative report creation and using email filters to prevent users who have frequently submitted manifestly unfounded reports from continuing to do so. Snap takes appropriate enforcement action against accounts as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer, and information regarding the level of Snap's account enforcement can be found in our Transparency Report (H2 2023). Such measures will continue to be reviewed and iterated upon.


Moderator Resources, Expertise, and Support (Article 42.2)

Our content moderation team operates across the globe, allowing us to help keep Snapchat users safe 24/7. Below, you will find the breakdown of our human moderation resources by the language specialties of our moderators (note that some moderators specialize in multiple languages) as of December 31, 2023.

The above table includes all moderators who support EU member state languages as of December 31, 2023. In situations where we need additional language support, we use translation services.

Moderators are recruited using a standard job description that includes a language requirement (depending on the need). The language requirement states that the candidate should be able to demonstrate written and spoken fluency in the language and have at least one year of work experience for entry-level positions. Candidates must meet the educational and background requirements in order to be considered. Candidates also must demonstrate an understanding of current events for the country or region of content moderation they will support.

Our moderation team applies our policies and enforcement measures to help protect our Snapchat community. Training is conducted over a multi-week period, in which new team members are educated on Snap’s policies, tools, and escalations procedures. After the training, each moderator must pass a certification exam before being permitted to review content. Our moderation team regularly participates in refresher training relevant to their workflows, particularly when we encounter policy-borderline and context-dependent cases. We also run upskilling programs, certification sessions, and quizzes to ensure all moderators are current and in compliance with all updated policies. Finally, when urgent content trends surface based on current events, we quickly disseminate policy clarifications so teams are able to respond according to Snap’s policies.

We provide our content moderation team – Snap’s “digital first responders” – with significant support and resources, including on-the-job wellness support and easy access to mental health services. 

Child Sexual Exploitation and Abuse (CSEA) Media Scanning Report


Background

The sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse (CSEA) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.


We use PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match to identify known illegal images and videos of child sexual abuse, respectively, and report them to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC then, in turn, coordinates with domestic or international law enforcement, as required.
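
Conceptually (this is a simplified sketch: it uses an ordinary cryptographic hash, whereas PhotoDNA and CSAI Match use robust, perceptual hashing so that altered copies of known material still match), hash matching compares a fingerprint of uploaded media against a database of fingerprints of known illegal content:

```python
import hashlib

# Hypothetical set of fingerprints of known illegal media. The value below is
# simply the SHA-256 of b"test" so the example is self-contained and runnable.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


def matches_known_material(data: bytes) -> bool:
    # A match would trigger review and, where required by law, a report to NCMEC.
    return fingerprint(data) in KNOWN_HASHES


print(matches_known_material(b"test"))   # True
print(matches_known_material(b"other"))  # False
```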


Report

The below data is based on the results of proactive scanning, using PhotoDNA and/or CSAI Match, of media uploaded from a user's camera roll to Snapchat.

Stopping child sexual exploitation is a top priority. Snap devotes significant resources toward this and has zero tolerance for such conduct.  Special training is required to review CSE appeals, and there is a limited team of agents who handle these reviews due to the graphic nature of the content.  During the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements, and we have addressed these inconsistencies through agent re-training and rigorous quality assurance.  We expect that the next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcements.  

Content Moderation Safeguards

The safeguards applied for CSEA Media Scanning are set out in the above “Content Moderation Safeguards” section under our DSA Report.


European Union Terrorist Content Online Transparency Report

Published: June 17, 2024

Last Updated: June 17, 2024

This Transparency Report is published in accordance with Articles 7(2) and 7(3) of Regulation (EU) 2021/784 of the European Parliament and of the Council, addressing the dissemination of terrorist content online (the Regulation). It covers the reporting period of January 1 - December 31, 2023.


General Information
  • Article 7(3)(a): information about the hosting service provider’s measures in relation to the identification and removal of or disabling of access to terrorist content

  • Article 7(3)(b): information about the hosting service provider’s measures to address the reappearance online of material which has previously been removed or to which access has been disabled because it was considered to be terrorist content, in particular where automated tools have been used


Terrorists, terrorist organizations, and violent extremists are prohibited from using Snapchat. Content that advocates, promotes, glorifies, or advances terrorism or other violent, criminal acts is prohibited under our Community Guidelines. Users are able to report content that violates our Community Guidelines via our in-app reporting menu and our Support Site. We also use proactive detection to attempt to identify violative content on public surfaces like Spotlight and Discover.


Regardless as to how we may become aware of violating content, our Trust & Safety teams, through a combination of automation and human moderation, promptly review identified content and make enforcement decisions. Enforcements may include removing the content, warning or locking the violating account, and, if warranted, reporting the account to law enforcement. To prevent the reappearance of terrorist or other violent extremist content on Snapchat, in addition to working with law enforcement, we take steps to block the device associated with the violating account and prevent the user from creating another Snapchat account. 


Additional details regarding our measures for identifying and removing terrorist content can be found in our Explainer on Hateful Content, Terrorism, and Violent Extremism and our Explainer on Moderation, Enforcement, and Appeals.



Reports & Enforcements 
  • Article 7(3)(c): the number of items of terrorist content removed or to which access has been disabled following removal orders or specific measures, and the number of removal orders where the content has not been removed or access to which has not been disabled pursuant to the first subparagraph of Article 3(7) and the first subparagraph of Article 3(8), together with the grounds therefor


During the reporting period, Snap did not receive any removal orders, nor were we required to implement any specific measures pursuant to Article 5 of the Regulation. Accordingly, we were not required to take enforcement action under the Regulation.


The following table describes enforcement actions taken, based on user reports and proactive detection, against content and accounts, both in the EU and elsewhere around the world, that violated our Community Guidelines relating to terrorism and violent extremism.

Enforcement Appeals
  • Article 7(3)(d): the number and the outcome of complaints handled by the hosting service provider in accordance with Article 10

  • Article 7(3)(g): the number of cases in which the hosting service provider reinstated content or access thereto following a complaint by the content provider


Because we had no enforcement actions required under the Regulation during the reporting period as noted above, we handled no complaints pursuant to Article 10 of the Regulation and had no associated reinstatements.


The following table contains information relating to appeals and reinstatements, both in the EU and elsewhere around the world, involving terrorist and violent extremist content enforced under our Community Guidelines.

Judicial Proceedings & Appeals
  • Article 7(3)(e): the number and the outcome of administrative or judicial review proceedings brought by the hosting service provider

  • Article 7(3)(f): the number of cases in which the hosting service provider was required to reinstate content or access thereto as a result of administrative or judicial review proceedings


As we had no enforcement actions required under the Regulation during the reporting period, as noted above, we had no associated administrative or judicial review proceedings, and we were not required to reinstate content as a result of any such proceedings.