April 25, 2024
Welcome to our European Union (EU) transparency page, where we publish EU-specific information required by the Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), the Dutch Media Act (DMA), and the Terrorist Content Online Regulation (TCO). The most current version of these Transparency Reports can be found in the en-US locale.
Snap Group Limited has appointed Snap B.V. as its Legal Representative for the purposes of the DSA. You can contact the representative at dsa-enquiries [at] snapchat.com for the DSA, at vsp-enquiries [at] snapchat.com for AVMSD and the DMA, at tco-enquiries [at] snapchat.com for the TCO, through our Support Site [here], or at:
Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands
If you are a law enforcement agency, please follow the steps outlined here.
Please communicate in English or Dutch when contacting us.
For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For AVMSD and the DMA, we are regulated by the Dutch Media Authority (CvdM). For the TCO, we are regulated by the Netherlands Authority for the Prevention of Online Terrorist Content and Child Sexual Abuse Material (ATKM).
Snap is required by Articles 15, 24, and 42 of the DSA to publish reports containing prescribed information regarding Snap's content moderation for Snapchat's services that are considered "online platforms" (i.e., Spotlight, For You, Public Profiles, Maps, Lenses, and Advertising). This report must be published every 6 months, beginning 25 October 2023.
Snap publishes transparency reports twice a year to provide insight into Snap's safety efforts and the nature and volume of content reported on our platform. Our latest report, covering H2 2023 (1 July – 31 December), can be found here (see the bottom of the page for an update, as of 1 August 2024, to the monthly average active recipients figures). Metrics specific to the Digital Services Act can be found on this page.
As of 31 December 2023, the Snapchat app had 90.9 million average monthly active recipients ("AMAR") in the EU. This means that, on average over the last 6 months, 90.9 million registered users in the EU opened the Snapchat app at least once during a given month.
This figure breaks down by Member State as follows:
These figures were calculated to meet current DSA rules and should only be relied on for DSA purposes. We have changed how we calculate this figure over time, including in response to changing internal policy, regulator guidance and technology, and figures are not intended to be compared between periods. This may also differ from the calculations used for other active user figures we publish for other purposes.
Takedown Requests
During this period, we have received 0 takedown requests from EU member states pursuant to DSA Article 9.
Information Requests
During this period, we have received the following information requests from EU member states:
The median turnaround time to inform authorities of receipt of Information Requests is 0 minutes — we provide an automated response confirming receipt. The median turnaround time to give effect to Information Requests is ~10 days. This metric reflects the time period from when Snap receives an IR to when Snap believes the request is fully resolved. In some cases, the length of this process depends in part on the speed with which law enforcement responds to any requests for clarification from Snap necessary to process their request.
All content on Snapchat must adhere to our Community Guidelines and Terms of Service, as well as supporting terms, guidelines and explainers. Proactive detection mechanisms and reports of illegal or violating content or accounts prompt a review, at which point, our tooling systems process the request, gather relevant metadata, and route the relevant content to our moderation team via a structured user interface that is designed to facilitate effective and efficient review operations. When our moderation teams determine, either through human review or automated means, that a user has violated our Terms, we may remove the offending content or account, terminate or limit the visibility of the relevant account, and/or notify law enforcement as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer. Users whose accounts are locked by our safety team for Community Guidelines violations can submit a locked account appeal, and users can appeal certain content enforcements.
Content and Account Notices (DSA Article 15.1(b))
Snap has put into place mechanisms to allow users and non-users to notify Snap of content and accounts violating our Community Guidelines and Terms of Service on the platform, including those they consider illegal pursuant to DSA Article 16. These reporting mechanisms are available in the app itself (i.e. directly from the piece of content) and on our website.
During the relevant period, we received the following content and account notices in the EU:
In H2’23, we handled 664,896 notices solely via automated means. All of these were enforced against our Community Guidelines because our Community Guidelines encapsulate illegal content.
In addition to user-generated content and accounts, we moderate advertisements if they violate our platform policies. Below are the total ads that were reported and removed in the EU.
Trusted Flaggers Notices (Article 15.1(b))
For the period of our latest Transparency Report (H2 2023), there were no formally appointed Trusted Flaggers under the DSA. As a result, the number of notices submitted by such Trusted Flaggers was zero (0) in this period.
Proactive Content Moderation (Article 15.1(c))
During the relevant period, Snap enforced the following content and accounts in the EU after engaging in content moderation at its own initiative:
All of Snap’s own-initiative moderation efforts leveraged humans or automation. On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. With regards to automated tools, these include:
Proactive detection of illegal and violating content using machine learning;
Hash-matching tools (such as PhotoDNA and Google's CSAI Match);
Abusive Language Detection to reject content based on an identified and regularly updated list of abusive key words, including emojis.
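The keyword-based rejection described above can be sketched roughly as follows. This is a minimal illustration only: the block list, tokenization, and emoji handling shown here are hypothetical assumptions, not Snap's production system.

```python
# Hypothetical sketch of keyword-based abusive-language screening.
# The block list and matching rules are illustrative assumptions,
# not Snap's actual moderation implementation.
import re

# A regularly updated list of disallowed terms, including emojis (hypothetical).
BLOCKLIST = {"badword", "worseword", "🤬"}

def contains_abusive_language(text: str) -> bool:
    """Return True if any blocklisted word or emoji appears in the text."""
    # Case-insensitive whole-word matching for textual terms.
    tokens = re.findall(r"\w+", text.lower())
    if any(tok in BLOCKLIST for tok in tokens):
        return True
    # Emojis are not \w characters, so scan the raw text for them directly.
    return any(sym in text for sym in BLOCKLIST if not sym.isalnum())

print(contains_abusive_language("this contains a BadWord"))  # True
print(contains_abusive_language("perfectly fine message"))   # False
```

A production system would also handle obfuscation (leetspeak, spacing tricks) and locale-specific lists, which this sketch omits.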
Appeals (Article 15.1(d))
During the relevant period, Snap processed the following content and account appeals in the EU via its internal complaint-handling systems: