April 25, 2024
Welcome to our European Union (EU) transparency page, where we publish EU-specific information required by the Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), and the Dutch Media Act (DMA). Please note that the most up-to-date versions of these transparency reports are available in the en-US locale.
Snap Group Limited has appointed Snap B.V. as its DSA Legal Representative. You can contact the representative at dsa-enquiries@snapchat.com for the DSA, at vsp-enquiries@snapchat.com for the AVMSD and the DMA, through our Support Site [here], or at:
Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands
If you are a law enforcement agency, please follow the steps outlined here. When contacting us, please communicate in Dutch or English.
For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For the AVMSD and the DMA, we are regulated by the Dutch Media Authority (CvdM).
Articles 15, 24, and 42 of the DSA require Snap to publish reports containing prescribed information about Snap's content moderation for the Snapchat services considered "online platforms," i.e., Spotlight, For You, Public Profiles, Maps, Lenses, and Advertising. This report must be published every six months, beginning October 25, 2023.
Snap publishes transparency reports twice a year to provide insight into Snap's safety efforts and into the nature and volume of content reported on our platform. Our latest report, covering H2 2023 (July 1 to December 31), is available here. Metrics specific to the Digital Services Act can be found on this page.
As of December 31, 2023, we had an average of 90.9 million monthly active recipients of our Snapchat app in the EU. This means that, on average over the past six months, 90.9 million registered users in the EU opened the Snapchat app at least once during a given month (a simplified sketch of this calculation follows the notes below).
This figure breaks down by Member State as follows:
These figures were calculated to meet current DSA rules and should only be relied on for DSA purposes. We have changed how we calculate this figure over time, including in response to changing internal policy, regulator guidance and technology, and figures are not intended to be compared between periods. This may also differ from the calculations used for other active user figures we publish for other purposes.
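For illustration only, the calculation described above amounts to averaging, over the six-month window, the monthly counts of distinct users who opened the app at least once. Here is a minimal sketch in Python, assuming a hypothetical log of (user_id, month) app-open events; the names and data are illustrative, and this is not Snap's actual methodology, which, as noted above, has changed over time:

```python
from collections import defaultdict

def average_monthly_active_recipients(open_events, months):
    # Count, per month, the distinct users who opened the app at least once.
    active = defaultdict(set)
    for user_id, month in open_events:
        active[month].add(user_id)
    # Average those distinct-user counts over the reporting window.
    return sum(len(active[m]) for m in months) / len(months)

# Hypothetical example data: user u1 was active in both months, u2 in one.
events = [("u1", "2023-07"), ("u2", "2023-07"), ("u1", "2023-08")]
print(average_monthly_active_recipients(events, ["2023-07", "2023-08"]))  # 1.5
```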
Takedown Requests
During this period, we have received 0 takedown requests from EU member states pursuant to DSA Article 9.
Information Requests
During this period, we have received the following information requests from EU member states:
The median turnaround time to inform authorities of receipt of Information Requests is 0 minutes — we provide an automated response confirming receipt. The median turnaround time to give effect to Information Requests is ~10 days. This metric reflects the time period from when Snap receives an IR to when Snap believes the request is fully resolved. In some cases, the length of this process depends in part on the speed with which law enforcement responds to any requests for clarification from Snap necessary to process their request.
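For illustration, the resolution metric above is the median of per-request durations from receipt to full resolution. A minimal sketch in Python, assuming hypothetical received and resolved timestamps (the structure and data are illustrative, not Snap's records):

```python
from datetime import datetime
from statistics import median

def median_turnaround_days(requests):
    # requests: (received, resolved) datetime pairs, receipt to full resolution.
    durations = [(resolved - received).total_seconds() / 86400
                 for received, resolved in requests]
    return median(durations)

# Hypothetical requests resolved in 8 and 12 days; the median is 10 days.
requests = [(datetime(2023, 7, 1), datetime(2023, 7, 9)),
            (datetime(2023, 8, 1), datetime(2023, 8, 13))]
print(median_turnaround_days(requests))  # 10.0
```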
All content on Snapchat must adhere to our Community Guidelines and Terms of Service, as well as supporting terms, guidelines, and explainers. Proactive detection mechanisms and reports of illegal or violating content or accounts prompt a review; at that point, our tooling systems process the request, gather relevant metadata, and route the relevant content to our moderation team via a structured user interface designed to facilitate effective and efficient review operations. When our moderation teams determine, either through human review or automated means, that a user has violated our Terms, we may remove the offending content or account, terminate or limit the visibility of the relevant account, and/or notify law enforcement, as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer. Users whose accounts are locked by our safety team for Community Guidelines violations can submit a locked account appeal, and users can appeal certain content enforcements.
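As a rough, illustrative sketch of the review flow just described, the following Python fragment models a notice being routed with its metadata to a review queue and an enforcement outcome being applied; every name, field, and option is a hypothetical stand-in, not Snap's actual tooling:

```python
from dataclasses import dataclass, field

@dataclass
class Notice:
    content_id: str
    source: str                 # e.g. "proactive_detection" or "user_report"
    metadata: dict = field(default_factory=dict)

def route_for_review(notice: Notice, review_queue: list) -> None:
    # Gather relevant metadata and hand the notice to the moderation queue.
    notice.metadata["routed"] = True
    review_queue.append(notice)

def apply_enforcement(decision: str, notice: Notice) -> str:
    # Outcomes mirroring the paragraph above: remove the content, limit or
    # terminate the account, or take no action.
    outcomes = {
        "remove_content": f"removed content {notice.content_id}",
        "limit_account": f"limited visibility of the account behind {notice.content_id}",
        "terminate_account": f"terminated the account behind {notice.content_id}",
    }
    return outcomes.get(decision, f"no action on {notice.content_id}")

queue: list = []
route_for_review(Notice("c123", "user_report"), queue)
print(apply_enforcement("remove_content", queue[0]))
```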
Content and Account Notices (DSA Article 15.1(b))
Snap has put into place mechanisms to allow users and non-users to notify Snap of content and accounts violating our Community Guidelines and Terms of Service on the platform, including those they consider illegal pursuant to DSA Article 16. These reporting mechanisms are available in the app itself (i.e. directly from the piece of content) and on our website.
During the relevant period, we received the following content and account notices in the EU:
In H2’23, we handled 664,896 notices solely via automated means. All of these were enforced against our Community Guidelines because our Community Guidelines encapsulate illegal content.
In addition to user-generated content and accounts, we moderate advertisements if they violate our platform policies. Below are the total numbers of ads that were reported and removed in the EU.
Trusted Flagger Notices (Article 15.1(b))
For the period of our latest Transparency Report (H2 2023), there were no formally appointed Trusted Flaggers under the DSA. As a result, the number of notices submitted by such Trusted Flaggers was zero (0) in this period.
Proactive Content Moderation (Article 15.1(c))
During the relevant period, Snap enforced the following content and accounts in the EU after engaging in content moderation at its own initiative:
All of Snap’s own-initiative moderation efforts leveraged humans or automation. On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. With regard to automated tools, these include:
Proactive detection of illegal and violating content using machine learning;
Hash-matching tools (such as PhotoDNA and Google's CSAI Match);
Abusive language detection to reject content based on an identified and regularly updated list of abusive keywords, including emojis (a minimal illustrative sketch of this keyword-based approach follows this list).
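As referenced in the last item, here is a minimal sketch of keyword-based detection in Python; the blocklist, normalization, and rejection policy are illustrative assumptions, since Snap's actual detector is not public:

```python
import re

# Hypothetical, regularly updated blocklist of abusive keywords and emojis.
BLOCKLIST = {"badword1", "badword2", "🖕"}

def is_abusive(text: str) -> bool:
    # Lowercase, then split into word tokens and standalone symbols so that
    # emojis are matched as their own tokens.
    tokens = re.findall(r"\w+|[^\w\s]", text.lower())
    return any(token in BLOCKLIST for token in tokens)

print(is_abusive("this caption contains badword1"))  # True
print(is_abusive("a perfectly fine caption"))        # False
```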
Appeals (Article 15.1(d))
During the relevant period, Snap processed the following content and account appeals in the EU via its internal complaint-handling systems: