April 25, 2024
Welcome to our European Union Transparency Page, where we publish detailed information required by the EU Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), and the Dutch Media Act (DMA). Please note that the most up-to-date version of these Transparency Reports is the en-US version.
To comply with the DSA, Snap Group Limited has appointed Snap B.V. as its legal representative. You can contact our DSA representative at dsa-enquiries [at] snapchat.com, our AVMSD and DMA representative at vsp-enquiries [at] snapchat.com, through our Support Site here, or at:
Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands
If you are a law enforcement agency, please follow the steps outlined here. Please communicate in Dutch or English when contacting us.
For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For the AVMSD and the Dutch Media Act, we are regulated by the Dutch Media Authority (CvdM).
Snap is required by Articles 15, 24, and 42 of the DSA to publish reports containing prescribed information regarding Snap's content moderation for the Snapchat services that are considered "online platforms," i.e., Spotlight, For You, Public Profiles, Maps, Lenses, and Advertising. This report must be published every six months, starting from October 25, 2023.
Snap publishes Transparency Reports twice a year to provide insight into Snap's safety efforts and the nature and volume of content reported on our platform. Our latest report, covering H2 2023 (July 1 - December 31), can be found here. For metrics specific to the Digital Services Act, please see this page.
As of December 31, 2023, we have 90.9 million average monthly active recipients ("AMAR") of our Snapchat app in the EU. This means that, on average over the past six months, 90.9 million registered users in the EU opened the Snapchat app at least once during a given month.
This figure breaks down by Member State as follows:
These figures were calculated to meet current DSA rules and should only be relied on for DSA purposes. We have changed how we calculate this figure over time, including in response to changing internal policy, regulator guidance and technology, and figures are not intended to be compared between periods. This may also differ from the calculations used for other active user figures we publish for other purposes.
Takedown Requests
During this period, we have received 0 takedown requests from EU member states pursuant to DSA Article 9.
Information Requests
During this period, we have received the following information requests from EU member states:
The median turnaround time to inform authorities of receipt of Information Requests is 0 minutes — we provide an automated response confirming receipt. The median turnaround time to give effect to Information Requests is ~10 days. This metric reflects the time period from when Snap receives an IR to when Snap believes the request is fully resolved. In some cases, the length of this process depends in part on the speed with which law enforcement responds to any requests for clarification from Snap necessary to process their request.
All content on Snapchat must adhere to our Community Guidelines and Terms of Service, as well as supporting terms, guidelines and explainers. Proactive detection mechanisms and reports of illegal or violating content or accounts prompt a review, at which point, our tooling systems process the request, gather relevant metadata, and route the relevant content to our moderation team via a structured user interface that is designed to facilitate effective and efficient review operations. When our moderation teams determine, either through human review or automated means, that a user has violated our Terms, we may remove the offending content or account, terminate or limit the visibility of the relevant account, and/or notify law enforcement as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer. Users whose accounts are locked by our safety team for Community Guidelines violations can submit a locked account appeal, and users can appeal certain content enforcements.
Content and Account Notices (DSA Article 15.1(b))
Snap has put into place mechanisms to allow users and non-users to notify Snap of content and accounts violating our Community Guidelines and Terms of Service on the platform, including those they consider illegal pursuant to DSA Article 16. These reporting mechanisms are available in the app itself (i.e. directly from the piece of content) and on our website.
During the relevant period, we received the following content and account notices in the EU:
In H2 2023, we handled 664,896 notices solely via automated means. All of these were enforced against our Community Guidelines, as our Community Guidelines prohibit illegal content.
In addition to user-generated content and accounts, we moderate advertisements if they violate our platform policies. Below are the total ads that were reported and removed in the EU.
Trusted Flagger Notices (Article 15.1(b))
For the period of our latest Transparency Report (H2 2023), there were no formally appointed Trusted Flaggers under the DSA. As a result, the number of notices submitted by such Trusted Flaggers was zero (0) in this period.
Proactive Content Moderation (Article 15.1(c))
During the relevant period, Snap enforced the following content and accounts in the EU after engaging in content moderation at its own initiative:
All of Snap’s own-initiative moderation efforts leveraged humans or automation. On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. Our automated tools include:
Proactive detection of illegal and violating content using machine learning;
Hash-matching tools (such as PhotoDNA and Google's CSAI Match);
Abusive Language Detection to reject content based on an identified and regularly updated list of abusive key words, including emojis.
Appeals (Article 15.1(d))
During the relevant period, Snap processed the following content and account appeals in the EU via its internal complaint-handling systems: