European Union
July 1, 2023 – December 31, 2023

Published:

April 25, 2024

Updated:

April 25, 2024

Welcome to our European Union (EU) transparency page, where we publish EU-specific information required by the Digital Services Act (DSA), the Audiovisual Media Services Directive (AVMSD), and the Dutch Media Act (DMA). Please note that the most up-to-date version of these Transparency Reports can be found in the en-US locale.

Legal Representative 

Snap Group Limited has appointed Snap B.V. as its Legal Representative for purposes of the DSA. You can contact the representative at dsa-enquiries [at] snapchat.com for the DSA, at vsp-enquiries [at] snapchat.com for AVMSD and DMA, through our Support Site [here], or at:

Snap B.V.
Keizersgracht 165, 1016 DP
Amsterdam, The Netherlands

If you are a law enforcement agency, please follow the steps outlined here.

Please communicate in Dutch or English when contacting us.

Regulatory Authorities

For the DSA, we are regulated by the European Commission and the Netherlands Authority for Consumers and Markets (ACM). For the AVMSD and the DMA, we are regulated by the Dutch Media Authority (CvdM).

DSA Transparency Report

Snap is required by Articles 15, 24 and 42 of the DSA to publish reports containing prescribed information regarding Snap’s content moderation for Snapchat’s services that are considered “online platforms,” i.e., Spotlight, For You, Public Profiles, Maps, Lenses and Advertising. This report must be published every 6 months, from 25 October 2023.

Snap publishes transparency reports twice a year to provide insight into Snap’s safety efforts and the nature and volume of content reported on our platform. Our latest report, for H2 2023 (July 1 – December 31), can be found here. Metrics specific to the Digital Services Act can be found on this page.

Average Monthly Active Recipients 
(DSA Articles 24.2 and 42.3)

As of 31 December 2023, we have 90.9 million average monthly active recipients (“AMAR”) of our Snapchat app in the EU. This means that, on average over the last 6 months, 90.9 million registered users in the EU have opened the Snapchat app at least once during a given month.
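The AMAR definition above (the average, over the 6-month window, of the number of registered users who opened the app at least once in a given month) can be illustrated with a small sketch. All data and names here are hypothetical, illustrative stand-ins; this is not Snap's actual methodology or scale.

```python
# Hypothetical per-month sets of registered EU user IDs that opened the app
# at least once in that month, over a 6-month reporting window.
monthly_active_users = {
    "2023-07": {1, 2, 3, 4},
    "2023-08": {1, 2, 5},
    "2023-09": {2, 3, 4, 5, 6},
    "2023-10": {1, 6},
    "2023-11": {2, 3, 7},
    "2023-12": {1, 2, 3, 4, 5},
}

def average_monthly_active_recipients(months):
    """Average, across the window, of the count of unique active users per month."""
    return sum(len(users) for users in months.values()) / len(months)

amar = average_monthly_active_recipients(monthly_active_users)
```

Note that the average is taken over monthly unique counts, so a user active in several months contributes to each of those months.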

This figure breaks down by Member State as follows:

These figures were calculated to meet current DSA rules and should only be relied on for DSA purposes. We have changed how we calculate this figure over time, including in response to changing internal policy, regulator guidance and technology, and figures are not intended to be compared between periods. This may also differ from the calculations used for other active user figures we publish for other purposes.


Member States Authority Requests
(DSA Article 15.1(a))

Takedown Requests 

During this period, we have received 0 takedown requests from EU member states pursuant to DSA Article 9. 

Information Requests 

During this period, we have received the following information requests from EU member states pursuant to DSA Article 10:

The median turnaround time to inform authorities of receipt of Information Requests is 0 minutes — we provide an automated response confirming receipt. The median turnaround time to give effect to Information Requests is ~10 days. This metric reflects the time period from when Snap receives an IR to when Snap believes the request is fully resolved. In some cases, the length of this process depends in part on the speed with which law enforcement responds to any requests for clarification from Snap necessary to process their request.
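A turnaround metric like the one described, measured from receipt of a request to its resolution, could be computed along these lines. The timestamps below are invented for illustration and do not reflect any real requests.

```python
from datetime import datetime
from statistics import median

# Hypothetical (received, resolved) timestamp pairs for information requests.
requests = [
    (datetime(2023, 7, 3, 9, 0), datetime(2023, 7, 12, 17, 0)),
    (datetime(2023, 8, 14, 11, 30), datetime(2023, 8, 24, 9, 15)),
    (datetime(2023, 10, 2, 8, 0), datetime(2023, 10, 13, 16, 45)),
]

# Turnaround in days for each request, then the median across all requests.
turnaround_days = [
    (resolved - received).total_seconds() / 86400
    for received, resolved in requests
]
median_turnaround = median(turnaround_days)
```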

Content Moderation 


All content on Snapchat must adhere to our Community Guidelines and Terms of Service, as well as supporting terms, guidelines, and explainers. Proactive detection mechanisms and reports of illegal or violating content or accounts prompt a review; at that point, our tooling processes the request, gathers relevant metadata, and routes the content to our moderation team via a structured user interface designed to facilitate effective and efficient review operations. When our moderation teams determine, either through human review or automated means, that a user has violated our Terms, we may remove the offending content or account, terminate or limit the visibility of the relevant account, and/or notify law enforcement as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer. Users whose accounts are locked by our safety team for Community Guidelines violations can submit a locked account appeal, and users can appeal certain content enforcements.

Content and Account Notices (DSA Article 15.1(b))

Snap has put into place mechanisms to allow users and non-users to notify Snap of content and accounts violating our Community Guidelines and Terms of Service on the platform, including those they consider illegal pursuant to DSA Article 16.  These reporting mechanisms are available in the app itself (i.e. directly from the piece of content) and on our website.

During the relevant period, we received the following content and account notices in the EU:

In H2 2023, we handled 664,896 notices solely via automated means. All of these were enforced against our Community Guidelines, because our Community Guidelines encompass illegal content.

In addition to user-generated content and accounts, we moderate advertisements if they violate our platform policies. Below are the total ads that were reported and removed in the EU. 

Trusted Flaggers Notices (Article 15.1(b))

For the period of our latest Transparency Report (H2 2023), there were no formally appointed Trusted Flaggers under the DSA. As a result, the number of notices submitted by such Trusted Flaggers was zero (0) in this period.

Proactive Content Moderation (Article 15.1(c))

During the relevant period, Snap enforced the following content and accounts in the EU after engaging content moderation at its own initiative:

All of Snap’s own-initiative moderation efforts leveraged humans or automation. On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. With regard to automated tools, these include:

  • Proactive detection of illegal and violating content using machine learning;

  • Hash-matching tools (such as PhotoDNA and Google's CSAI Match);

  • Abusive Language Detection to reject content based on an identified and regularly updated list of abusive keywords, including emojis.
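As one illustration of the last mechanism, a keyword-based filter can be sketched as a substring check against a maintained blocklist. The term list and function name below are hypothetical; this is not Snap's actual term list or matching logic, which is likely far more sophisticated.

```python
# Hypothetical, regularly updated blocklist of abusive keywords and emojis;
# illustrative placeholders only.
ABUSIVE_TERMS = {"abusiveterm", "anotherslur", "🔪"}

def rejected_by_language_filter(text: str) -> bool:
    """Reject content whose text contains any blocklisted term."""
    lowered = text.lower()
    return any(term in lowered for term in ABUSIVE_TERMS)
```

In practice such filters also need to handle obfuscation (spacing, leetspeak, homoglyphs), which a plain substring check does not capture.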


Appeals (Article 15.1(d))

During the relevant period, Snap processed the following content and account appeals in the EU via its internal complaint-handling systems:


* Stopping child sexual exploitation is a top priority. Snap devotes significant resources toward this and has zero tolerance for such conduct. Special training is required to review CSE appeals, and there is a limited team of agents who handle these reviews due to the graphic nature of the content. During the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements, and we have addressed these inconsistencies through agent re-training and rigorous quality assurance. We expect that the next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcements.

Automated Means of Content Moderation (Article 15.1(e))

On our public content surfaces, content generally goes through both auto-moderation and human review before it is eligible for distribution to a wide audience. With regard to automated tools, these include:

  • Proactive detection of illegal and violating content using machine learning;

  • Hash-matching tools (such as PhotoDNA and Google's CSAI Match);

  • Abusive Language Detection to reject content based on an identified and regularly updated list of abusive keywords, including emojis.


The accuracy of the automated moderation technologies across all harms was approximately 96.61%, and the error rate was approximately 3.39%.
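Figures like these are typically derived from quality-assurance sampling, in which a random sample of automated decisions is re-reviewed by humans. The sample counts below are hypothetical and chosen only to reproduce the reported percentages; this is not Snap's methodology.

```python
# Hypothetical QA sample: automated decisions re-reviewed by human moderators.
sampled_decisions = 10_000
confirmed_correct = 9_661  # human reviewer agreed with the automated decision

accuracy = confirmed_correct / sampled_decisions  # fraction of correct decisions
error_rate = 1 - accuracy                         # complementary fraction
```

By construction, the accuracy and error rate sum to 100% (96.61% + 3.39%).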


Content Moderation Safeguards (Article 15.1(e))

We recognize there are risks associated with content moderation, including risks to freedoms of expression and assembly, which may be caused by automated and human-moderator bias and by abusive reports, including by governments, political constituencies, or well-organized individuals. Snapchat is generally not a place for political or activist content, particularly in our public spaces.


Nevertheless, to safeguard against these risks, Snap has implemented testing and training and has robust, consistent procedures for handling reports of illegal or violating content, including from governmental authorities and law enforcement. We continually evaluate and evolve our content moderation algorithms. While potential harms to freedom of expression are difficult to detect, we are not aware of any significant issues, and we provide avenues for our users to report mistakes if they occur.


Our policies and systems promote consistent and fair enforcement and, as described above, provide Snapchatters the ability to meaningfully dispute enforcement outcomes through notice and appeals processes that aim to safeguard the interests of our community while protecting individual Snapchatters' rights.

We continually strive to improve our enforcement policies and processes, and have made great strides in combating potentially harmful and illegal content and activities on Snapchat. This is reflected in an upward trend in the reporting and enforcement figures shown in our latest transparency report, and in decreasing prevalence rates for violations on Snapchat overall.


Out-of-Court Dispute Settlements (Article 24.1(a))

During the period of our latest Transparency Report (H2 2023), there were no formally appointed out-of-court dispute settlement bodies under the DSA. As a result, the number of disputes submitted to such bodies was zero (0) in this period, and we are unable to provide outcomes, median turnaround times for settlements, or the share of disputes in which we implemented the body's decisions.



Account Suspensions (Article 24.1(b))

During H2 2023, we did not impose any account suspensions pursuant to Article 23. Snap's Trust & Safety team has procedures in place to limit the possibility of user accounts frequently submitting notices or complaints that are manifestly unfounded. These procedures include restricting the creation of duplicate reports and using email filters to prevent users who have frequently submitted manifestly unfounded reports from continuing to do so. Snap takes appropriate enforcement action against accounts as explained in our Snapchat Moderation, Enforcement, and Appeals Explainer, and information regarding the level of Snap's account enforcement can be found in our Transparency Report (H2 2023). Such measures will continue to be reviewed and iterated upon.


Moderation Resources, Expertise, and Support (Article 42.2)

Our content moderation team operates around the globe, enabling us to help keep Snapchatters safe 24/7. Below, you will find the breakdown of our human moderation resources by the moderators' language specialties (note that some moderators specialize in multiple languages) as of December 31, 2023:

The above table includes all moderators who support EU member state languages as of December 31, 2023. In situations where we need additional language support, we use translation services.

Moderators are recruited using a standard job description that includes a language requirement (depending on the need). The language requirement states that the candidate should be able to demonstrate written and spoken fluency in the language and have at least one year of work experience for entry-level positions. Candidates must meet the educational and background requirements in order to be considered, and must also demonstrate an understanding of current events for the country or region of content moderation they will support.

Our moderation team applies our policies and enforcement measures to help protect our Snapchat community. Training is conducted over a multi-week period, in which new team members are educated on Snap’s policies, tools, and escalations procedures. After the training, each moderator must pass a certification exam before being permitted to review content. Our moderation team regularly participates in refresher training relevant to their workflows, particularly when we encounter policy-borderline and context-dependent cases. We also run upskilling programs, certification sessions, and quizzes to ensure all moderators are current and in compliance with all updated policies. Finally, when urgent content trends surface based on current events, we quickly disseminate policy clarifications so teams are able to respond according to Snap’s policies.

We provide our content moderation team – Snap’s “digital first responders” – with significant support and resources, including on-the-job wellness support and easy access to mental health services. 

Child Sexual Exploitation and Abuse (CSEA) Media Scanning Report


Background

The sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse (CSEA) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.


We use PhotoDNA robust hash-matching and Google's Child Sexual Abuse Imagery (CSAI) Match to identify known illegal images and videos of child sexual abuse, respectively, and report them to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with domestic or international law enforcement, as required.
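The hash-matching workflow can be sketched as: hash the incoming media, compare against a database of hashes of known illegal material, and flag matches for reporting. PhotoDNA and CSAI Match use proprietary perceptual hashes that tolerate minor edits; the exact SHA-256 below is a simplified stand-in for illustration only, and all names and data are hypothetical.

```python
import hashlib

# Hypothetical stand-in for a database of hashes of known illegal media.
# Real systems use perceptual hashes (e.g. PhotoDNA), not exact SHA-256.
KNOWN_HASHES = {
    hashlib.sha256(b"known-illegal-image-bytes").hexdigest(),
}

def matches_known_hash(media_bytes: bytes) -> bool:
    """Flag media whose hash matches a known entry, for review and reporting."""
    return hashlib.sha256(media_bytes).hexdigest() in KNOWN_HASHES
```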


Report

The data below is based on the results of proactive scanning, using PhotoDNA and/or CSAI Match, of media uploaded from a user's camera roll to Snapchat.

Stopping child sexual exploitation is a top priority. Snap devotes significant resources toward this and has zero tolerance for such conduct.  Special training is required to review CSE appeals, and there is a limited team of agents who handle these reviews due to the graphic nature of the content.  During the fall of 2023, Snap implemented policy changes that affected the consistency of certain CSE enforcements, and we have addressed these inconsistencies through agent re-training and rigorous quality assurance.  We expect that the next transparency report will reveal progress toward improving response times for CSE appeals and improving the precision of initial enforcements.  

Content Moderation Safeguards

The safeguards applied for CSEA Media Scanning are set out in the above “Content Moderation Safeguards” section under our DSA Report.


European Union Terrorist Content Online Transparency Report

Published: June 17, 2024

Last Updated: June 17, 2024

This Transparency Report is published in accordance with Articles 7(2) and 7(3) of Regulation 2021/784 of the European Parliament and of the Council of the EU, addressing the dissemination of terrorist content online (the Regulation). It covers the reporting period of January 1 – December 31, 2023.


General Information
  • Article 7(3)(a): information about the hosting service provider’s measures in relation to the identification and removal of or disabling of access to terrorist content

  • Article 7(3)(b): information about the hosting service provider’s measures to address the reappearance online of material which has previously been removed or to which access has been disabled because it was considered to be terrorist content, in particular where automated tools have been used


Terrorists, terrorist organizations, and violent extremists are prohibited from using Snapchat. Content that advocates, promotes, glorifies, or advances terrorism or other violent, criminal acts is prohibited under our Community Guidelines. Users are able to report content that violates our Community Guidelines via our in-app reporting menu and our Support Site. We also use proactive detection to attempt to identify violative content on public surfaces like Spotlight and Discover.


Regardless of how we may become aware of violating content, our Trust & Safety teams, through a combination of automation and human moderation, promptly review identified content and make enforcement decisions. Enforcements may include removing the content, warning or locking the violating account, and, if warranted, reporting the account to law enforcement. To prevent the reappearance of terrorist or other violent extremist content on Snapchat, in addition to working with law enforcement, we take steps to block the device associated with the violating account and prevent the user from creating another Snapchat account.


Additional details regarding our measures for identifying and removing terrorist content can be found in our Explainer on Hateful Content, Terrorism, and Violent Extremism and our Explainer on Moderation, Enforcement, and Appeals.



Reports & Enforcements 
  • Article 7(3)(c): the number of items of terrorist content removed or to which access has been disabled following removal orders or specific measures, and the number of removal orders where the content has not been removed or access to which has not been disabled pursuant to the first subparagraph of Article 3(7) and the first subparagraph of Article 3(8), together with the grounds therefor


During the reporting period, Snap did not receive any removal orders, nor were we required to implement any specific measures pursuant to Article 5 of the Regulation. Accordingly, we were not required to take enforcement action under the Regulation.


The following table describes enforcement actions taken based on user reports and proactive detection against content and accounts, both in the EU and elsewhere around the world, that violated our Community Guidelines relating to terrorism and violent extremism content.

Enforcement Appeals
  • Article 7(3)(d): the number and the outcome of complaints handled by the hosting service provider in accordance with Article 10

  • Article 7(3)(g): the number of cases in which the hosting service provider reinstated content or access thereto following a complaint by the content provider


Because we had no enforcement actions required under the Regulation during the reporting period as noted above, we handled no complaints pursuant to Article 10 of the Regulation and had no associated reinstatements.


The following table contains information relating to appeals and reinstatements, both in the EU and elsewhere around the world, involving terrorist and violent extremist content enforced under our Community Guidelines.

Judicial Proceedings & Appeals
  • Article 7(3)(e): the number and the outcome of administrative or judicial review proceedings brought by the hosting service provider

  • Article 7(3)(f): the number of cases in which the hosting service provider was required to reinstate content or access thereto as a result of administrative or judicial review proceedings


As we had no enforcement actions required under the Regulation during the reporting period, as noted above, we had no associated administrative or judicial review proceedings, and we were not required to reinstate content as a result of any such proceedings.