Snap Values
Transparency Report
January 1, 2024 – June 30, 2024

Published:

December 5, 2024

Updated:

December 5, 2024

We publish this transparency report twice a year to provide insight into Snap's safety efforts. We are committed to these efforts, and we continually strive to make these reports more comprehensive and informative for the many stakeholders who care deeply about our content moderation and law enforcement practices, as well as the safety and well-being of the Snapchat community.

This transparency report covers the first half of 2024 (January 1 – June 30). As with our previous reports, our Trust & Safety teams share data about the global number of in-app content and account-level reports we received across specific categories of Community Guidelines violations, how we responded to requests from law enforcement and governments, and how we responded to notices of copyright and trademark infringement. We also provide country-level insights in the files linked at the bottom of this page.

As part of our ongoing commitment to improving our transparency reports, we are also introducing new data highlighting our proactive efforts to detect and enforce against a broader range of Community Guidelines violations. This report includes both global and country-level data, as will our reports going forward. We have also corrected a labeling error in our previous reports: where we previously referred to "Total Content Enforced," we now refer to "Total Enforcements" to reflect that the data in that column includes both content- and account-level enforcement actions.

For more information about our policies for combating potential online harms, and our plans to continue evolving our reporting practices, please read our recent Safety & Impact blog about this transparency report. To find additional safety and privacy resources on Snapchat, see our About Transparency Reporting tab at the bottom of the page.

Please note that the most up-to-date versions of these transparency reports are available in the en-US locale.

Overview of Our Trust & Safety Teams' Actions to Enforce Our Community Guidelines

Our Trust & Safety teams enforce our Community Guidelines both proactively (through the use of automated tools) and reactively (in response to reports), as detailed in the following sections of this report. In this reporting cycle (H1 2024), our Trust & Safety teams took the following enforcement actions:

Below is a breakdown by type of Community Guidelines violation, including the median turnaround time between the time we detected the violation (either proactively or upon receipt of a report) and the time we took final action on the relevant content or account:

During the reporting period, we saw a Violative View Rate (VVR) of 0.01 percent, which means that out of every 10,000 Snap and Story views on Snapchat, 1 contained content found to violate our Community Guidelines.
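Written out, this rate is simply the share of content views that involved violating content, using the figures reported above:

\text{VVR} = \frac{\text{views of violating Snaps and Stories}}{\text{total Snap and Story views}} = \frac{1}{10{,}000} = 0.01\%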

Community Guidelines Violations Reported to Our Trust & Safety Teams

From January 1 to June 30, 2024, in response to in-app reports of violations of our Community Guidelines, Snap’s Trust & Safety teams took a total of 6,223,618 enforcement actions globally, including enforcements against 3,842,507 unique accounts. The median turnaround time for our Trust & Safety teams to take enforcement action in response to those reports was ~24 minutes. A breakdown per reporting category is provided below.

Analysis

Our overall reporting volumes remained fairly stable in H1 2024, as compared to the previous six months. This cycle, we saw total enforcements and total unique accounts enforced increase by approximately 16%.

Over the last 12 months, Snap introduced new reporting mechanisms for users, which account for changes to our reported and enforced volumes and for increases in turnaround times in this reporting period (H1 2024). Specifically:

  • Group Chat Reporting: We introduced Group Chat Reporting on October 13, 2023, which enables users to report abuse occurring in a multi-person chat. This change impacted the makeup of our metrics across reporting categories (because some potential harms are more likely to occur in a chat context) and increased report actionability. 

  • Account Reporting Enhancements: We also evolved our Account Reporting feature to give reporting users an option to submit chat evidence when reporting an account suspected of being operated by a bad actor. This change, which provides us with greater evidence and context to assess account reports, launched on February 29, 2024. 


Chat Reports, and especially Group Chat Reports, are among the most complex and time-consuming to review, which drove up turnaround times across the board. 

Reporting for suspected Child Sexual Exploitation & Abuse (CSEA), Harassment & Bullying, and Hate Speech were particularly impacted by the two changes described above, and by shifts in the broader ecosystem. Specifically:

  • CSEA: We observed an increase in CSEA-related reports and enforcements in H1 2024. Specifically, we saw a 64% increase in total in-app reports by users, an 82% increase in total enforcements, and a 108% increase in total unique accounts enforced. These increases are largely driven by the introduction of Group Chat and Account Reporting functionalities. Given the sensitive nature of this moderation queue, a select team of highly trained agents is assigned to review reports of potential CSEA-related violations. The influx of additional reports combined with our teams adapting to new trainings has resulted in an increase in turnaround times. Moving forward, we have increased the size of our global vendor teams significantly to reduce turnaround times and accurately enforce on reports of potential CSEA. We expect that our H2 2024 Transparency Report will reflect the fruits of this effort with a materially improved turnaround time. 

  • Harassment & Bullying: Based on reports, we have observed that Harassment & Bullying disproportionately occurs in chats, and particularly group chats. The improvements we introduced to Group Chat Reporting and Account Reporting help us take more comprehensive action when assessing reports in this reporting category. Additionally, as of this period, we require users to input a comment when submitting a harassment and bullying report. We review this comment to contextualize each report. Together, these changes led to material increases in total enforcements (+91%), total unique accounts enforced (+82%), and turnaround time (+245 mins) for corresponding reports. 

  • Hate Speech: In H1 2024, we observed increases in reported content, total enforcements, and turnaround time for Hate Speech. Specifically, we saw a 61% increase in in-app reports, a 127% increase in total enforcements, and a 125% increase in total unique accounts enforced. This was due, in part, to improvements in our chat reporting mechanisms (as previously discussed), and was further exacerbated by the geopolitical environment, particularly the continuation of the Israel-Hamas conflict. 

This reporting period, we saw a ~65% decrease in total enforcements and a ~60% decrease in total unique accounts enforced in response to reports of suspected Spam & Abuse, reflecting improvements in our proactive detection and enforcement tools. We saw similar declines in total enforcements in response to reports of content relating to Self Harm and Suicide (~80% decreases), reflecting our updated victim-centric approach, according to which our Trust & Safety teams will, in appropriate cases, send relevant users self-help resources, rather than take an enforcement action against those users. This approach was informed by members of our Safety Advisory Board, including a pediatric professor and medical doctor who specializes in interactive media and internet disorders.

Our Efforts to Proactively Detect and Enforce Against Violations of Our Community Guidelines

Proactive Detection & Enforcement of our Community Guidelines


We deploy automated tools to proactively detect and, in some cases, enforce against violations of our Community Guidelines. These tools include hash-matching tools (including PhotoDNA and Google Child Sexual Abuse Imagery (CSAI) Match), abusive language detection tools (which detect and enforce based on an identified and regularly updated list of abusive keywords and emojis), and multi-modal artificial intelligence / machine learning technology. 
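As a rough illustration of how such automated tools operate, here is a minimal sketch in Python. Everything in it is hypothetical: the hash database, keyword list, and helper names are invented for illustration, and a simple cryptographic hash stands in for the robust perceptual hashing that tools like PhotoDNA and CSAI Match actually use. It is not Snap's implementation.

    import hashlib

    # Hypothetical database of hashes of known violating media. Real hash-matching
    # tools use perceptual hashes that survive re-encoding and minor edits; a
    # cryptographic hash is used here only to keep the sketch self-contained.
    KNOWN_VIOLATING_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    # Hypothetical, regularly updated list of abusive keywords and emojis.
    ABUSIVE_KEYWORDS = {"example-abusive-term", "\U0001F92C"}

    def media_matches_known_hash(media_bytes: bytes) -> bool:
        """Return True if the media's hash appears in the known-violation set."""
        return hashlib.sha256(media_bytes).hexdigest() in KNOWN_VIOLATING_HASHES

    def text_contains_abusive_language(text: str) -> bool:
        """Return True if the text contains any listed abusive keyword or emoji."""
        lowered = text.lower()
        return any(keyword in lowered for keyword in ABUSIVE_KEYWORDS)

    def flag_for_review(media_bytes: bytes, caption: str) -> bool:
        """Combine both signals; a real pipeline would route a positive result
        to human review or, in some cases, automated enforcement."""
        return media_matches_known_hash(media_bytes) or text_contains_abusive_language(caption)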

In H1 2024, we took the following enforcement actions after proactively detecting, through the use of automated tools, violations of our Community Guidelines:

Combating Child Sexual Exploitation & Abuse 

Sexual exploitation of any member of our community, especially minors, is illegal, abhorrent, and prohibited by our Community Guidelines. Preventing, detecting, and eradicating Child Sexual Exploitation and Abuse (CSEA) on our platform is a top priority for Snap, and we continually evolve our capabilities to combat these and other crimes.

We use active technology detection tools, such as PhotoDNA robust hash-matching and Google’s Child Sexual Abuse Imagery (CSAI) Match, to identify known illegal images and videos of CSEA, respectively. In addition, in some cases, we use behavioral signals to enforce against other potentially illegal CSEA activity. We report CSEA-related content to the U.S. National Center for Missing and Exploited Children (NCMEC), as required by law. NCMEC, in turn, coordinates with domestic or international law enforcement, as required.

In the first half of 2024, we took the following actions upon detecting CSEA on Snapchat (either proactively or upon receiving a report):

*Note that each submission to NCMEC can contain multiple pieces of content. The total individual pieces of media submitted to NCMEC is equal to our total content enforced.

Our Efforts to Provide Resources and Support to Snapchatters in Need

We care deeply about the mental health and well-being of Snapchatters, which continues to inform our decisions to build Snapchat differently. As a platform designed for communications between and among real friends, we believe Snapchat can play a unique role in empowering friends to help each other in difficult times. This is why we have developed resources and support for Snapchatters in need. 

Our Here For You search tool shows resources from expert local partners when users search for certain topics related to mental health, anxiety, depression, stress, suicidal thoughts, grief and bullying. We have also developed a page dedicated to financial sextortion and other sexual risks and harms, in an effort to support those in distress. Our global list of safety resources is publicly available to all Snapchatters, in our Privacy, Safety & Policy Hub. 

When our Trust & Safety teams become aware of a Snapchatter in distress, they can forward self-harm prevention and support resources, and notify emergency response personnel when appropriate. The resources that we share are available on our global list of safety resources, and are publicly available to all Snapchatters.

Appeals

Below we provide information about appeals we received from users requesting a review of our decision to lock their account:

* As discussed in the “Analysis” section above, stopping the spread of content or activity related to child sexual exploitation is a top priority. Snap devotes significant resources toward this goal and has zero tolerance for such conduct. We have expanded our global vendor teams to adapt to new policies and reporting features for Snapchat. In doing so, between H2 2023 and H1 2024, we reduced the turnaround time for CSEA appeals from 152 days to 15 days. We continually strive to improve our processes, including in relation to appeal turnaround times.

Regional & Country Overview

This section provides an overview of the actions our Trust & Safety teams took, in a sampling of geographic regions, to respond to in-app reports of violations and to proactively enforce our Community Guidelines. Our Community Guidelines apply to all content on Snapchat, and to all users, across the globe.

Information for individual countries, including all EU member states, is available for download via the attached CSV file.

Overview of Our Trust & Safety Teams' Actions to Enforce Our Community Guidelines

Community Guidelines Violations Reported to our Trust & Safety Teams

Proactive Detection and Enforcement of our Community Guidelines

Ads Moderation

Snap is committed to ensuring that all ads are fully compliant with our advertising policies. We believe in a responsible and respectful approach to advertising, creating a safe and enjoyable experience for all of our users. All ads are subject to our review and approval. In addition, we reserve the right to remove ads, including in response to user feedback, which we take seriously. 


Below we have included insight into our moderation for paid advertisements that are reported to us following their publication on Snapchat. Note that ads on Snapchat can be removed for a variety of reasons as outlined in Snap’s Advertising Policies, including deceptive content, adult content, violent or disturbing content, hate speech, and intellectual property infringement. Additionally, you can now find Snapchat’s Ads Gallery in the navigation bar of this transparency report.