Back to school and the importance of reporting safety issues

September 3, 2024

It’s back-to-school season in many parts of the world, and what better time to remind teens, parents, and educators about the importance of reporting safety concerns to online platforms and services?

Unfortunately, reporting has gotten a bit of a “bad rap” over the years, as young people have come to normalize exposure to problematic content and conduct online, or to equate reporting with tattling. Those sentiments are borne out in the data. Results from our latest Digital Well-Being research show that while more teens and young adults talked to someone or took action after experiencing an online risk this year, only about one in five reported the incident to the online platform or service. Reporting problematic content and accounts is critically important: it helps tech companies remove bad actors from their services and thwart further activity before it can harm others.

Survey results show that nearly 60% of Generation Z teens and young adults in six countries who encountered an online risk on any platform or service – not solely Snapchat – talked to someone or sought help after the incident. That’s a welcome nine-percentage-point jump from 2023. Yet, only 22% reported the issue to the online platform or service, and only 21% reported it to a hotline or helpline, like the U.S. National Center for Missing and Exploited Children (NCMEC) or the UK’s Internet Watch Foundation (IWF). Seventeen percent reported to law enforcement. Unfortunately, another 17% didn’t tell anyone about what happened.

Why are young people reluctant to talk with someone or file a report? The data shows that a whopping 62% overall – 65% of teens and 60% of young adults – said they didn’t think the incident was a problem, and instead chalked it up to something that “happens to people online all the time.” A quarter (26%) said they didn’t think the perpetrator would face any consequences. Feelings of shame, guilt, or embarrassment (17%); fear of being negatively judged (15%); and not wanting a friend or family member to “get in trouble” (12%) were the other top-ranking reasons for failing to report. This calls into question some young people’s assessments of online content moderation: a quarter of respondents said they didn’t think anything would happen to the perpetrator, yet more than one in 10 said they didn’t want a friend or family member to be sanctioned for violating behavior. Smaller percentages blamed themselves for the incident (10%) or feared retaliation from the perpetrator (7%).

Reporting on Snapchat

In 2024 and beyond, we’re out to bust myths and turn the tide on reporting on Snapchat, and we’re invoking the help of our new Council for Digital Well-Being (CDWB), a cohort of 18 teens from across the U.S., selected to help promote online safety and digital well-being in their schools and communities. 

“There is a blurry line between privacy and user safety,” notes Jeremy, a 16-year-old CDWB member from California. “The report button is what makes that blurry line clearer. It helps make Snapchat a safer place while maintaining privacy for all. This is why everyone must use the report button when necessary – to help make Snapchat a safer place.”

Josh, another California teen on the CDWB, agreed, highlighting three primary benefits of reporting on any platform or service: to help prevent the spread of illegal and potentially harmful content; to remove fake or impersonating accounts; and to help stop the proliferation of misinformation. Both teens are making the importance of reporting a key focus of their CDWB experience over the next year.

When considering Snapchat, however, many of the concerns highlighted in the research don’t really apply. For example, on our service, reporting is confidential: we don’t tell a reported user who filed a report about their content or behavior. We also acknowledge reports when we receive them and, for reporters who provide a confirmed email address, we let them know whether their submission identified a policy violation. This is all part of an ongoing effort to help educate our community about the conduct and content that are permitted and prohibited on our app. In addition, last month we released a new feature called “My Reports,” which lets all Snapchatters track the status of the Trust and Safety-related in-app abuse reports they’ve submitted in the last 30 days. In “Settings,” under “My Account,” simply scroll down to “My Reports” and tap to have a look.

Prohibited content and actions are detailed in our Community Guidelines, and we always want to encourage accurate and timely reporting. On a private messaging-focused app like Snapchat, reporting by the community is vital. We can’t help address an issue unless we know it’s happening. And, as our CDWB members note, reporting can help not just the target of a potential violation, but other possible victims of the same bad actor(s), as well. At Snap, we consider reporting to be a “community service.” Snapchatters can report in-app by simply pressing and holding on a piece of content or by filling out this form at our Support Site. 

Parents, caregivers, educators, and school officials can also leverage the public webform, and those using our Family Center suite of parental tools can report concerning accounts directly in the feature. We also recently launched this Educators’ Guide to Snapchat to further assist school officials in fostering healthy and supportive digital environments for their students. For more on how to report regardless of whether you have a Snapchat account, see this Fact Sheet.   

Promoting positive online experiences

Fostering safer, healthier, and more positive online experiences on Snapchat and across the tech ecosystem is a top priority at Snap, and nothing is more important than the safety and well-being of our community. Gaining a better understanding of the attitudes and behaviors of Snapchatters, as well as those of people who use other platforms, is critical to advancing that goal and is the very impetus behind our ongoing research.

Full results from our Year Three study, including our latest Digital Well-Being Index, will be released in conjunction with international Safer Internet Day 2025. We’re sharing some early findings in the back-to-school timeframe to remind families and school communities about the importance of staying safe online.

We look forward to sharing more in the months leading up to – and on – Safer Internet Day 2025, February 11. Until then, let’s head back to school promoting online safety, and be ready and willing to report anything that may be of concern – to Snapchat or any online service.

— Jacqueline F. Beauchere, Global Head of Platform Safety
