Meet Our Global Head of Platform Safety
Hello, Snapchat community! My name is Jacqueline Beauchere and I joined Snap last Fall as the company’s first Global Head of Platform Safety.
My role focuses on enhancing Snap’s overall approach to safety, including creating new programs and initiatives to help raise awareness of online risks; advising on internal policies, product tools and features; and listening to and engaging with external audiences – all to help support the safety and digital well-being of the Snapchat community.
Since my role involves helping safety advocates, parents, educators and other key stakeholders understand how Snapchat works, and soliciting their feedback, I thought it might be useful to share some of my initial learnings about the app; what surprised me; and some helpful tips, if you or someone close to you is an avid Snapchatter.
Initial Learnings – Snapchat and Safety
After more than 20 years working in online safety at Microsoft, I’ve seen significant change in the risk landscape. In the early 2000s, issues like spam and phishing highlighted the need for awareness-raising to help educate consumers and minimize socially engineered risks. The advent of social media platforms – and people’s ability to post publicly – increased the need for built-in safety features and content moderation to help minimize exposure to illegal and potentially more harmful content and activity.
Ten years ago, Snapchat came onto the scene. I knew the company and the app were “different,” but until I actually started working here, I didn’t realize just how different they are. From inception, Snapchat was designed to help people communicate with their real friends – meaning people they know “in real life” – rather than amassing large numbers of known (or unknown) followers. Snapchat is built around the camera. In fact, for those of us who aren’t first-generation Snapchatters (like me), the app’s very interface can be a bit mystifying because it opens directly to a camera and not a content feed like traditional social media platforms.
There’s far more that goes into Snapchat’s design than one might expect, and that considered approach stems from the tremendous value the company places on safety and privacy. Safety is part of the company’s DNA and is baked into its mission: to empower people to express themselves, live in the moment, learn about the world and have fun together. Unless people feel safe, they won’t be comfortable expressing themselves freely when connecting with friends.
The belief that technology should be built to reflect real-life human behaviors and dynamics is a driving force at Snap. It’s also vital from a safety perspective. For example, by default, not just anyone can contact you on Snapchat; two people need to affirmatively accept each other as friends before they can begin communicating directly, similar to the way friends interact in real life.
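For readers who like to see ideas in code, here’s a minimal Python sketch of that mutual-acceptance dynamic. It’s purely illustrative – the class, method names and data model are my own stand-ins, not Snap’s actual implementation.

```python
# Illustrative sketch only -- not Snap's code. It models the idea that
# direct communication requires BOTH people to accept each other as friends.

class FriendGraph:
    def __init__(self):
        self.accepted = set()  # (a, b) means "a accepted b as a friend"

    def accept(self, user, other):
        self.accepted.add((user, other))

    def are_friends(self, a, b):
        # Friendship requires acceptance in both directions.
        return (a, b) in self.accepted and (b, a) in self.accepted

    def can_message(self, sender, recipient):
        # By default, direct communication is gated on mutual friendship.
        return self.are_friends(sender, recipient)

graph = FriendGraph()
graph.accept("alice", "bob")
print(graph.can_message("alice", "bob"))  # False: Bob hasn't accepted Alice
graph.accept("bob", "alice")
print(graph.can_message("alice", "bob"))  # True: mutual acceptance
```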
Snap applies privacy-by-design principles when developing new features and was one of the first platforms to endorse and embrace safety-by-design, meaning safety is considered in the design phase of our features – no retrofitting or bolting on safety machinery after the fact. How a product or feature might be misused or abused from a safety perspective is considered at the earliest stages of development.
What Surprised Me – Context Behind Some Key Features
Given my time in online safety and my work across the industry, I’d heard some concerns about Snapchat. Below are a handful of examples and what I’ve learned over the past few months.
Content that Deletes by Default
Snapchat is probably best known for one of its earliest innovations: content that deletes by default. Like others, I made my own assumptions about this feature; as it turns out, it works differently than I’d first presumed. Moreover, it reflects the real-life-friends dynamic.
Snapchat’s approach is rooted in human-centered design. In real life, conversations between and among friends aren’t saved, transcribed or recorded in perpetuity. Most of us are more at ease and can be our most authentic selves when we know we won’t be judged for every word we say or every piece of content we create.
One misperception I’ve heard is that Snapchat’s delete-by-default approach makes it impossible to access evidence of illegal behavior for criminal investigations. This is incorrect. Snap can – and does – preserve content in an account when law enforcement sends us a lawful preservation request. For more information about how Snaps and Chats are deleted, see this article.
Strangers Finding Teens
A natural concern for any parent when it comes to online interactions is how strangers might find their teens. Again, Snapchat is designed for communications between and among real friends; it doesn’t facilitate connections with unfamiliar people the way some social media platforms do. Because the app was built for communicating with people we already know, by design, it’s difficult for strangers to find and contact specific individuals. Generally, people who are communicating on Snapchat have already accepted each other as friends. In addition, Snap has added protections to make it even more difficult for strangers to find minors, like banning public profiles for those under 18. Snapchat allows minors to surface in friend-suggestion lists (Quick Add) or Search results only if they have friends in common with the person searching.
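To make those protections concrete, here’s a small illustrative sketch in Python. The function names, the age check and the mutual-friend test are hypothetical stand-ins based on the rules described above, not Snap’s actual systems.

```python
# Illustrative sketch only -- hypothetical stand-ins, not Snap's code.
# It models two rules described above: no public profiles for under-18
# accounts, and minors surface in Quick Add / Search only when the two
# accounts share at least one friend.

def can_have_public_profile(age: int) -> bool:
    return age >= 18

def can_surface_for_viewer(candidate_age: int,
                           candidate_friends: set,
                           viewer_friends: set) -> bool:
    """Whether a candidate may appear in a viewer's Quick Add or Search."""
    if candidate_age < 18:
        return bool(candidate_friends & viewer_friends)  # mutual friend?
    return True

print(can_surface_for_viewer(15, {"dana"}, {"erin"}))  # False: no one in common
print(can_surface_for_viewer(15, {"dana"}, {"dana"}))  # True: a shared friend
```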
A newer tool we want parents and caregivers to be aware of is Friend Check-Up, which prompts Snapchatters to review their friend lists to confirm that those included are still people they want to be in contact with; anyone they no longer want to communicate with can easily be removed.
Snap Map and Location-Sharing
Along those same lines, I’ve heard concerns about the Snap Map – a personalized map that allows Snapchatters to share their location with friends, and to find locally relevant places and events, like restaurants and shows. By default, location settings on Snap Map are set to private (Ghost Mode) for all Snapchatters. Snapchatters have the option of sharing their location, but they can do so only with others whom they’ve already accepted as friends – and they can make location-sharing decisions specific to each friend. It’s not an “all-or-nothing” approach to sharing one’s location with friends. Another Snap Map plus for safety and privacy: If people haven’t used Snapchat for several hours, they’re no longer visible to their friends on the map.
Most importantly from a safety perspective, there’s no way for a Snapchatter to share their location on the Map with someone they’re not friends with, and Snapchatters have full control over which friends can see their location – or whether to share it at all.
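For the technically curious, here’s an illustrative Python sketch of the Snap Map behaviors described above: Ghost Mode by default, per-friend sharing decisions, and dropping off the Map after inactivity. The names are hypothetical, and the eight-hour window is my own assumed stand-in for “several hours.”

```python
# Illustrative sketch only -- hypothetical model, not Snap's code.
import time

VISIBILITY_WINDOW_SECS = 8 * 60 * 60  # "several hours" -- an assumed value

class MapSettings:
    def __init__(self):
        self.ghost_mode = True    # private by default for every account
        self.shared_with = set()  # per-friend allow list, not all-or-nothing
        self.last_active = None

    def open_app(self):
        self.last_active = time.time()

    def share_with(self, friend):
        self.ghost_mode = False
        self.shared_with.add(friend)

    def visible_to(self, viewer, friends):
        if self.ghost_mode or self.last_active is None:
            return False
        if viewer not in friends or viewer not in self.shared_with:
            return False  # never visible to non-friends or unshared friends
        # Snapchatters drop off the Map after hours of inactivity.
        return time.time() - self.last_active <= VISIBILITY_WINDOW_SECS

settings = MapSettings()
settings.open_app()
print(settings.visible_to("bob", {"bob"}))  # False: Ghost Mode is on
settings.share_with("bob")
print(settings.visible_to("bob", {"bob"}))  # True: friend, explicitly shared
print(settings.visible_to("eve", {"bob"}))  # False: not a friend at all
```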
Harmful Content
Early on, the company made a deliberate decision to treat private communications between friends, and public content available to wider audiences, differently. In the more public parts of Snapchat, where material is likely to be seen by a larger audience, content is curated or pre-moderated to prevent potentially harmful material from “going viral.” Two parts of Snapchat fall into this category: Discover, which includes content from vetted media publishers and content creators, and Spotlight, where Snapchatters share their own entertaining content with the larger community.
On Spotlight, all content is first reviewed with automated tools and then undergoes an extra layer of human moderation before it is eligible to be seen by more than a couple dozen people (the current threshold). This helps ensure the content complies with Snapchat’s policies and guidelines, and helps mitigate risks that automated moderation may have missed. By seeking to control virality, Snap lessens the appeal of publicly posting illegal or potentially harmful content, which, in turn, leads to significantly lower levels of exposure to hate speech, self-harm and violent extremist material, to name a few examples, as compared with other social media platforms.
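As a toy model of that two-layer review, consider the sketch below. The functions and the audience threshold are hypothetical stand-ins for what are, in reality, far more sophisticated systems; the point is simply that wide distribution is gated on both automated and human review.

```python
# Illustrative sketch only -- hypothetical functions, not Snap's pipeline.
AUDIENCE_CAP = 25  # "a couple dozen people" -- an assumed number

def automated_review(post: str) -> bool:
    """Stand-in for automated policy screening applied to all content."""
    return "prohibited" not in post

def human_review(post: str) -> bool:
    """Stand-in for a human moderator's judgment."""
    return "borderline" not in post

def max_audience(post: str):
    """Largest audience a post may reach under this toy model."""
    if not automated_review(post):
        return 0             # fails automated screening: removed
    if not human_review(post):
        return AUDIENCE_CAP  # stays capped at a small audience
    return None              # None = no cap; cleared for wide distribution

print(max_audience("fun snowboarding clip"))  # None: cleared both layers
print(max_audience("borderline clip"))        # 25: held to a small audience
```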
Exposure to Drugs
Snapchat is one of many online platforms that drug dealers are abusing globally and, if you’ve seen any media coverage of parents and family members who’ve lost children to fentanyl-laced counterfeit pills, you can appreciate how heartbreaking and terrifying this situation can be. We certainly do, and our hearts go out to those who’ve lost loved ones to this frightening epidemic.
Over the past year, Snap has been aggressively and comprehensively tackling the fentanyl and drug-related content issue in three key ways:
Developing and deploying new technology to detect drug-related activity on Snapchat to, in turn, identify and remove drug dealers who abuse the platform;
Reinforcing and taking steps to bolster our support for law enforcement investigations, so authorities can quickly bring perpetrators to justice; and
Raising awareness of the dangers of fentanyl with Snapchatters via public service announcements and educational content directly in the app. (You can learn more about all of these efforts here.)
We’re determined to make Snapchat a hostile environment for drug-related activity and will continue to expand on this work in the coming months. In the meantime, it’s important for parents, caregivers and young people to understand the pervasive threat of potentially fatal fake drugs that has spread across online platforms, and to talk with family and friends about the dangers and how to stay safe.
Snap has much planned on the safety and privacy fronts in 2022, including launching new research and safety features, as well as creating new resources and programs to inform and empower our community to adopt safer and healthier digital practices. Here’s to the start of a productive New Year, chock-full of learning, engagement, safety and fun!
- Jacqueline Beauchere, Global Head of Platform Safety