Senate Congressional Testimony — Our Approach to Safety, Privacy and Wellbeing

October 26, 2021

Today, our VP of Global Public Policy, Jennifer Stout, joined other tech platforms in testifying before the Senate Commerce Committee’s Subcommittee on Consumer Protection, Product Safety, and Data Security about Snap’s approach to protecting young people on our platform.

We were grateful for the opportunity to explain to the Subcommittee how we intentionally built Snapchat differently from traditional social media platforms, how we work to build safety and privacy directly into the design of our platform and products, and where we need to continue to improve to better protect the wellbeing of our community. We have always believed that we have a moral responsibility to put our community’s interests first — and that all tech companies must take responsibility and actively protect the communities they serve.

We welcome the Subcommittee’s ongoing efforts to investigate these critical issues — and you can read Jennifer’s full opening statement below.

****

Testimony of Jennifer Stout, Vice President of Global Public Policy, Snap Inc.

Introduction

Chairman Blumenthal, Ranking Member Blackburn, and members of the Subcommittee, thank you for the opportunity to appear before you today. My name is Jennifer Stout and I serve as the Vice President of Global Public Policy at Snap Inc., the parent company of Snapchat. It’s an honor and privilege to be back in the Senate 23 years after first getting my start in public service as a Senate staffer, this time in a much different capacity — to speak about Snap’s approach to privacy and safety, especially as it relates to our youngest community members. I have been in this role for nearly five years, after spending almost two decades in public service, more than half of which was spent in Congress. I have tremendous respect for this institution and the work you and your staff are doing to ensure that tech platforms give our youth safe and healthy online experiences.

To understand Snap’s approach to protecting young people on our platform, it’s helpful to start at the beginning. Snapchat’s founders were part of the first generation to grow up with social media. Like many of their peers, they saw that while social media was capable of making a positive impact, it also had certain features that negatively impacted their friendships. These platforms encouraged people to publicly broadcast their thoughts and feelings, permanently. Our founders saw how people were constantly measuring themselves against others through “likes” and comments, trying to present a version of themselves through perfectly curated images, and carefully scripting their content because of social pressure. Social media also evolved to feature an endless feed of unvetted content, exposing people to a flood of viral, misleading, and harmful content. 

Snapchat was built as an antidote to social media. In fact, we describe ourselves as a camera company. Snapchat’s architecture was intentionally designed to empower people to express a full range of experiences and emotions with their real friends, not just the pretty and perfect moments. In the formative years of our company, our team pioneered three major innovations to prioritize online privacy and safety.

First, we decided to have Snapchat open to a camera instead of a feed of content. This created a blank canvas for friends to visually communicate with each other in a way that is more immersive and creative than sending text messages. 

Second, we embraced strong privacy principles, data minimization, and the idea of ephemerality, making images delete-by-default. This allowed people to genuinely express themselves in the same way they would if they were just hanging out at a park with their friends. Social media may have normalized having a permanent record of conversations online, but in real life, friends don’t break out their tape recorder to document every single conversation for public consumption or permanent retention. 

Third, we focused on connecting people who were already friends in real life by requiring that, by default, both Snapchatters opt-in to being friends in order to communicate. Because in real life, friendships are mutual. It’s not one person following the other, or random strangers entering our lives without permission or invitation. 

A Responsible Evolution

Since those early days, we have worked to continue evolving responsibly. Understanding the potential negative effects of social media, we made proactive choices to ensure that all of our future products reflected those early values. 

We didn’t need to reinvent the wheel to do that. Our team was able to learn from history when confronting the challenges posed by new technology. As Snapchat evolved over time, we were influenced by existing regulatory frameworks that govern broadcast and telecommunications when developing the parts of our app where users could share content that has the potential to reach a large audience. For instance, when you talk to your friends on the phone, you have a high expectation of privacy, whereas if you are a public broadcaster with the potential to influence the minds and opinions of many, you are subject to different standards and regulatory requirements. 

That dichotomy helped us to develop rules for the more public portions of Snapchat that are inspired by broadcast regulations. These rules protect our audience and differentiate us from other platforms. For example, Discover, our closed content platform where Snapchatters get their news and entertainment, exclusively features content from either professional media publishers who partner with us, or from artists, creators, and athletes who we choose to work with. All of these content providers have to abide by our Community Guidelines, which apply to all of the content on our platform. But Discover publisher partners also must abide by our Publisher Guidelines, which require that content be fact-checked and accurate, and age-gated when appropriate. And for individual creators featured in Discover, our human moderation teams review their Stories before we allow them to be promoted on the platform. While we use algorithms to feature content based on individual interests, they are applied to a limited and vetted pool of content, which is a different approach from other platforms.

On Spotlight, where creators can submit creative and entertaining videos to share with the broader Snapchat community, all content is first reviewed automatically by artificial intelligence before gaining any distribution, and then human-reviewed and moderated before it can be viewed by more than 25 people. This layered review reduces the risk of spreading misinformation, hate speech, or other potentially harmful content.
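To make the layered-review idea concrete, here is a minimal Python sketch of a tiered distribution gate of this kind. It is purely illustrative: the 25-viewer threshold comes from the description above, but every name, type, and structural detail is hypothetical and does not describe Snap’s actual systems.

```python
# Illustrative sketch only: a tiered distribution gate in the spirit of the
# Spotlight flow described above. The 25-viewer cap comes from the testimony;
# all names and structure here are hypothetical.

from dataclasses import dataclass

HUMAN_REVIEW_THRESHOLD = 25  # viewers allowed before human review is required


@dataclass
class Submission:
    video_id: str
    passed_auto_review: bool = False   # automated (machine learning) screening
    passed_human_review: bool = False  # human moderator decision


def max_audience(sub: Submission) -> float:
    """Largest audience the submission may reach at its current review stage."""
    if not sub.passed_auto_review:
        return 0                       # no distribution until automated screening passes
    if not sub.passed_human_review:
        return HUMAN_REVIEW_THRESHOLD  # capped while human moderation is pending
    return float("inf")                # cleared for broad distribution


# Example: a clip that passed automated screening is capped at 25 viewers
# until a human moderator also approves it.
clip = Submission("demo-clip", passed_auto_review=True)
assert max_audience(clip) == 25
clip.passed_human_review = True
assert max_audience(clip) == float("inf")
```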

We don’t always get it right the first time, which is why we redesign parts of Snapchat when they aren’t living up to our values. That’s what happened in 2017, when we discovered that one of our products, Stories, was making Snapchatters feel like they had to compete with celebrities and influencers for attention because content from celebrities and friends was combined in the same user interface. As a result of that observation, we decided to separate “social” content created by friends from “media” content created by celebrities to help reduce social comparison on our platform. This redesign negatively impacted our user growth in the short term, but it was the right thing to do for our community.

Protecting Young People on Snapchat

Our mission — to empower people to express themselves, live in the moment, learn about the world, and have fun together — informed Snapchat’s fundamental architecture. Adhering to this mission has enabled us to create a platform that reflects human nature and fosters real friendships. It continues to influence our design processes and principles, our policies and practices, and the resources and tools we provide to our community. And it undergirds our constant efforts to improve how we address the inherent risks and challenges associated with serving a large online community. 

A huge part of living up to our mission has been building and maintaining trust with our community and partners, as well as parents, lawmakers, and safety experts. Those relationships have been built through the deliberate, consistent decisions we have made to put privacy and safety at the heart of our product design process. 

For example, we have adopted responsible design principles that consider the privacy and safety of new products and features right from the beginning of the development process. And we've made those principles come to life through rigorous processes. Every new feature in Snapchat goes through a defined privacy and safety review, conducted by teams that span Snap — including designers, data scientists, engineers, product managers, product counsel, policy leads, and privacy engineers — long before it sees the light of day.

While more than 80% of our community in the United States is 18 or older, we have devoted a tremendous amount of time and resources to protecting teenagers. We’ve made thoughtful and intentional choices to apply additional privacy and safety policies and design principles to help keep teenagers safe. That includes:

  • Taking into account the unique sensitivities and considerations of minors when we design products. That’s why we intentionally make it harder for strangers to find minors by banning public profiles for people under 18 and are rolling out a feature to limit the discoverability of minors in Quick Add (friend suggestions). And why we have long deployed age-gating tools to prevent minors from viewing age-regulated content and ads. 

  • Empowering Snapchatters by providing consistent and easy-to-use controls like turning location sharing off by default and offering streamlined in-app reporting for users to report concerning content or behaviors to our Trust and Safety teams. Once reported, most content is actioned in under 2 hours to minimize the potential for harm. 

  • Working to develop tools that will give parents more oversight without sacrificing privacy — including plans to provide parents the ability to view their teen's friends, manage their privacy and location settings, and see who they're talking to.

  • Investing in educational programs and initiatives that support the safety and mental health of our community — like Friend Check Up and Here for You. Friend Check Up prompts Snapchatters to review who they are friends with and make sure the list is made up of people they know and still want to be connected with. Here for You provides support to users who may be experiencing mental health or emotional crises by providing tools and resources from experts.

  • Preventing underage use. We make no effort — and have no plans — to market to children, and individuals under the age of 13 are not permitted to create Snapchat accounts. When registering for an account, individuals are required to provide their date of birth, and registration fails if the date entered indicates an age under 13. We have also implemented a new safeguard that prevents Snapchat users between 13 and 17 with existing accounts from updating their birthday to an age of 18 or older. Specifically, if a minor attempts to change their year of birth so that they would appear to be 18 or older, we will prevent the change as a way to ensure that users are not accessing age-inappropriate content within Snapchat. (A simplified sketch of these checks follows this list.)
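As referenced in the last bullet, the two age-gating rules can be expressed as simple validation logic. The Python sketch below is purely illustrative: the 13 and 18 thresholds and the two rules come from the description above, while the function names, the age arithmetic, and the under-13 check on birthday updates are assumptions, not Snap’s actual implementation.

```python
# Illustrative sketch only: the two age-gating rules described above, written
# as plain validation logic. Snap's real system is not public; every name and
# detail here is hypothetical.

from datetime import date
from typing import Optional

MIN_AGE = 13    # minimum age to hold a Snapchat account
ADULT_AGE = 18  # threshold for age-regulated content


def age_on(birth_date: date, today: date) -> int:
    """Whole years elapsed between birth_date and today."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)


def can_register(birth_date: date, today: Optional[date] = None) -> bool:
    """Registration fails for anyone whose stated age is under 13."""
    today = today or date.today()
    return age_on(birth_date, today) >= MIN_AGE


def can_update_birthday(current_birth_date: date, new_birth_date: date,
                        today: Optional[date] = None) -> bool:
    """A 13-17-year-old account may not change its birthday to read 18 or older."""
    today = today or date.today()
    current_age = age_on(current_birth_date, today)
    new_age = age_on(new_birth_date, today)
    if new_age < MIN_AGE:
        return False  # assumed: no change may drop an account below the minimum age
    if MIN_AGE <= current_age < ADULT_AGE and new_age >= ADULT_AGE:
        return False  # minors may not age themselves into age-regulated content
    return True
```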

Conclusion and Looking Ahead

We're always striving for new ways to keep our community safe, and we have more work left to do. We know that online safety is a shared responsibility, spanning a host of sectors and actors. We are committed to doing our part in concert with safety partners including our Safety Advisory Board, technology industry peers, government, and civil society. From technology-focused and awareness-raising initiatives, to research and best practice sharing, we are actively engaged with organizations dedicated to protecting minors online. We also know that there are many complex problems and technical challenges across our industry, including age verification of minors, and we remain committed to working with partners and policymakers to identify robust industry-wide solutions.         

Protecting the wellbeing of Snapchatters is something we approach with both humility and steadfast determination. Over 500 million people around the world use Snapchat every month, and while 95% of our users say Snapchat makes them feel happy, we have a moral responsibility to take into account their best interests in everything we do. That’s especially true as we innovate with augmented reality — which has the potential to positively contribute to the way we work, shop, learn, and communicate. We will apply those same founding values and principles as we continue to experiment with new technologies like the next generation of augmented reality.

As we look to the future, computing and technology will become increasingly integrated into our daily lives. We believe that regulation is necessary, but given the speed at which technology develops and the rate at which regulation can be implemented, regulation alone can’t get the job done. Technology companies must take responsibility and actively protect the communities they serve.

If they don't, the government must act swiftly to hold them accountable. We fully support the Subcommittee’s efforts to investigate these issues and welcome a collaborative approach to problem solving that keeps our society safe. 

Thank you again for the opportunity to appear before you today and discuss these critical issues. I look forward to answering your questions.
