Snap Values

Reflections on Australia's Social Media Minimum Age Law, Two Months In

February 2, 2026

Two months into the implementation of Australia's Social Media Minimum Age (SMMA) law, Snapchat remains fully committed to complying with the legislation and supporting its underlying goal of improving online safety for young Australians. As we've worked to implement these measures, we've gained important insights about the potential limitations of this law as it currently stands. We wanted to share where we are in our compliance efforts and what we believe needs to happen next to strengthen protection for young people online.

Our Compliance Efforts

Snapchat has taken significant steps to meet the requirements of the SMMA. As of the end of January 2026, we have locked or disabled over 415,000 Snapchat accounts in Australia belonging to users who either declared an age under 16 or who we believe to be under 16 based on our age detection technology. We continue to lock more accounts daily.

But we are finding that the law’s current implementation approach still leaves significant gaps that could undermine its goals.

First, there are real technical limits to how accurate and dependable age verification can be. The Australian government's own trial, published in 2025, found that available age estimation technology was only accurate to within 2-3 years on average. In practice, this means some young people under 16 may bypass the protections and be treated as older users with fewer teen-specific safeguards, while others over 16 may incorrectly lose access.

Second, the current approach lacks industry-wide protections, leaving gaps across hundreds of other apps that are either outside the scope of this law or whose status under it is unclear. Young people won't stop communicating when they lose access to regulated services. Over 75% of time spent on Snapchat in Australia is messaging with close friends and family. We're concerned that when young people are cut off from these communication tools, some may turn to alternative messaging services that are not being regulated — services that may be less well-known and offer fewer safety protections than Snapchat provides. While we don't yet have data to quantify this shift, it's a risk that deserves serious consideration as policymakers evaluate whether the law is achieving its intended outcomes.

A Solution: App Store-Level Age Verification

This is why we have been advocating for app store-level age verification as an additional safeguard to bolster the SMMA's implementation in a way that is less likely to have negative unintended consequences.

App store-level age verification would help address multiple risks and gaps. First, it would give in-scope apps more consistent age signals for each device, helping ensure that users under 16 are kept off the app while reducing the risk that users over 16 are incorrectly locked out. Second, it would strengthen safety across the entire digital ecosystem — not just for select regulated apps, but for all services. By creating a more universal foundation for age assurance, app store-level verification would help ensure that young people encounter appropriate protections no matter where they go online.

This approach could be a valuable worldwide standard. Rather than blanket age-based social media bans, app store-level age assurance could help the entire ecosystem protect young users more consistently and deliver developmentally appropriate experiences while allowing them to enjoy the benefits of social media.

Our Position on the SMMA

We want to be clear: we still don't believe an outright ban for those under 16 is the right approach. We understand the Australian government's objectives and share the goal of protecting young people online. But in the case of Snapchat — which is primarily a messaging app used by young people to stay connected with close friends and family — we do not believe that cutting teens off from these relationships makes them safer, happier, or otherwise better off. We fundamentally disagree that Snapchat is an in-scope age-restricted social media platform.

Despite our disagreement with the policy itself, we believe it's important to engage constructively and suggest ways to improve its implementation and reduce negative unintended effects. If Australia is going to pursue this approach, it should be done in a way that offers greater protection to young people with fewer downsides. Creating a centralized verification system at the app-store level would allow for more consistent protection and higher barriers to circumventing the law.

Our Ongoing Commitment to Safety

In the meantime, we continue building safety protections that will keep young Snapchatters safe in Australia and around the world. Snapchat includes safeguards like requiring bi-directional friend or contact book connections for one-to-one communication and maintaining 24/7 Trust & Safety teams, including a full-time team based in Sydney.

We also provide comprehensive parental tools through Family Center, which we recently expanded with new features that give parents deeper visibility into their teens' Snapchat use. Parents can now see how much time their teen spends on the platform each day and how that time breaks down across different features — whether chatting with friends, creating with the camera, or exploring content. When teens add new friends, parents can see how their teen might know them, including whether they have mutual friends or are in their contact book. These insights help parents have more informed conversations with their teens about their online lives and ensure they're connecting with people they know in real life.

Moving Forward

Snapchat will continue working with the Australian government to comply with the SMMA. But we also believe protecting young people online requires careful attention to the gaps in the current implementation, and continued work to close them.

We're advocating for app store-level age verification not because we support the U16 ban, but because if this policy is going to exist, it should be implemented consistently and in a way that maximizes the benefits while minimizing the risks — keeping teens under 16 off regulated apps while ensuring they're not simply pushed toward less safe alternatives. Young Australians deserve a comprehensive approach to online safety.
