Australia is preparing to implement a world-first law that would prohibit children under the age of 16 from creating or using accounts on social media platforms. While initially targeted at mainstream social media giants such as Instagram, TikTok, and Snapchat, the scope of the legislation may now widen significantly to include platforms like Reddit, Twitch, Roblox, and even dating apps owned by companies like Match Group.
The law, due to come into effect in December, aims to curb the harms associated with children’s social media use, including exposure to harmful content, cyberbullying, and declining mental health. But as the government moves into the implementation phase, defining what qualifies as a “social media service” is proving more complex than expected.
At the heart of the issue is the legislation’s broad definition of social media. Any digital platform that allows users to interact socially—by sharing content, commenting, messaging, or forming communities—could potentially fall under the new rules. This puts platforms not traditionally thought of as social media, such as gaming and streaming services, under scrutiny.
Among the 16 additional companies being assessed are Reddit, known for its community-based forums; Twitch, a live-streaming service primarily used for gaming content; Roblox, a platform for creating and playing online games with strong social components; and Match Group, which operates dating apps such as Tinder and Hinge.
These services, while not designed specifically for young audiences, often have user bases that include minors. Roblox, in particular, is extremely popular with children and allows users to chat, form friend groups, and interact within virtual worlds. Similarly, Twitch allows live interaction through chat features, while Reddit hosts a wide range of user-generated communities—some of which are not moderated to child-safe standards.

The government has asked these platforms to self-assess and determine whether they fit the definition of a social media service under the law. If deemed covered, they will be required to take “reasonable steps” to prevent users under the age of 16 from creating accounts or accessing certain features. Failure to comply could result in heavy penalties, including fines of up to A$49.5 million.
The implications are vast. Platforms will need to implement robust age verification systems—technologies that are still being refined and often criticised for being intrusive or unreliable. Biometric checks, facial recognition, document uploads, and third-party verification services are all being considered, raising significant privacy concerns for both users and data regulators.
While the intent of the law—to protect children online—is widely supported, critics argue that such a sweeping approach could lead to unintended consequences. Some warn that children may simply migrate to less regulated or underground platforms, or lie about their age to access services, as they already do. Others argue that for some young users, social platforms offer vital communities, educational content, or mental health support—resources that could be lost under a blanket ban.
The gaming community is particularly concerned. Platforms like Roblox and Steam blur the line between entertainment and social interaction. While their primary purpose is gaming, they also include chat functions, friend lists, forums, and marketplaces, all of which could be seen as social features. Game developers and industry experts fear that overregulation could harm innovation and limit access to educational or creative tools that help children learn coding, design, or storytelling.
Dating platforms, too, are being examined, even though their terms of service already prohibit underage users. The government is questioning whether enough is being done to enforce age restrictions and whether more accountability is needed to ensure young people are not accessing adult-focused services.
Messaging apps like WhatsApp and iMessage may be exempt, as they are considered private communication tools rather than public or semi-public social networks. Similarly, platforms used primarily for education or health communication are not expected to be affected. However, the line is not always clear, and regulators will need to make judgment calls based on how a service is used in practice, not just how it is described in policy documents.
Industry representatives are calling for clearer guidance, warning that the ambiguity surrounding the law’s definitions will lead to confusion, over-compliance, or legal disputes. Smaller platforms, in particular, may struggle with the cost of implementing age-assurance technology or adapting their services to meet regulatory expectations.
As the December deadline approaches, platforms are expected to submit formal assessments of whether they believe the law applies to them. The eSafety Commissioner will then decide which services are subject to the new requirements, and which may qualify for exemptions. Final decisions are likely to be followed closely by both the public and international observers, as Australia sets a global precedent for online child safety legislation.
In the meantime, parents, educators, and child safety advocates are watching closely. Supporters of the law argue that it is a necessary intervention in an online landscape that has grown faster than the rules governing it. Opponents worry that such sweeping regulation risks overreach and could drive younger users into riskier or less transparent corners of the internet.
What is clear is that Australia is no longer leaving the regulation of children’s online activity to tech companies alone. The era of voluntary age limits may be coming to an end—and what replaces it could reshape the digital lives of millions of young users, in Australia and beyond.