For some time now there has been growing concern about how easily children can access social media and about the content they encounter once they are there. That concern has now translated into legislative change.
Amendments to the Online Safety Act 2021 (Cth), which commenced on 10 December 2025, require social media platforms to take reasonable steps to restrict access for users under the age of 16. The obligation sits with the platforms, not with young people and not with parents.
The intention behind the reform is relatively clear: lawmakers are signalling that responsibility for managing age-appropriate access should not rest solely with families. Platforms that design and operate these digital spaces are now expected to take a more active role in limiting exposure to harm.
This sits within a broader shift in digital regulation. Community expectations around children’s participation online have been changing for years and the law is gradually adjusting to reflect that reality. Whether the practical impact of these changes will match their policy intention is something that will become clearer over time.
The legislative framework
When the Online Safety Act 2021 (Cth) was introduced, the focus was largely on harmful content and the powers of the eSafety Commissioner to intervene once something had gone wrong. The recent amendments take a different approach by turning attention to who is accessing social media in the first place.
The change centres on users under 16, with platforms now required to take reasonable steps to restrict their access. The Act does not prescribe a single model or verification method, and that is likely intentional. What is reasonable will depend on the nature of the service, the risks it presents and how regulators interpret compliance in practice.
From a legal perspective, this is less about content moderation and more about structural responsibility. Rather than relying only on complaints and takedown mechanisms, the legislation expects platforms to consider age at the point of entry. How consistently that expectation can be enforced remains to be seen, but the direction of policy is clear.
Case study 1: Platform responsibility in digital spaces
A useful point of reference in this area is the High Court’s decision in Fairfax Media Publications Pty Ltd v Voller (2021) 273 CLR 346.
The issue before the Court was whether media organisations could be treated as publishers of comments posted by third parties on their public Facebook pages. The companies argued that they were not responsible for material written by others.
The High Court rejected that position, holding that by creating and maintaining the Facebook pages and inviting public interaction, the media companies had participated in publishing the comments. As a result, they were held liable for defamation for those third-party posts.
Although Voller did not arise under the Online Safety Act 2021 (Cth), it is significant for present purposes because it illustrates how responsibility can attach to those who operate and control digital spaces. The Court’s reasoning reflects an understanding that facilitating engagement is not a neutral act. Where an entity designs and maintains an online environment, it may bear legal consequences for what occurs within it. That approach sits comfortably alongside more recent legislative developments which place increasing emphasis on the role of platform operators in managing foreseeable risk.
Case study 2: Enforcement powers under the Online Safety Act
The enforcement powers that sit behind these amendments are not new. The Online Safety Act 2021 (Cth) already gives the eSafety Commissioner authority to issue removal notices, to require platforms to report on their compliance with the Basic Online Safety Expectations and to pursue civil penalties where statutory obligations are not met.
Those powers have been used in recent years, including in relation to harmful online content and the transparency reporting requirements imposed on major digital platforms. That track record demonstrates that the regulatory framework is not symbolic: the Commissioner has both investigative and enforcement tools available and has shown a willingness to use them.
Although earlier enforcement activity has not focused specifically on restricting access for users under 16, it provides context for how the new obligations may be approached. The age-based access requirements do not operate in isolation. They sit within an established compliance structure where failure to meet statutory standards can attract regulatory scrutiny and, where appropriate, civil penalty proceedings.
What the reforms do not do
There has been some misunderstanding about what these amendments actually do. They do not make it an offence for a young person to hold a social media account and they do not expose parents to fines if a child attempts to sign up.
The focus sits elsewhere: the obligation is directed at the platforms themselves. In practical terms, the law recognises that design choices and internal policies shape how children interact with digital spaces. Responsibility is therefore framed at the corporate level rather than at the level of individual families.
For many households, this will not feel like a dramatic shift, as conversations about screen time, supervision and online behaviour are already part of everyday parenting. The legislation does not displace that role. What it does attempt to ensure is that the environment children enter has been considered more carefully at the platform level.
Practical implications and evolving standards
One of the central questions arising from the reform is how platforms will demonstrate that they have taken reasonable steps.
Age verification mechanisms, identity checks, algorithmic monitoring and account functionality restrictions may all form part of compliance strategies.
Balancing effective age controls with privacy considerations and user rights will likely present practical challenges. As the amendments are applied, standards will continue to develop through regulatory interpretation and industry response.
Digital regulation often evolves through a combination of statutory wording and real-world implementation. It is therefore likely that further clarification will emerge over time as platforms adjust and compliance expectations are tested.
Looking ahead
Digital environments change quickly and regulatory frameworks must adapt in response. The amendments to the Online Safety Act 2021 (Cth) are a measured attempt to address growing concern about children’s safety by requiring platforms to take a more active role in managing age-appropriate access.
The practical effectiveness of these measures will depend on how they are implemented and overseen, as well as on how platforms respond to evolving expectations. As community standards shift, so too will the interpretation and application of legal requirements in this space.
We will continue to monitor how these reforms operate in practice and what they mean for young people, families and the digital services they use.
Aubrey Brown Lawyers advises businesses and organisations navigating privacy, digital regulation and compliance obligations. As online safety reforms continue to evolve, seeking advice early can help you understand your responsibilities and reduce risk.
To arrange an appointment with our team, call (02) 4350 3333 or visit aubreybrown.com.au.