From 10 December 2025, platforms classified as age-restricted social media platforms (“ARSMPs”) under Part 4A of the Online Safety Act 2021 (Cth) (“OSA”), as amended by the Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth), must be able to demonstrate that they are taking “reasonable steps” to prevent children under 16 in Australia from holding accounts on their services. Failure to do so exposes companies to investigation and enforcement by the eSafety Commissioner.
On 16 September 2025, the Commissioner issued the Social Media Minimum Age Regulatory Guidance (“Guidance”), setting out what “reasonable steps” platforms must take to prevent those aged under 16 from holding accounts. The Guidance makes it clear that businesses will be judged on whether they have adopted a proportionate, layered and accountable strategy, not on whether they rely on a single technical fix. For in-scope companies, this means time is running out to detect existing under-age users, review sign-up processes, deploy workable age-assurance measures and put governance frameworks in place before regulators begin to scrutinise compliance. Waiting until enforcement begins risks reputational damage, regulatory intervention and costly penalties under the Commissioner’s expanded powers.
While some platforms are exempt, such as those whose sole or primary purpose is messaging, professional networking, sharing information about products or services, education, health or online gaming, many other social media services remain squarely in scope. The restriction applies only to holding accounts — under-16s can still view publicly available content without logging in — but the compliance burden sits with platforms to evidence their approach.
For background on how age assurance technologies were tested and evaluated in Australia, see our article on the recent Age Assurance Technology Trial here.
Part 4A of the OSA builds on Australia’s existing online safety framework, requiring ARSMPs to take reasonable steps to prevent children under 16 from holding accounts on their services. Unlike the earlier reliance on self-declared age, the new regime requires platforms to use robust, verifiable and rights-respecting measures.
The Commissioner’s Guidance outlines the measures platforms may adopt and the factors against which “reasonable steps” will be assessed.
The Guidance also confirms that platforms will be expected to detect and deactivate or remove existing underage accounts, with care and clear communication to affected users. While platforms must notify account holders, the Guidance makes clear that they are not required to delete content already posted by under-16 users. Instead, platforms should consider offering options to preserve or manage that content. Taking this approach may help demonstrate the level of “care” regulators expect under the regime, and avoid unnecessary user disruption or reputational risk.
Crucially, this Guidance also sets out what reasonable steps are not intended to require.
The Guidance confirms that “reasonable steps” will be assessed holistically, considering the measures adopted, the way they are implemented and the risk of harm presented by the platform. Five key themes emerge:
a. Proportionality and Risk-Based Approach
Platforms are expected to calibrate their measures to the nature of their service and associated risks. For higher-risk features (e.g. posting functionality or livestreaming), regulators will expect stronger forms of age assurance. Lower-risk functions may be supported by lighter-touch measures, provided they are part of a coherent overall strategy.
b. Layered and Flexible Solutions
No single tool will suffice. Platforms must adopt layered approaches — combining methods such as biometric estimation, identity document verification, and contextual signals — to increase reliability and reduce circumvention. Flexibility to adapt to evolving threats and technologies is also part of the assessment.
c. Privacy and Data Protection
“Reasonable steps” must be consistent with Australian privacy law. Data minimisation, on-device processing, and strict retention limits are central expectations. Solutions that unnecessarily retain biometric or identity information will weigh against a platform in compliance assessments.
d. Governance and Accountability
The Guidance stresses that platforms must demonstrate robust governance frameworks underpinning their age-assurance measures, with processes that can be evidenced to the regulator.
e. Removal, Prevention and Review
Platforms will need to implement measures to prevent re-registration or circumvention by underage users whose accounts are removed. This may require steps to counter circumvention through VPNs and to address errors by age-assurance tools. At the same time, platforms must ensure that users incorrectly restricted (for example, those aged over 16 mis-identified by age-assurance tools) are provided with accessible review mechanisms to restore their accounts. Importantly, the eSafety Commissioner has signalled that expectations in this area will continue to evolve. ARSMPs cannot take a one-off compliance approach: they will be expected to continually monitor and update their measures as tools and technologies develop and as industry standards mature. This means building internal governance processes that keep pace with change, ensuring systems are regularly reviewed and that compliance strategies remain defensible over time.
The Guidance is clear that “reasonable steps” do not mean platforms must:
This balance is intentional: regulators are seeking accountable, evidence-based approaches that respect children’s rights and user privacy, rather than imposing a rigid compliance model.
The Guidance also provides clear signals about what regulators will look for in practice.
The eSafety Commissioner’s Guidance makes clear that “reasonable steps” are not a check-box exercise. Platforms must take a holistic, proportionate, and evidence-based approach that balances safety with privacy and inclusivity.
The clear call to action is that ARSMPs must now implement a proportionate, layered and evidence-based age assurance program that balances safety, privacy and inclusivity. This means moving quickly to embed age-assurance tools that respect privacy, build governance structures that withstand regulatory scrutiny, and establish processes for ongoing monitoring as technologies evolve.
With the 10 December 2025 deadline fast approaching, the question is not whether compliance solutions exist, but whether ARSMPs can deploy them responsibly and transparently in time to meet expectations and deliver safer online spaces for young people in Australia.