Out-Law / Your Daily Need-To-Know

Out-Law Analysis

Planning for age assurance: the challenges facing the introduction of Australia’s social media age restriction laws


Age assurance to protect children from harmful online content has come sharply into focus this year in Australia and globally. Recent developments and debate about Australia’s new social media age restriction laws, which will come into force in less than four months, highlight the challenges of effectively restricting access to social media for under-16s.

The new law, in part 4A of the Online Safety Act 2021, will require age restricted social media platforms to take ‘reasonable steps’ to prevent users who are under the age of 16 from having accounts, although those users will still be able to access the platforms without logging in.

These platforms can be fined up to A$49.5 million (approx. US$32.25 million) if they cannot demonstrate that they have taken ‘reasonable steps’ to put in place systems or methods to identify and block underage users.

However, when it comes to the specifics of how the ban will be implemented effectively, there remain unanswered questions. Julie Inman Grant, Australia’s eSafety Commissioner, said in a recent National Press Club speech that although she is confident of meeting the 10 December deadline for implementing the ban, “we may be building the plane a little bit as we're flying it”.

Preliminary age assurance findings 

Alongside managing privacy and security risks, assessing a user’s age will be a key challenge for platforms, as children commonly find workarounds, often providing fake dates of birth to register for social media accounts.

An independent software consultancy firm, commissioned by the Australian government to carry out an age assurance trial, has tested several age verification, estimation and inference technologies and their associated age assurance systems and methods. The preliminary findings from the trial claim that the technologies can be effective, privacy-preserving and robust when implemented appropriately; however, the preliminary report did not disclose details about the 53 technologies tested or their individual performance.

The final report on the trial, which we are told runs to 10 volumes, has been submitted to the government, but it is unclear when the report will be released publicly. 
Together with ongoing consultations by the eSafety Commissioner on the best way to implement the social media age restrictions, the outcomes of the trial will inform the development of guidance on what constitutes ‘reasonable steps’ under the new law. While the exact methods that platforms will use to verify age remain unclear, a multi-layered approach is expected, combining age assurance methods, rather than one mandated technology. 

Reasonable steps  

Statements by Grant about the developing nature of the requirements have been echoed by Anika Wells, the Minister for Communications, who described the forthcoming rules as “working” rules rather than a fixed solution. She also emphasised that all social media platforms should be collaborating with the eSafety Commissioner, adapting compliance measures to their own platforms and proprietary technology in preparation for the implementation of the laws.

Wells highlighted four core “reasonable steps” for platforms: 

  • deactivating existing underage accounts; 
  • preventing new underage account registrations; 
  • identifying and addressing possible workarounds; and 
  • rectifying any errors in age detection or enforcement.

In a recent press conference, Prime Minister Anthony Albanese dismissed concerns from social media platforms about the technical challenges of verifying users, observing that “they can use the capacity which we know that they have” and if they can “identify for political parties…an issue like childcare, identifying women between a particular age, in a particular seat, in a particular demographic, with particular postcodes, then they can help out here too”.

These comments by key government figures and the eSafety Commissioner suggest that platforms’ existing methods are expected to play a significant part in what will be considered reasonable steps, such as inferring user ages based on account behaviour and the age of the account.  Age inference will also be supplemented by additional age verification methods, which may be outlined in eSafety guidelines on ‘reasonable steps’.

While the eSafety Commissioner has said that the guidelines will be formulated before the law takes effect, there is currently no clear timeline on when that information will be released. As stressed by the government, platforms should proactively prepare for the deadline.    

Exempt platforms  

Another key aspect of the law is the set of rules that specify which platforms are not age restricted and will be exempt from the new obligations.

The recently tabled Online Safety (Age-Restricted Social Media Platforms) Rules 2025, informed by advice from the eSafety Commissioner, specifies the types of online services that the government will not require to be age restricted. These include online gaming, messaging apps and those whose primary purpose is to provide health and education services.

The rules did not include an exemption for YouTube, which the government had previously indicated would be exempt given its educational uses. This followed Grant’s formal advice to the minister on the draft rules, recommending the removal of YouTube from the exemption and citing evidence of harms caused to children on the platform, although YouTube Kids may still be exempt from the ban. Grant has recommended avoiding naming specific platforms in the rules due to the fast-changing nature of online services.

The Commissioner also stated in her National Press Club address that Australia’s bold approach to age assurance has been drawing strong international interest, with other countries now hotly debating these issues and “beating down our door” to learn how the reforms will be implemented. There is no doubt that she will also be watching the effect of the UK’s Online Safety Act, which recently commenced and requires adult sites as well as social platforms to verify that their users are over 18 or face tough penalties. Users there are being asked to provide ID, submit selfies for processing by age estimation software, or connect through other logins such as Google. The UK rollout has also seen an increase in the use of VPNs, alongside concerns about privacy and data security.

No one disagrees with the objective of improving children’s online safety, but the pathway to effectively implementing restrictions is not straightforward and what is possible will continue to evolve. We will continue to monitor developments and provide further updates on the requirements and impacts for businesses, users and platforms.
