SOCIAL MEDIA AGE BAN

The Australian Government’s new Social Media Minimum Age (SMMA) law, which prevents Australians aged 15 and under from holding accounts on most major social media services, comes into effect on December 10, 2025.
This briefing will explain what is and is not likely to be captured by the legislation, and what schools should consider doing in response.
The eSafety Commissioner has released an information hub with answers to many FAQs.
MyChristianSchool has also published a pastoral guide to navigating the social media age ban.
Commissioner Inman Grant has told Senators that her office will release resources, including seminars, webinars, how-to guides and conversation starters, from October 9 to help parents and teenagers prepare.
The social media age ban regulations specifically exclude online social media services whose sole or primary purpose is to support the education or health of users.
Legislation Overview
The SMMA provisions commence 12 months after Royal Assent of the late-2024 amendments, supported by consequential ministerial regulation, including the Online Safety (Age-Restricted Social Media Platforms) Rules 2025 and other online safety instruments relating to the powers of the eSafety Commissioner.
The Social Media Minimum Age (SMMA) law targets popular platforms whose sole or significant purpose is to enable online social interaction between two or more end-users, such as Facebook, Instagram, TikTok, Snapchat, X (formerly Twitter), and YouTube. There is a broad range of educational and other platforms that are likely to be excluded; these are discussed later in this briefing.
Starting from December 10, 2025, social media platforms will be required to implement robust age verification technologies before allowing users to create accounts.
- This law mandates that social media platforms take reasonable steps to prevent Australians under 16 years old from creating or maintaining social media accounts.
- According to the eSafety Commissioner, the conditions for age restriction are:
o the sole purpose, or a significant purpose, of the service is to enable online social interaction between two or more end-users
o the service allows end-users to link to, or interact with, some or all of the other end-users
o the service allows end-users to post material on the service.
o Online gaming and standalone messaging apps are among a number of types of services that have been excluded under the legislative rules. However, messaging services with social-media-style features that allow users to interact in ways other than messaging may fall within the age restrictions, as may messaging features accessed through an age-restricted social media account.
- These ‘age assurance’ technologies, as trialled in the Government’s Age Assurance Technology Trial, may include facial recognition software, analysis of behavioural data, or identity checks, including the recently legislated digital ID verification.
- All users, regardless of age, may be required to provide assurance of their age if the platform cannot otherwise confirm it, as part of the “reasonable steps” platforms must take to prevent circumvention of the ban.
- Each social media company will have the discretion to select the age verification methods that are most appropriate and effective for their platform.
- Additionally, platforms must identify and deactivate existing accounts belonging to users under the age of 16.
- An accessible review process will also be put in place to allow users who believe they have been mistakenly flagged as underage to challenge such decisions.
- The bill includes large penalties for the captured online platforms for non-compliance, with fines of up to AUD $50 million for social media companies that fail to enforce age restrictions effectively.
Carers and institutions that supply IT services are not regulated by the ban
The Bill explanatory memorandum states: “The onus is on platforms to introduce systems and processes that can be demonstrated to ensure that people under the minimum age cannot create and hold a social media account. It is not the intention that the Bill would punish a platform for individual instances where young people circumvent any reasonably appropriate measures put in place by the platform – however, a systemic failure to take action to limit such circumventions could give rise to a breach. As the onus is on platforms, there are no penalties for age-restricted users who may gain access to an age-restricted social media platform, or for their parents or carers.”
The government and the Office of the eSafety Commissioner have been clear that there will be no penalties for, or pressure on, carers or institutions who care for children over enforcement, regulation or compliance.
All the burden of compliance and penalties rests upon the online platforms that facilitate online social interaction.
Rules – Excluded Services
On 29 July 2025, the Minister for Communications made the Online Safety (Age-Restricted Social Media Platforms) Rules 2025 (the Rules), which specify the types of online services not covered by the SMMA. They are:
• Messaging, email, voice calling or video calling services
• Online games
• Services that primarily function to enable the sharing of information about products or services
• Professional networking and professional development services
• Education and health services
In detail, the exclusions relevant to schools and educational institutions are:
• services that have the sole or primary purpose of supporting the education of users
• services that have the sole or primary purpose of supporting the health of users
• services that have the sole or significant purpose of facilitating communication between educational institutions and students or student families
An online platform run by a school with the sole or significant purpose of educational content sharing between staff, parents and/or students (even where it is otherwise a social media platform) is intended to be excluded from the social media age ban, so long as the platform is used for that primary purpose.
To ensure that a school’s online platforms are compatible with these exclusions, the relevant platform procedures, guidelines and legal disclaimers may need to be updated to reflect the language used in the Online Safety Rules 2025.
This can give school boards, legal advisors and the parent community confidence that the school is abiding by the legislation to the best of its ability.
According to the Federal government, the Rules are intended to retain children’s access to services that are, “known to pose fewer risks of online harms, particularly arising from addiction, problematic use, unhealthy social comparisons, and exposure to content that is inappropriate for children”.
Under the Online Safety Act, the eSafety Commissioner, as the delegated regulator, does not have a formal role in declaring which services are age-restricted social media platforms.
However, eSafety will inform platforms it believes to be age-restricted, to help them understand their legal obligations under the Act.
On December 4, the Government announced that Reddit and Kick would be covered by the social media age ban, in addition to Facebook, Instagram, Snapchat, TikTok, YouTube and X (formerly Twitter).
These platforms must enforce age restrictions to exclude users under 16, reflecting a move to protect young people from harmful online pressures, including mental health risks and exposure to inappropriate content.
Communications Minister Anika Wells has stated, “I have met with major social media platforms in the past month so they understand there is no excuse for failure in implementing this law.”
While eight platforms were found by eSafety to require age restrictions, their “assessments will be ongoing” and platforms could be removed from the list, which will be “dynamic”, she said.
Discord, Roblox, Steam, YouTube Kids, Lego Play and WhatsApp have been informed by Australian authorities that they are not subject to age restrictions.
YouTube partially included in the ban
Although YouTube was initially exempt from the legislation, accounts held by YouTube users aged under 16 will be deactivated, and those users will be unable to access their accounts, from 10 December 2025.
Communications Minister Anika Wells declared in August 2025, “We now know that YouTube is where a significant number of children are experiencing harm online.”
Users under age 16 will be able to use YouTube in a logged-out “guest mode,” but all interactive features will be disabled. YouTube Kids is not affected, and teachers can continue to use YouTube for educational purposes in the classroom.
If the age assurance method performs correctly, YouTube users aged 15 or under will not be able to:
- Create an account
- Subscribe to channels
- Leave comments
- Receive personalised recommendations
- Upload content
The ban does not affect passive, supervised viewing of YouTube content in a learning context.
This means schools will need to review how guidelines and user instructions are communicated regarding student access on personal and shared devices.
Schools are not required to enforce the regulations on campus or off campus, but all users will need instruction in how the new social media ban is intended to function.
Risks associated with the age assurance technologies
The Australian Government’s Age Assurance Technology Trial (AAATT), an AUD $6.5 million project, aimed to evaluate the feasibility of age assurance technologies to support legislation banning under-16s from accessing social media.
According to psychologist Jasmine Fardouly, “It remains unclear if platforms can enforce the ban effectively and without privacy concerns. Self-reporting age has proven unreliable, with 84% of Australians aged 8–12 years currently using social media despite a minimum age of 13. Media and online gaming bans in countries such as China, South Korea, and France have been largely ineffective, as children bypass restrictions using virtual private networks and other methods.”
A critical technical review from the Freespeech Union has declared that the AAATT, as outlined in the trial’s interim report, failed to find any commercially viable system that meets real-world performance requirements.
Key concerns include:
1. Poor Real-World Recognition Performance: Some systems reportedly permit a high percentage of underage users (e.g., 30-45% of 10-13 year olds) to bypass age gates set at 16+, undermining the system’s protective purpose.
- According to an SMH report (9/10/25): “Studies by the United States’ National Institute of Standards and Technology show algorithms ‘fail significantly when attempting to differentiate minors’ of various ages. Even the best age-estimation software, Yoti, has an average error of one year. At worst, some software mistakes an age by 3.1 years on average, meaning 16-year-olds could be assessed anywhere from 13 to 19.”
2. Bias and Discrimination: Trialled systems performed inadequately for Aboriginal and First Nations people and people of colour, often overestimating their ages, which could lead to discriminatory impacts. Disability inclusion was not properly investigated, and facial estimation systems are likely to fail people with certain disabilities.
3. Privacy and Security Deficiencies: Privacy claims rely heavily on vendor self-declarations, with no rigorous penetration testing or independent security validations conducted. Data retention and handling lacked sufficient transparency.
- John Pane, chair of Electronic Frontiers Australia and a former member of the trial’s advisory board who resigned in August, noted that the trial had chosen some platforms that were, in his opinion, building “a surveillance-level response to the entire user population”.
4. Usability Concerns: Many users found the systems frustrating or difficult to use, with long wait times for results (averaging between 30 seconds and 3 minutes) that could discourage compliance. This poor usability compromises the practicality of these systems at scale.
5. Methodological and Ethical Failings: The trial lacked clear, transparent methodology and did not fully comply with Australia’s National Statement on Ethical Conduct in Human Research. The composition and governance of the trial’s ethics committee fell short of best practice standards.
6. Questionable Readiness and Deployment Viability: Of the systems judged most mature (TRL 7-9), none demonstrated consistent real-world reliability or fairness necessary to be considered a “reasonable step” for social media platforms or services to adopt under the law.
Additional critical voices highlight risks of significant penalties for service providers mandated to implement technologies that are not yet proven secure, effective, or inclusive.
There are warnings that reliance on the AAATT by regulators and policymakers could lead to unfair or disproportionate restrictions, with potential harms outweighing benefits, especially for vulnerable populations.
Despite government claims of feasibility, the Freespeech Union report warns the trial’s results are scientifically unreliable and should not guide policy. The group instead advocates for proven child safety measures like parental controls rather than premature adoption of faulty age assurance technologies.
Suggested responses for School Administrators and Principals
School leaders in independent and Christian schools will need to revise their user policies and guidelines in response to the legislation.
- Policy Adjustment and Communication: Schools will need to revise their internet and social media use policies, to provide clarity to users regarding the operation and effects of the minimum age limitations. Principals should lead communication efforts to ensure students, parents, and staff understand the regulation’s implications and the school’s role.
- Education and Awareness: Administrators should coordinate awareness programs that educate students about restrictions imposed on early social media use. Each school will need to decide if that includes integrating digital literacy and online safety education that aligns with the eSafety Commissioner’s guidelines.
- Student Support Measures: Given the law’s focus on protecting mental health, schools may consider enhancing student support services, monitoring wellbeing and intervening in cases where online and social media harms are identified or suspected. School leaders may work with counsellors and mental health experts to adapt their pastoral care frameworks accordingly.
Policy Makers’ Roles
For policy makers within these schools and education sectors, implementation may include the following areas.
1. Compliance Framework Development
Establish policies and frameworks that ensure the school’s digital environment is compatible with the legislation, including:
- identifying and refining education-based online social media platforms created, managed or owned by your school or educational institution, to ensure each is compatible with the stated exclusions in the Online Safety Rules 2025;
- communicating the legislation’s claimed benefits and potential limitations; and
- updating teacher procedures and guidelines, and the suggested resources and tools used on school-managed platforms or in online interactions where non-educational social media tools are used for educational purposes.
2. Collaboration with Parents and Community
Schools may choose to directly engage parents and the wider community in:
- communicating the rationale for, and the school’s handling of, students’ access to educational and non-educational online social media platforms, and differences in access for students;
- managing expectations for parents and staff; and
- addressing questions about social media usage at home and school.
3. Monitoring and Reporting
Schools should develop:
- monitoring mechanisms aligned with governmental guidelines;
- regular re-evaluation of policies to ensure they are effective and compliant; and
- appropriate mechanisms to report any issues (like compliance, potential liability, technology issues and limitations) related to underage social media access to relevant authorities or the eSafety Commissioner.
Practical Implementation Considerations
1. Technology and Age Verification
Though the law places the onus on social media platforms, schools may need to:
- support students and parents in navigating consent and verification processes imposed by the platforms themselves; and
- provide guidance on digital identity and privacy considerations as required by the legislation.
2. Support for Transition
Some students under 16 may need support regarding:
- losing access to existing social media accounts; and
- strategies to support these students in safely transitioning away from underage use without social isolation or distress.
3. Balancing Access and Protection
Especially in Christian and independent schools, where community values and child wellbeing are paramount, policies may need to emphasise protecting students’ spiritual, psychological, and social welfare while supporting their educational and social development.
The Social Media Age Ban legislation means that school administrators, principals, and policymakers will need to pre-emptively and collaboratively adapt policies, educate their communities, provide ongoing student support, and ensure compliance with both legal requirements and community values.
By doing so, schools will help safeguard student wellbeing while fostering digital literacy and responsible social media engagement aligned with the new national standards.
Finally, we encourage you to also read the pastoral guide to navigating the social media age ban from MyChristianSchool.
