In a move to safeguard children and reshape the regulatory landscape of digital access, Australia has enacted the Online Safety Amendment (Social Media Minimum Age) Act 2024, amending the Online Safety Act 2021. This legislation introduces a minimum age of 16 for social media use, supported by a comprehensive framework of compliance obligations, privacy protections, civil penalties, and regulatory oversight.
- Assented to: 10-12-2024
- Commenced: 11-12-2024
- Enforcement of penalties begins: 10-12-2025
The Foundation and Timeline:
- In 2023, rising public concern over the effects of social media on children, particularly around mental health, cyberbullying, and addictive platform design, prompted the Australian Government to initiate consultations on age-based restrictions. These discussions laid the groundwork for drafting the amendment to the Online Safety Act 2021.
- The proposed legislation, titled the Online Safety Amendment (Social Media Minimum Age) Bill 2024, was introduced in Parliament on 21-11-2024 and swiftly progressed through both Houses.
- It received Royal Assent on 10-12-2024 and commenced the following day, 11-12-2024. Recognizing the operational demands on platforms, the government included a delayed enforcement clause under Section 63E, allowing up to 12 months for compliance.
- On 29-7-2025, the Minister for Communications, Anika Wells, issued the Online Safety (Day of Effect of Social Media Minimum Age) Instrument 2025, specifying 10-12-2025 as the date Section 63D would take effect.
- The Minister also released legislative rules clarifying which types of services are excluded from the age restrictions.
The Under-16 Social Media Ban:
- The Act requires social media platforms to take reasonable steps to prevent children under the age of 16 from maintaining social media accounts.
- Section 5 introduces two critical definitions:
○ Age-restricted user: a child under 16
○ Age-restricted social media platform: a service primarily enabling online social interaction, posting, and linking between users
- Section 63C outlines the criteria for determining whether a platform qualifies as age-restricted. It excludes platforms used solely for business interaction.
- The Commissioner’s duties include:
○ Coordinating online safety efforts across government
○ Advising the Minister on platform classification
○ Formulating and promoting compliance guidelines
- Under Sections 63DA and 63DB, providers bear the evidential burden of proving that no reasonable steps could be taken to comply, an important legal safeguard that balances enforcement with operational feasibility.
The Penalties:
- Section 63D imposes a civil penalty on providers who fail to take reasonable steps to prevent age-restricted users from maintaining accounts.
Penalty: 30,000 penalty units
- Section 63DA prohibits the collection of certain types of personal information for age verification, unless explicitly permitted by legislative rules. This includes sensitive data such as government-issued identification.
Penalty: 30,000 penalty units
- Section 63DB further restricts the use of Digital ID services and government-issued identification materials for compliance purposes, unless alternative, reasonable verification methods are provided.
Penalty: 30,000 penalty units
- Section 63H mandates compliance with information notices issued under Section 63G. Failure to comply attracts a civil penalty.
Penalty: 500 penalty units
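The penalty quanta above can be translated into indicative dollar amounts. This is a rough illustration only: the Act itself specifies penalty units, not dollars. The conversion below assumes the Commonwealth penalty-unit value of A$330 (set under the Crimes Act 1914 and indexed periodically) and the standard five-fold maximum for bodies corporate under the Regulatory Powers (Standard Provisions) Act 2014; both figures are assumptions supplied for the example, not text from the Amendment Act.

```python
# Indicative conversion of penalty units to Australian dollars.
# ASSUMPTIONS (not stated in the Amendment Act itself):
#   - penalty-unit value of A$330 (indexed periodically)
#   - five-fold corporate multiplier under the Regulatory Powers
#     (Standard Provisions) Act 2014
PENALTY_UNIT_AUD = 330
CORPORATE_MULTIPLIER = 5

def max_penalty_aud(units: int, body_corporate: bool = True) -> int:
    """Maximum civil penalty in A$ for a given number of penalty units."""
    multiplier = CORPORATE_MULTIPLIER if body_corporate else 1
    return units * multiplier * PENALTY_UNIT_AUD

# Sections 63D, 63DA, 63DB: 30,000 penalty units
print(max_penalty_aud(30_000))   # 49500000 (A$49.5m) for a body corporate
# Section 63H: 500 penalty units
print(max_penalty_aud(500))      # 825000 (A$825k) for a body corporate
```

Under these assumptions, the headline maximum of roughly A$49.5 million for corporate breaches of Section 63D matches the figure widely reported when the ban was announced; actual exposure depends on the indexed unit value at the time a penalty is imposed.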
The Privacy Protection:
- Section 63F integrates privacy safeguards aligned with the Privacy Act 1988:
○ Improper use or disclosure of personal information collected for age verification is deemed an interference with privacy.
○ Entities must destroy such data after its intended use; failure to do so constitutes a privacy breach.
- Division 4 empowers the Commissioner to compel platforms to provide compliance-related information:
○ Section 63G: written notices may be issued to providers believed to hold relevant information
○ Section 63H: mandates compliance, with penalties for failure to respond
- These provisions ensure transparency and enable proactive regulatory oversight.
The Public Disclosure:
- Sections 63J and 63K authorize the Commissioner or Information Commissioner to issue public statements if a platform violates age-restriction or privacy rules.
- These statements will be published on their respective websites.
The Additional Amendments:
- The Amendment Act strengthens regulatory enforcement through changes to Sections 143 and 146 of the Online Safety Act 2021, increasing maximum penalties for non-compliance with industry codes and standards from 500 to 30,000 penalty units.
- Certain procedural protections previously available under the Regulatory Powers (Standard Provisions) Act 2014 have been removed, streamlining the regulator’s ability to impose penalties.
- Section 222A grants the Information Commissioner immunity from liability for actions taken in good faith under the Act, reinforcing the independence and integrity of regulatory oversight.
- To ensure accountability and adaptability, Section 239B mandates an independent review of Part 4A within 2 years of its enforcement date. The review will evaluate the adequacy of privacy protections and other operational aspects, with a written report to be tabled in both Houses of Parliament.
- Schedule 2 of the Age Discrimination Act 2004 has been updated to include references to Part 4A, ensuring that the new social media age framework aligns with broader anti-discrimination legal standards.

