Update: Government response to the independent review of the Online Safety Act
The Online Safety Act 2021 (Cth) (the "OSA") introduced a regulatory framework requiring online service providers to regulate content and implement guardrails to strengthen online safety for Australians, with a particular emphasis on protecting children from online harms. The OSA establishes and gives effect to the Basic Online Safety Expectations for online service providers, enforces a takedown scheme for harmful content, regulates certain types of online content, gives legal effect to industry standards and codes that apply to different categories of online services, and is the instrument through which Australia’s age-based social media restrictions were introduced.
An independent review was a statutory requirement under the OSA and was undertaken by Delia Rickard PSM, who examined the OSA and its effectiveness in detail. The findings are set out in the Report of the Statutory Review of the Online Safety Act 2021, provided to the Government in October 2024 (the "Review"). The overarching recommendation of the Review is that the OSA should move to a more systemic and preventative framework through a single, overarching digital duty of care, which places the onus on online service providers to keep users safe. Reflecting this common thread, the Review made 67 recommendations for changes to the OSA framework.
After a lengthy period considering the Review and its recommendations, the Commonwealth Government released its response to the Review on 14 April 2026 (the "Response"). Of the 67 recommendations made in the Review, the Government intends to implement, in whole or in part, or further consider, 64, with an immediate focus on those reforms that incentivise harm prevention, including the implementation of a digital duty of care.
A digital duty of care
The key recommendation arising from the Review is the proposal to introduce a technology-neutral digital duty of care, which places an obligation on providers of online services to proactively and effectively manage the risks associated with the use and misuse of their services. The Government has indicated in its Response that it will amend the OSA to include a statutory obligation for service providers to take reasonable steps and exercise due diligence in maintaining their systems and processes to prevent harm resulting from the use of their services. This seeks to move regulation away from reactively addressing online harms (through takedown notices) towards preventing harm before it occurs. It would also mark a shift from the current model of relying on specific regulations and requirements for different types of services (e.g., social media platforms or app distribution services), although those are likely to remain despite the Review’s critique of the current definitions.
The Government rejected including fixed thresholds under any new digital duty of care, and instead noted that the duty will be risk-based, proportionate and will apply to all service providers where there is a risk of harm to Australians. This approach suggests that the majority of online service providers will be subject to the digital duty of care and will be expected to build it into their systems and services.
Whilst the Government supported the recommendations that repeated failure by service providers to remove content should be capable of constituting a breach of the duty of care, and that the maximum civil penalties under the OSA should be increased, the Response does not indicate which penalties or enforcement actions will apply to non-compliance with the duty of care; these remain to be determined.
Other key changes
Many of the recommendations for which the Government has indicated support relate to the proposed digital duty of care and place obligations on service providers to implement and maintain systems that provide protection to users. In particular, the Government has indicated it supports:
- Introducing a requirement for those entities posing the greatest risk to complete a risk assessment at least once every 12 months and whenever significant changes are made to their services. These entities should also be required to provide annually:
- a report detailing their risk assessments and risk mitigations, and how successful those mitigations have been; and
- a transparency report, a summarised version of which must be published on their website.
- Shifting the focus of service providers to the best interests of the child as a primary consideration in assessing and mitigating the risks arising from their services.
- Shortening the timeframe under which eSafety must wait to issue a removal notice upon receiving a complaint about cyber abuse or cyber bullying to 24 hours, and empowering eSafety to waive the waiting period altogether in circumstances of child cyberbullying and adult cyber abuse where no clear complaint mechanism on the service exists.
- Requiring service providers to provide an easily accessible and simple way for users to make complaints and to implement internal complaint handling processes.
- Increasing penalties, so that:
- the maximum civil penalty that a court can impose is increased to the greater of 5% of global annual turnover or $50 million; and
- the penalty for non-compliance with removal notices is increased to a maximum of $10 million.
- Empowering the regulator to:
- use enforceable undertakings or issue remedial directions to services in relation to all relevant penalty provisions;
- issue removal and link-deletion notices simultaneously under the Online Content Scheme; and
- issue link removal notices for all harmful content under removal schemes.
- Providing the regulator with stronger investigatory powers, including powers to investigate a service provider’s compliance with the duty of care and to investigate reposted material that was previously reported and taken down.
- Requiring service providers to retain certain records for a period of five years, including records of measures taken to comply with their obligations under the OSA and actions taken in response to the regulator’s requests and risk assessments.
Next steps
The timing for the introduction of draft legislation reflecting the recommendations the Government supports remains unclear, as does the extent of any further consultation in relation to those recommendations the Government "supports in principle" or has indicated require further consideration.
There is ongoing scrutiny of the effectiveness of the OSA’s age restricted social media platform regime, which is intended to keep those under the age of 16 off social media platforms, and litigation before the High Court concerning the constitutional validity of that regime.
Online safety remains a highly active policy area for the Government, with a range of complex issues under consideration, so further developments may emerge over an extended period.
Please contact a member of our team if you would like to discuss the steps your organisation can take in relation to the OSA.
White & Case means the international legal practice comprising White & Case LLP, a New York State registered limited liability partnership, White & Case LLP, a limited liability partnership incorporated under English law and all other affiliated partnerships, companies and entities.
This article is prepared for the general information of interested persons. It is not, and does not attempt to be, comprehensive in nature. Due to the general nature of its content, it should not be regarded as legal advice.
© 2026 White & Case LLP