Australia's first tranche of privacy reforms – a deep dive and why they matter
15 October 2024
"This bill is an important first step in the government's privacy reform agenda, but it will not be the last."
Attorney General, The Hon Mark Dreyfus KC MP (second reading speech)
Tranche 1 of the reforms was introduced to Parliament on 12 September 2024. The bill has already been referred for a Senate Committee inquiry, due to report back on 14 November 2024. Submissions closed on 11 October, and public hearings will be held on 22 October.
With an election likely to be called in the first half of 2025, and many of the tranche 2 reforms subject to consultation and circulation of draft provisions, the likelihood that we will see tranche 2 before the election is quickly diminishing. However, with a comparatively small set of changes in tranche 1, it remains far more likely that some or all of the tranche 1 reforms will be passed before an election in 2025. Consultations are expected to commence before the end of 2024.
For a helicopter view of the reforms, see Australian Privacy Reforms: A generational change inches closer.
The proposed reforms address 23 proposals that were "agreed" in the Government Response to the Privacy Act Review Report, out of a total of 116 proposals. Key areas covered by the reforms are enhanced regulatory powers, automated decision transparency, cybersecurity uplifts, code-making powers (beginning with a new Children's Online Privacy Code), simpler international data transfers, a new statutory tort for serious invasions of privacy, and criminal offences for doxxing (exposing data in a way that is menacing or harassing).
Most reforms will commence shortly after the bill is passed – with a 6-month delay for the new statutory tort, and a 24-month delay for automated decision-making transparency requirements.
This is only the first part of a broader privacy reform agenda. Tranche 2 will likely cover a much broader spectrum of issues, including a new “fair and reasonable” requirement, consent reforms, individual rights, the small business and employee exemptions, and assessing the privacy impact of high-risk activities.
Read more about the full suite of 116 recommendations from the Privacy Act Review Report in our earlier publication Australia's blueprint for privacy reform – what you need to do today.
The changes seem modest at first blush, but they set up a number of significant themes: a shift in the enforcement landscape, the first steps towards automated decision-making and artificial intelligence rules, and a deliberate signal that ‘reasonable steps’ to meet cybersecurity requirements will be scrutinised.
Privacy Commissioner Carly Kind has called out that, even before the reforms arrive, the expectations of the Office of the Australian Information Commissioner (OAIC) are higher, and the office will be much more enforcement-focused:
“... don't take your foot off the gas, because we're going to be looking to take a more enforcement-based approach to regulation in the interim, even notwithstanding those reforms.”
With a more “risk-based and enforcement and education-focused posture” from the OAIC, a new set of regulator powers, and new avenues for individuals to bring claims, expectations of organisations will only continue to increase as the reform journey continues.
Tranche 1 of the reforms appears to shift the regulatory emphasis – looking to change market behaviours by ensuring appropriate practices, procedures and systems are in place. This is focused on preventing harm to the individual, and can be contrasted with the traditional approach of reactive enforcement after harm (like a data breach) occurs.
Australian Privacy Principle 1.2 requires entities to take reasonable steps to implement practices, procedures and systems to comply with privacy obligations. Failure to do so can be considered a breach of the Australian Privacy Principles in its own right and may lead to a Commissioner-initiated investigation by the OAIC.
The obligations requiring practices, procedures and systems commenced over a decade ago – on 12 March 2014. Entities and agencies covered by the Privacy Act are assumed to have established their practices, procedures and systems over the past decade. The regulator's expectation will be that you already have demonstrable, defensible, effective and efficient practices, procedures and systems in place.
If designed and implemented properly, your practices, procedures and systems can provide you with a shield against your risks. If you do not have adequate practices, procedures and systems, however, they can become a sword to be used against you: in addition to a possible breach of APP 1.2, as discussed above, organisations may face additional regulatory exposure under other Australian Privacy Principles.
There's no single defined answer: your design needs to reflect the risk environment, the applicable laws, and the nature of the activities you undertake. Practices, procedures and systems need to be designed, reviewed and updated regularly to keep pace with each of these.
The simplest question to ask is: how ready are you to respond to a regulator or a privacy class action today? If you are not, you need to uplift your practices, procedures and systems to make sure you are.
What does good privacy management look like?

| Green flags | Red flags |
| --- | --- |
| Board has articulated and communicated risk appetite, and closed the loop with strong monitoring, reporting and accountability | Passive, reactive or uninformed board oversight |
| Risks and controls continually monitored, with documented results | Informal or undocumented processes; perpetual drafts, or policies never signed off |
| Realistic and clear understanding of current maturity, risks to the business, and organisational risk appetite | Champagne vision on a beer budget |
| A top-down, risk-informed approach that surveys the risk landscape and identifies key risks and critical data assets to focus on first | Trying to do everything at once, resulting in analysis paralysis |
| Clear thresholds, trigger points and mechanisms for escalation to legal, executive and board oversight | "Fire and forget" policies, without clear escalation |
| Strong visibility of key digital assets and data, with a focus on high-risk data assets, not just high-value data assets | Sole focus on operational value of data, with a limited understanding of key digital assets |
"To investigate potential privacy breaches in an increasingly complex digital landscape, the Information Commissioner requires modern investigative powers."
Attorney General, The Hon Mark Dreyfus KC MP (second reading speech)
As one of the key changes in the tranche 1 reforms, the OAIC's regulatory toolkit will be expanded, including by adopting a number of the standard regulatory tools available under the Regulatory Powers (Standard Provisions) Act 2014.
The OAIC's new toolkit will include a mix of new and enhanced powers. The broader regulatory toolkit allows the regulator to fulfil its ambition as a proactive and risk-focused regulator – it will look to pursue matters that will change practices and have a general deterrent effect across the economy and across markets.
Outside of the new penalty options (discussed below), these new changes include enhancements to existing tools (such as monitoring and investigation, determinations and court orders), as well as adding significant new tools such as regulator-driven code-making powers and the ability to conduct public inquiries into matters relating to privacy, at the direction of the Attorney-General.
The expansion of code-making powers signals a shift to a regulator-driven regime, where the OAIC will be able to identify areas of concern and develop a code at the direction of the Attorney-General, potentially requiring entities to comply with additional privacy obligations outside of the legislative process. New provisions for the development of a Children's Online Privacy Code have also been included. The development of this code will likely operate as a test of the OAIC's new powers, as the Government has promised specific funding for its development.
The OAIC currently has limited enforcement options – it can seek a large civil penalty for a serious or repeated interference with privacy (introduced in the 2022 privacy reforms), or smaller penalties and infringement notices for failures to provide information to the OAIC.
The OAIC can also undertake an investigation and give a determination if it identifies a breach of the Australian Privacy Principles. While entities may agree to pay an amount set out in a determination, if they do not, the OAIC must seek enforcement in the Federal Court.
The tranche 1 reforms will enhance the existing penalty regime and introduce new medium-level and lower-level penalties, as well as infringement notices that may be issued by the OAIC directly, without going to court (although entities will have the option to challenge a penalty notice in the Federal Court). This new penalty framework gives the OAIC significantly more options and brings a greater likelihood of smaller and moderate breaches seeing enforcement action.
For bodies corporate, the revised maximum penalties look like this:

| Contravention | Maximum penalty |
| --- | --- |
| Serious interference with privacy. The penalty currently applies to a "serious or repeated" interference. Under the bill, whether an interference is repeated or continuous will be one factor taken into account in determining whether an interference is "serious". | Civil penalty: the greater of $50 million, three times the value of any benefit obtained, or 30% of adjusted turnover in the relevant period |
| Interference with privacy. A new intermediate civil penalty for an interference that is not "serious". | Civil penalty: $3.3 million |
| Specified administrative failures. For breaches of specific privacy obligations, such as inadequate privacy policies, direct marketing obligations and statements about notifiable data breaches. | Civil penalty: $330,000. Infringement notice: $19,800 (or $66,000 for publicly listed companies)* |
| Failure to give information. An increase to existing penalties and infringement notices for failure to provide information to the OAIC. | Civil penalty: $99,000 (basic) or $495,000 (multiple contraventions). Infringement notice: $19,800 (or $66,000 for publicly listed companies)* |
* The figures reflect an increase to the value of a Commonwealth penalty unit (to $330) under the Crimes and Other Legislation Amendment (Omnibus No. 1) Bill 2024, which is currently awaiting assent.
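For readers who want to sanity-check the figures, each amount in the table is a simple multiple of the proposed $330 penalty unit. The sketch below is illustrative arithmetic only – the unit counts are derived by dividing the stated dollar amounts by the $330 unit value, not quoted from the bill.

```python
# Illustrative arithmetic only: reproduces the dollar figures in the table above
# from the proposed $330 Commonwealth penalty unit. The unit counts are derived
# by dividing the stated amounts by $330; they are not quoted from the bill.
PENALTY_UNIT = 330  # proposed value under the Omnibus No. 1 Bill 2024

derived_unit_counts = {
    "Interference with privacy (civil penalty)": 10_000,          # $3,300,000
    "Specified administrative failures (civil penalty)": 1_000,   # $330,000
    "Infringement notice (standard)": 60,                         # $19,800
    "Infringement notice (publicly listed company)": 200,         # $66,000
    "Failure to give information (basic civil penalty)": 300,     # $99,000
    "Failure to give information (multiple)": 1_500,              # $495,000
}

for label, units in derived_unit_counts.items():
    print(f"{label}: {units:,} units x ${PENALTY_UNIT} = ${units * PENALTY_UNIT:,}")
```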
The OAIC can bring court proceedings for civil penalties, and the changes include an adjustment to the “serious interference” threshold, consolidating the test into a single principle (instead of “serious or repeated”) that takes into account various factors. If the “serious interference” threshold is not met, the new mid-range “interference” penalty will be available.
For the civil penalty provisions, whether penalties will be multiplied (for example, in a data breach scenario) will depend on existing legal principles and the court’s discretion in the circumstances. For infringement notices, however, an express provision allows multiplication of the maximum penalty amount.
Infringement notices may be issued by the OAIC without going to court, for minor ‘administrative’ failures where failure to meet the requirement can be easily established.
These notices are intended to allow the OAIC to seek penalties against entities for minor contraventions, without the need to engage in litigation. Infringement notices can be issued for up to $66,000 for publicly listed companies (based on the proposed $330 Commonwealth Penalty Unit), but multiple failures may ‘stack’ on top of one another.
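To make the ‘stacking’ point concrete, the hypothetical below simply multiplies the maximum notice amount for a publicly listed company by an assumed number of separate failures – the figure of three failures is invented purely for illustration.

```python
# Hypothetical illustration of infringement notices 'stacking': the number of
# separate failures (three) is invented purely for this example.
PENALTY_UNIT = 330
MAX_NOTICE_LISTED = 200 * PENALTY_UNIT  # $66,000 per notice for a publicly listed company

separate_failures = 3  # assumed number of distinct administrative contraventions
total_exposure = separate_failures * MAX_NOTICE_LISTED
print(f"Potential exposure: ${total_exposure:,}")  # $198,000
```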
Examples of issues that might be dealt with by an infringement notice include the administrative failures noted above, such as inadequate privacy policies or deficiencies in statements about notifiable data breaches.
"The bill will provide individuals with transparency about the use of their personal information in automated decisions which significantly affect their interests."
Attorney General, The Hon Mark Dreyfus KC MP (second reading speech)
Under the new transparency requirement, organisations will need to identify decisions that significantly affect the rights or interests of an individual, and include information about those decisions in their privacy policies (see the table below).
The Privacy Commissioner has specifically called this requirement out: by pairing the automated decision-making requirement with the infringement notice power for administrative failures (which include content of the privacy policy), the OAIC may be able to bring quick action.
Acknowledging the complexity of introducing this level of transparency, the bill proposes a delay of 24 months before these new obligations commence.
Privacy policies must include information about decisions where:

| Element | Commentary |
| --- | --- |
| An entity has arranged for … | Extends to third-party systems and outsourced providers |
| … a computer program … | Broadly interpreted – includes pre-programmed rules-based processes, artificial intelligence, machine learning, spreadsheet automation, scoring or ratings, etc |
| … to make a decision, or to do a thing substantially and directly related to making a decision … | Applies to both wholly and partially automated decisions. For partially automated decisions, the computer program must be substantially and directly related: it needs to be a key factor in the decision, and directly connected to the decision. Applies even if there is a human in the loop actually making the decision |
| … that significantly affects rights or interests of an individual … | The concept is shaped by the circumstances – eg a child or a person experiencing vulnerability |
| … using personal information | A concept that may expand further with coming reforms, eg expansion of the definition of personal information |
The framework proposed in the Privacy Act Review Report referred to “substantially automated decisions which have a legal or similarly significant effect on an individual’s rights”, a construct similar to the European GDPR.
The term substantially automated was used to address the risk of "tokenistic" human involvement. The new approach is similar, requiring a computer program to be both substantially and directly related to the decision-making process, but it expands the relevant effect to the rights and interests of the individual, rather than using the “legal or similarly significant effect” formulation.
This new language is closer to the approach explored for use of computer-assisted decisions in the public sector, such as the Canadian Directive on Automated Decision-Making, which extends to systems that support human decision-makers, for example by providing assessments, scores or summaries.
The automated decision rules capture artificial intelligence that uses personal information, as well as other computer-assisted decisions. Beyond that, the proposed changes can be seen as one of the first cabs off the rank for broader artificial intelligence regulation.
The Privacy Commissioner has flagged that the OAIC will be releasing guidance on the use of artificial intelligence, the use of personal information in commercial off-the-shelf artificial intelligence products, and the use of personal information to train AI models.
"… we are moving into a new era in which our expectations of entities are higher ..."
Privacy Commissioner, Carly Kind (Notifiable Data Breaches Report: January to June 2024)
Reasonable steps to protect personal information, and to de-identify and destroy information no longer required, will now include technical and organisational measures.
The bill clarifies that reasonable steps under Australian Privacy Principles 11.1 and 11.2 (relating to security and the destruction or de-identification of personal information) include both technical and organisational measures – an uncontroversial position adopting language also used in the GDPR. However, the change underscores the Government’s increasing expectation that organisations have sufficient practices, procedures and systems in place to ensure cybersecurity and protect against data breaches.
Additional cyber security measures covered by the bill include an eligible data breach declaration regime, allowing the Attorney-General to make a declaration permitting information sharing to assist in data breach response (for example, sharing information between financial institutions to reduce fraud risks). This new mechanism is similar to a regime under the Telecommunications Regulations 2021, introduced in 2022 in the wake of major cyber incidents.
“This will … reduce costs for business when entering into contracts and agreements with overseas entities.”
Attorney General, The Hon Mark Dreyfus KC MP (second reading speech)
If an overseas recipient is subject to laws or a binding scheme prescribed by the regulations, then the Australian entity will not need to take reasonable steps to ensure the recipient complies with the Australian Privacy Principles, and will not be held accountable for the recipient's overseas acts or practices.
Under the Privacy Act, an Australian entity needs to take reasonable steps to ensure an overseas recipient of data complies with the Australian Privacy Principles, and can be held accountable for the overseas acts or practices of that recipient.
There are exceptions to these requirements – for example, where specific informed consent is obtained, or where the Australian entity reasonably believes the laws of a foreign country or binding scheme are equivalent to the Australian Privacy Principles.
Under the new bill, the regulations may prescribe a "whitelist" of such countries or binding schemes. Australian entities will no longer need to make their own assessments or carry the risk that their belief is not considered "reasonable", so long as the recipient is bound by a listed law or scheme, and the disclosure meets any relevant conditions. Those conditions could apply to particular entities or types of entities, or to types of information.
For countries or binding schemes not on the "whitelist", the existing regime will apply, and entities will still be able to use existing mechanisms (by making their own assessment, obtaining consents, or being accountable for compliance overseas).
“… providing people with the ability to seek redress through the courts for serious invasions of privacy without being limited to the scope of the Act.”
Individuals may take civil action for intrusion upon seclusion or misuse of personal information, in the circumstances set out in the bill. The bill also sets out the factors relevant to whether an invasion of privacy is “serious”, the defences available to a claim, and the remedies a court may award. Two features are worth highlighting:

- Not linked to Privacy Act compliance: an action can be brought whether or not the conduct is permitted under, or subject to, the Privacy Act.
- No need to establish damage: but harm or potential harm will be an important factor in determining whether the invasion was serious.
Misuse of information includes any collection, use or disclosure of information about an individual, and intrusion upon seclusion will extend, for example, to physically entering a private space, or listening to or recording private activities.
The Privacy Commissioner has expressed support for the statutory tort as a “different route” for individuals that does not rely on the complaints process (which requires significant resources from the OAIC). This gives the OAIC the opportunity to decide to leave certain matters to an individual bringing a serious invasion claim – selectively using its new powers in the bill to intervene in proceedings or assist the court as amicus curiae.
The bill also includes a number of other changes.
Authors: Geoff McGrath, Partner; Leon Franklin, Director - Consulting, Risk Advisory; Andrew Hilton, Expertise Counsel; Michael Turner, Executive, Risk Advisory; Thomas Suters, Graduate; and Michelle Lee, Paralegal.
We draw on Ashurst's combined legal and risk advisory expertise to help organisations keep pace with the evolving Privacy Act reforms and the actions they can take to position themselves for success.
Learn more about privacy reform in Australia.

This publication is a joint publication from Ashurst Australia and Ashurst Risk Advisory Pty Ltd, which are part of the Ashurst Group.
The Ashurst Group comprises Ashurst LLP, Ashurst Australia and their respective affiliates (including independent local partnerships, companies or other entities) which are authorised to use the name "Ashurst" or describe themselves as being affiliated with Ashurst. Some members of the Ashurst Group are limited liability entities.
The services provided by Ashurst Risk Advisory Pty Ltd do not constitute legal services or legal advice, and are not provided by Australian legal practitioners in that capacity. The laws and regulations which govern the provision of legal services in the relevant jurisdiction do not apply to the provision of non-legal services.
For more information about the Ashurst Group, which Ashurst Group entity operates in a particular country and the services offered, please visit www.ashurst.com
This material is current as at 15 October 2024 but does not take into account any developments to the law after that date. It is not intended to be a comprehensive review of all developments in the law and in practice, or to cover all aspects of those referred to, and does not constitute legal advice. The information provided is general in nature, and does not take into account and is not intended to apply to any specific issues or circumstances. Readers should take independent legal advice. No part of this publication may be reproduced by any process without prior written permission from Ashurst. While we use reasonable skill and care in the preparation of this material, we accept no liability for use of and reliance upon it by any person.