Data Bytes 54

    Welcome back to the first edition of Data Bytes for 2025. We are fresh from our new year break and looking forward to keeping you updated throughout 2025 with key data law updates across the UK and Europe.

    Particularly noteworthy updates in this bumper "two for one" issue, covering December and January, include the UK Information Commissioner's Office's (the "ICO") announcement that it will be reviewing cookie usage in the top 1,000 UK websites, an expansion of its previous reviews of the top 100 and 200 UK websites. The ICO also noted that its review would expand beyond websites to assess the compliance of apps and connected devices such as TVs.

    In our European section, you will find summaries of two key publications from the EDPB: (i) its consultation on new guidelines on the pseudonymisation of personal data under the General Data Protection Regulation (GDPR); and (ii) its long awaited opinion on data protection in artificial intelligence (AI) models.

    Keep scrolling to our spotlight section, where you will find our predictions on where cyber, data, AI and boardroom priorities will overlap in 2025. A timely read, as on 31 January the UK government published its response to the call for views on a code of practice for cyber governance. The proposed Cyber Governance Code of Practice sets out how company boards and senior leaders can build resilience to a wide range of cyber risks across their organisation. Our Ashurst legal and risk advisory teams consider why cyber readiness and AI awareness and governance should be on board agendas over the coming 12 months.

    UK Developments

    1) ICO publishes outcomes report on its generative AI consultation series

    On 12 December 2024, the ICO published the outcomes report to its five-part consultation series on the development and use of generative AI models, which covered topics relating to lawful basis, controllership, accuracy and individual rights in the context of generative AI. 

    In the report, the ICO refines its positions on: 

    • Legitimate interests lawful basis for web scraping to train generative AI models - acknowledging consultation responses noting that data collection methods other than web scraping exist, the ICO confirms that it is for model developers to demonstrate why web scraping is necessary and to evidence why alternatives are not suitable.
    • Engineering individual rights into generative AI models - the ICO confirms that organisations acting as controllers must design and build systems that implement the data protection principles effectively and integrate the necessary safeguards into the processing.

    The ICO also highlights an overall need for better transparency by generative AI developers and warns that it will now focus on “organisations that are not doing enough”.

    2) AI Action Plan

    On 12 January 2025, the Prime Minister set out the blueprint to unleash AI across the UK by accepting all 50 recommendations set out in the AI Opportunities Action Plan.

    The blueprint is separated into three main pillars: laying the foundations for AI to flourish in the UK; boosting adoption across public and private sectors; and keeping [the UK] ahead of the pack.

    The plan makes a series of recommendations with respect to the regulation of AI, including:

    • Actions of the Government: The plan re-emphasises the need for "well designed and implemented" regulation of AI that is complemented by effective assurance tools. In particular, the government is to work with regulators to accelerate AI in priority sectors and implement pro-innovation initiatives, such as regulatory sandboxes.
    • Actions of Regulators: The Government will look into requiring regulators to publish, on an annual basis, how they have enabled AI-driven innovation and growth in their sector.
    • Changes to Legislation: The Government will also look into amending the current regime with respect to text and data mining. The Government has recognised that cultivating strong data sets is imperative to promoting AI development, and will look to amend the current regime to help facilitate the training of AI models.

    The timeline for delivery of each recommendation varies, with the Government expected to outline its plan of action for most recommendations by the end of 2025.

    3) Prismall v Google UK decision confirms the challenges of representative actions

    On 11 December 2024, the Court of Appeal handed down its judgment in Prismall v Google UK. The case concerned a representative action brought in the High Court by Mr Prismall claiming damages, on behalf of 1.6 million claimants, for the misuse of private information relating to medical records.

    The High Court concluded that it was not possible to establish a reasonable expectation of privacy for all class members, and the Court of Appeal agreed. It was specifically held that the claimants could not have reasonably expected a right to privacy in respect of their personal data: the data was held securely by Google, was not accessed by anyone, and the information concerned was already in the public domain.

    The decision reinforces the challenges of bringing successful representative actions in the UK courts in relation to the misuse of data. 

    4) ICO launches 2025 strategy to tackle cookie compliance and online tracking

    The ICO announced on 23 January 2025 its 2025 strategy to ensure that organisations using online tracking give people clear choices and confidence in how their information is used. As part of this strategy, the ICO will review cookie usage in the top 1,000 UK websites, an expansion of its previous reviews of the top 100 and 200 UK websites. The ICO also noted that its review would expand beyond websites to assess the compliance of apps and connected devices such as TVs.

    Organisations are now on notice as to the ICO's expectations concerning user choice and transparency in connection with cookies. We are yet to see formal enforcement action beyond reprimands in relation to cookie compliance, but this could change during the course of 2025.

    At the same time, the ICO released its draft guidance for organisations considering implementing consent or pay models. The guidance clarifies how organisations can deploy 'consent or pay' models to give users meaningful control while supporting their economic viability, and includes a set of factors against which organisations can assess their models to demonstrate that people can freely give their consent.

    5) UK Government opens mandatory ransomware reporting consultation 

    On 14 January 2025, the Government launched a consultation on measures to reduce the threat posed by ransomware, with three proposed groups of measures:

    1. a ban on ransomware payments for all public sector bodies (including local government) and owners and operators of critical national infrastructure that are regulated or that have competent authorities;
    2. a ransomware payment prevention regime, which would require any victim of ransomware (organisations and/or individuals not covered by the proposed ban set out in Proposal 1) to engage with the authorities and report their intention to make a ransomware payment before paying any money; and
    3. a ransomware incident reporting regime, potentially including a threshold-based mandatory reporting requirement for suspected victims of ransomware.

    Paying a ransom is not illegal in the UK, although those paying ransoms are at risk of breaching proceeds of crime, terrorism and anti-money laundering legislation, depending on the traceability of the payment. The Government has discouraged the payment of ransoms for some time, and other governments have recently introduced legislation to this effect. Notably, the Australian government passed the Cyber Security Act on 27 November 2024, which establishes a mandatory requirement to report ransomware payments within 72 hours of making a payment or becoming aware that a payment has been made.

    The consultation closes on 8 April 2025.

    6) ICO responds to Google's policy change on using ad tech fingerprinting 

    On 19 December 2024, the ICO announced its response to Google's decision to change its policy, as of 16 February 2025, to no longer prohibit organisations using Google's advertising products from deploying fingerprinting technologies. Fingerprinting is a form of passive online tracking that aims to create a profile of a user's online activity by collecting information about system configurations, without storing any data on the user's device.
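    To make the mechanics concrete, here is a minimal, purely illustrative sketch of how a crude fingerprint could be derived server-side from configuration a browser reveals in its request headers, with nothing stored on the device. The attribute list and values are assumptions chosen for the example; real fingerprinting scripts typically also probe signals such as screen size, installed fonts and canvas rendering, and nothing below reflects how Google's products actually work.

```python
import hashlib

def fingerprint(request_headers: dict) -> str:
    """Derive a stable identifier from configuration the client reveals
    anyway, without storing anything on the user's device."""
    # Each attribute narrows the anonymity set; in combination they can be
    # nearly unique even though no single attribute identifies the user.
    attributes = [
        request_headers.get("User-Agent", ""),
        request_headers.get("Accept-Language", ""),
        request_headers.get("Accept-Encoding", ""),
    ]
    return hashlib.sha256("|".join(attributes).encode("utf-8")).hexdigest()

# Example request headers (illustrative values only).
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Accept-Language": "en-GB,en;q=0.9",
    "Accept-Encoding": "gzip, deflate, br",
}

# The identifier is recomputed on every visit rather than stored, so
# clearing cookies or local storage does not reset it.
print(fingerprint(headers))
```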

    Because fingerprinting is much harder to detect or control than other forms of online tracking, the ICO believes it is not a fair means of tracking users online. As such, the ICO's response to Google's policy change is a negative one, deeming the change irresponsible. The ICO noted it was especially disappointed to see Google go back on its previous position that fingerprinting "subverts user choice and is wrong".

    Given Google's influence and market presence, the ICO is clearly concerned that this policy change could have a knock-on effect, potentially giving rise to similar changes across the industry. The ICO took the opportunity to reiterate businesses' obligations under data protection law, including the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), to give users fair choices. Additionally, in a related update, the ICO released for consultation revised cookie guidance, which addresses how data protection law and PECR apply to tracking technologies like fingerprinting, see here.

    7) Ashley v HMRC decision provides guidance on DSAR responses

    The High Court ruled on 24 January 2025 that HMRC had adopted an "unduly narrow" concept of personal data in its response to a data subject access request (DSAR) made by businessman Mike Ashley for all information the authority held about him in the context of a tax dispute.

    The decision provides a comprehensive review of the relevant guidance and case law, providing detail on how the concept of personal data should be applied in practice by organisations when responding to DSARs.

    8) UK Government responds to call for views on a code of practice for cyber governance

    The UK Government published on 31 January 2025 its response to the call for views on a code of practice for cyber governance. The proposed Cyber Governance Code of Practice sets out how company boards and senior leaders can build resilience to a wide range of cyber risks across their organisation. In its response, the government warns that despite the massive disruption cyber incidents can cause, boards and senior leaders often struggle to engage with cyber issues due to a lack of understanding, training, or time. The updated version of the code is due to be published in early 2025.

    EU developments 

    EU Wide

    1) The European Data Protection Board (EDPB) opinion: addressing certain data protection aspects related to the processing of personal data in the context of AI models, in response to a request made by the Irish Data Protection Commission (as permitted by Article 64(2) of the GDPR)

    On 17 December 2024, the EDPB issued an opinion on data protection in AI models, in response to a request from the Irish Data Protection Commission. 

    The EDPB highlighted that AI models trained on personal data cannot generally be considered anonymous; whether they are requires a case-by-case assessment of the measures taken to limit data identifiability and extraction.

    Regarding the use of legitimate interest as a legal basis, the EDPB stressed the importance of a three-step test: 

    • Identification of the legitimate interest
    • Necessity of the processing
    • Respect for individuals' rights

    Finally, the EDPB pointed out that supervisory authorities have discretionary powers to assess infringements and adapt their actions accordingly.

    The opinion can be found here. 

    2) The General Court of the European Union (EGC) orders the Commission to pay EUR 400 in compensation for non-material damage for breaching data transfer rules

    On 8 January 2025, the EGC ordered the Commission to pay EUR 400 in non-material damages to a European data subject ("Mr Bindl") for having infringed rules on international data transfers. In that context, the EGC also ruled that dynamic IP addresses constitute personal data, and that data subjects cannot claim damages for violations of data transfer rules where they themselves triggered the relevant outcome (namely the transfer of their personal data to a third country), in Mr Bindl's case by using a VPN.

    In the given case, Mr Bindl had visited an EU Commission website several times. On one visit, he registered for an event using "EU Login", the Commission's user authentication service, and selected the option of signing in through his Facebook account. With this action, his IP address was transmitted to Meta Platforms. On other visits to the website, Mr Bindl used a VPN to obfuscate his real location. For this reason, the EU Commission's service provider Amazon CloudFront assumed that Mr Bindl was located in the US and stored his IP address on US servers in accordance with its general practice.

    The court held that "even 'dynamic' IP addresses – which by nature change over time – correspond to a precise identity at a given point in time, which, in this case, coincides with the point in time at which [Mr Bindl's visit of the EU] website took place". Further, the EGC held that, by displaying the 'Sign in with Facebook' hyperlink on the EU Login website, the Commission had created the conditions for a transfer of Mr Bindl's personal data to a third country without complying with the data transfer rules applying to the Commission and other EU institutions (Article 46 of Regulation 2018/1725). At the time of the website visits, there was no adequacy decision in place for data transfers to the United States, and the Commission did not rely on appropriate safeguards, in particular standard contractual clauses. Thus, "the displaying of the 'Sign in with Facebook' hyperlink on the EU Login website is governed by the general terms and conditions of the Facebook platform". The court awarded non-material damages of EUR 400 for the data transfer to the US via Facebook, ruling that Mr Bindl's damage arose from the uncertainty as regards the processing of his personal data. The court further held that storing the IP address on a US CloudFront server while Mr Bindl was using a VPN to visit the Commission's website did not justify a claim for damages, because that transfer to the US was initiated by Mr Bindl's own conduct.

    The EGC's decision can assist in interpreting the GDPR, as Regulation 2018/1725 and the GDPR provide similar rules. The decision of the EGC is not yet final and binding, as either party can appeal it to the Court of Justice of the European Union (CJEU) within two months.

    3) CJEU Decision of 9 January 2025 - A customer's gender identity is not necessary for the purchase of a rail ticket

    On 9 January 2025, the CJEU ruled that the systematic collection of customers' titles ("Mr" or "Mrs") by SNCF Connect (the online ticket sales platform of the French national railway company) when purchasing tickets violates the GDPR's data minimisation principle.

    This practice, aimed at commercial personalisation, is neither necessary for the performance of the contract nor justified by a legitimate interest. The decision underscores the importance of clear information to customers and strict limits on data necessity to avoid discrimination based on gender identity.

    The press release can be found here.

    4) EDPB provides clarification on pseudonymisation (Guidelines 01/2025)

    On 16 January 2025, the EDPB put forward for consultation new guidelines on the pseudonymisation of personal data under the GDPR. The guidelines clarify the use and benefits of pseudonymisation, outlining its definition, objectives, and implementation strategies for controllers and processors.

    Through proper pseudonymisation, a data controller (or data processor) processes personal data in such a manner that it can no longer be attributed to a specific data subject without the use of additional information, which must be kept separately and protected by technical and organisational measures. The guidelines emphasise that, on the one hand, pseudonymisation can significantly reduce risks to data subjects by preventing the attribution of personal data to natural persons during processing and in the event of unauthorised access. On the other hand, pseudonymisation enables controllers to analyse data and merge different records relating to the same person while ensuring compliance with data protection obligations.

    The EDPB guidelines provide detailed instructions on how to implement pseudonymisation effectively, including the selection of appropriate pseudonymisation techniques and the management of additional information. The guidelines highlight the importance of defining the context in which pseudonymisation is intended to preclude the attribution of data to specific data subjects (the "pseudonymisation domain"), which encompasses the environment in which data is processed and the entities involved. The guidelines also address the implications of pseudonymisation for data subject rights, such as access, rectification and erasure, and the need for transparency in data processing activities.

    By adopting pseudonymisation, businesses can enhance data protection by design and by default, ensuring a level of security appropriate to the risk. The guidelines underscore that while pseudonymisation is a powerful tool, it must be complemented by other measures to fully meet GDPR requirements. The EDPB has invited stakeholders to provide feedback on its draft guidelines until 28 February 2025. Ashurst is involved in the stakeholder initiative "Plattform Industrie 4.0", which is preparing a submission.
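    As a purely illustrative sketch of the separation the guidelines describe, the example below pseudonymises a direct identifier with a keyed hash (HMAC), where the secret key plays the role of the "additional information" that must be kept separately and protected. The key handling and field names are assumptions made for the example; the EDPB guidelines do not mandate any particular technique.

```python
import hashlib
import hmac

# Illustrative only: the secret key is the "additional information" that
# must be kept separately (e.g. in a key vault) and protected by technical
# and organisational measures. Hard-coding it here is for demonstration.
SECRET_KEY = b"stored-and-protected-elsewhere"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a deterministic pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same person receives the same pseudonym across records, so records
# can still be merged and analysed without exposing who the person is.
record_1 = {"user": pseudonymise("jane.doe@example.com"), "event": "login"}
record_2 = {"user": pseudonymise("jane.doe@example.com"), "event": "purchase"}
assert record_1["user"] == record_2["user"]

# Anyone inside the "pseudonymisation domain" who holds SECRET_KEY can
# re-attribute the data, which is why pseudonymised data remains personal
# data under the GDPR, unlike anonymised data.
```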

    5) EDPB report: the Coordinated Enforcement Framework 2024 identifies challenges to the full implementation of the right of access

    On 16 January 2025, the EDPB released a report on the implementation of the right of access by data controllers, based on coordinated national actions conducted in 2024 under the Coordinated Enforcement Framework (CEF). The report highlights challenges, positive findings, and recommendations to improve compliance with this fundamental data subject right.

    The report pairs each point of attention with corresponding recommendations:

    • Lack of awareness about the scope of access to be provided: Controllers should pre-assess the scope of Article 15 GDPR to identify which information qualifies as personal data and determine where to verify this data upon receiving an access request. The EDPB recommends using the record of processing activities to locate personal data accurately; this record should be regularly updated to reflect new processing activities, IT systems, processors or organisational changes.

    • Indefinite, excessive or inconsistent retention periods relating to access requests: Controllers should fix a retention period for access request communications based on objective criteria and document their reasoning; statutory retention periods for other types of documents or records should not be applied by default. Controllers should also store access request communications separately from other data subject information that is subject to different retention periods or access and role management rights.

    • Lack of documented internal procedures: The EDPB could issue further guidance regarding best practices for documenting compliance with Article 15 requests.

    • Barriers to the facilitation of the right of access: Controllers should be prepared to process access requests regardless of whether they are submitted via a dedicated data protection channel. Each request should be assessed individually to determine if additional identification or authentication of the data subject is necessary.

    • Excessive interpretation of the possibility to ask for specification of access requests: Controllers should assess each access request on a case-by-case basis to verify whether the conditions of recital 63 sentence 7 GDPR are met, and inform the data subject about any processing activities potentially concerning them in their request for specification.

    • Provision of insufficiently detailed or tailored information to data subjects: Controllers are responsible for accurately recording to which entities precisely they disclose personal data and where these recipients are located, so that they are able to name the actual recipients.

    The full report is available here.

    6) EDPB issues a position paper on the interplay of data protection and competition law

    On 16 January 2025, the EDPB published a position paper addressing the interplay between data protection and competition law. The paper argues that enhanced cooperation between data protection and competition authorities is necessary: in this way, the two sets of authorities can safeguard data subject rights and ensure coherent regulatory enforcement in the evolving digital economy. The EDPB highlights that while data protection and competition law are distinct legal frameworks with different objectives, they share common goals such as protecting individuals and promoting consumer welfare. The paper references the CJEU's judgment in Meta v Bundeskartellamt (Case C-252/21), emphasising that "cooperation between data protection and competition authorities is, in some cases, mandatory and not optional".

    The EDPB recommends that dedicated teams be established within authorities to coordinate tasks and act as a single point of contact for other authorities. The EDPB suggests that a competitive market can foster privacy-friendly options, as companies must reflect consumers' interests to be successful. Further, it calls on policymakers to increase cooperation among regulatory bodies in the digital sector, ensuring efficient and effective enforcement of both data protection and competition laws, ultimately benefiting both individuals and businesses.

    Updates from France: 

    1. Data harvesting: EUR 240,000 fine imposed on KASPR by the French data protection authority (CNIL)

    On 5 December 2024, the CNIL imposed a fine of EUR 240,000 on KASPR. The company, which markets a paid extension for the Chrome browser allowing its customers to obtain the professional contact details of people whose profiles they visit on LinkedIn, was found to have collected contact details of LinkedIn users who had opted to limit their visibility. The identified breaches include:

    • Failure to comply with the obligation to have a legal basis (Article 6 GDPR); 
    • Failure to comply with the obligation to define and respect a data retention period proportionate to the purpose of the processing (Article 5(1)(e) GDPR);
    • Failure to comply with the obligation to provide transparency and information to individuals (Articles 12 and 14 GDPR); and
    • Failure to comply with the obligation to respond to requests to exercise the right of access (Article 15 GDPR).

    For more details, you can access the decision here (French only).

    2. CNIL's strategic plan for 2025-2028

    The CNIL has unveiled its strategic plan for 2025-2028, focusing on a balance between prevention, support and enforcement.

    The plan emphasizes four main areas: 

    • Promoting ethical AI that respects rights: knowledge sharing, expertise, clarification of the legal framework, raising public awareness, control of AI systems.
    • Protecting minors and their data in the digital world: increase the CNIL's presence among children, parents and prescribers, enable minors to exercise their rights, promote responsible use of digital technology and monitor operators.
    • Making everyone a cybersecurity player to strengthen confidence in the digital world: cooperation and coordination, support in dealing with breaches, development of technical solutions, increasing control operations.
    • Implementing targeted actions on everyday digital uses: implementation of the 'mobile applications' action plan to protect people's privacy and development of privacy-friendly digital identity systems.

    For more details, you can access the strategic plan here (French only). 

     

    Updates from Spain: 

    1. The Osasuna Football Club has been fined EUR 200,000 for the use of biometric data (facial recognition) to control access to its stadium

    In November 2022, the Spanish Data Protection Agency (the "AEPD") received a complaint against Athletic Club Osasuna (the "ACO") regarding the implementation of a biometric facial recognition system for access control at its stadium. The system, introduced in April 2022, was designed to facilitate the entry of spectators using facial recognition technology. The biometric data collected included facial vectors, which were used to uniquely identify individuals. The system was optional and served as an alternative to traditional access methods, such as QR code scanning.

    The AEPD initiated a sanctioning procedure against ACO, which resulted in a fine of EUR 200,000 for breaching the principle of data minimisation (Article 5.1.c GDPR), the suspension of the biometric facial recognition system and an order to delete all related data.

    The AEPD argued that: (i) the biometric facial recognition system was not necessary to achieve the intended purpose of controlling access to the stadium, as less intrusive alternatives were available (such as the use of QR codes); and thus (ii) the use of biometric data (which is considered highly sensitive) was excessive and disproportionate.

    The fine of EUR 200,000 was determined considering the following factors: (i) the nature, severity and duration of the infringement (approximately two years); (ii) the number of individuals affected (the biometric system was installed at eight of the stadium's gates); (iii) the potential harm to their rights and freedoms; and (iv) the lack of diligence by ACO in assessing the necessity and proportionality of the biometric system, which carries a higher risk due to its sensitive nature.

    2. Fine of EUR 30,000 for breaching the principle of confidentiality through unauthorised access to a medical history

    On 30 November 2023, a complaint was filed with the Catalan Data Protection Authority (APDCAT) against Badalona Serveis Assistencials, SA ("BSA"). The complainant alleged that a medical professional at BSA had accessed her medical history without authorisation on nine occasions between 16 September 2021 and 15 September 2023.

    These accesses were not related to any medical treatment or diagnostic activity and were performed by a professional who had no direct involvement in the data subject's medical treatment. 

    The APDCAT imposed a fine of EUR 30,000 on BSA for violating data protection regulations, specifically for breaching the principle of confidentiality set out in Article 5.1.f GDPR, which mandates that personal data be processed in a manner that ensures appropriate security, including protection against unauthorised or unlawful processing. BSA paid EUR 24,000 in advance, benefiting from a 20% reduction in the fine for early payment and acknowledgment of responsibility.

    The APDCAT concluded that BSA was responsible because: (i) under the GDPR and the Spanish Organic Law on Data Protection and Guarantee of Digital Rights (LOPDGDD), responsibility for data protection violations lies with the data controller (in this case, BSA) and cannot be shifted to individual employees (even where security measures are imposed); (ii) the security measures implemented by BSA (namely, the GesDoHC program for managing medical records) were deemed insufficient; and (iii) the unauthorised accesses were not detected by BSA's internal controls and were only identified following the complaint to the APDCAT.

    The fine was determined based on the severity of the breach, the nature of the data involved (sensitive health data), and the need for the fine to be effective, proportionate and dissuasive. The APDCAT also considered BSA's previous data protection violations and the company's annual revenue in calculating the fine.

    Spotlight on Data Related Board Priorities in 2025

    Never before have data-related risks been so high on board agendas for all companies, whether traditionally data-heavy or not. Our Ashurst legal and risk advisory teams consider why cyber readiness and AI awareness and governance should be on board agendas over the coming 12 months.

    Cyber readiness

    The cyber security threat will remain pervasive in 2025 as it continues to evolve. The use of AI and the commercialisation of malware and ransomware as a service mean that threat actors face a lower barrier to entry, and their ability to compromise and evade security controls needs to be met with a high level of proficiency, readiness and cyber resilience. In the UK, the National Cyber Security Centre's new CEO warned at the end of November 2024 that there is a widening gap between the exposure and threat we face and the defences put in place to protect us. Not enough organisations are implementing the available advice, frameworks and guidance, meaning that they are ill-prepared to face a cyber attack.

    Boards must continue to focus on this increasing threat in 2025, including a rise in insider threat for many of our critical infrastructure clients and a sharp focus on critical third-party vulnerabilities and outages, as spectacularly demonstrated by the CrowdStrike outage.

    And if that wasn't enough, we anticipate growing disruption, financial loss and reputational damage caused by highly sophisticated mis- and disinformation campaigns.

    In response, we have seen renewed efforts by governments around the world to increase the regulatory obligations related to data, AI, privacy and cyber.

    While in some cases regulatory changes are motivated by improving industry and government partnerships to build national resilience, the majority of regulatory change, and regulator enforcement focus, is introducing a regime of ever-increasing accountability (including personal accountability) for Boards to ensure their organisations are secure.

    The regulatory bar is set very high. Boards are increasingly expected to ensure organisations are taking steps to secure data and systems, including those of their critical suppliers. We anticipate that 2025 will be the year of increased regulatory scrutiny.

    Ashurst is working with Boards to redefine cyber readiness and cyber risk governance. This is about building a regulatorily defensible approach to cyber breaches, before they occur.

    In 2025, cyber will clearly be a lot more than "just an IT" issue.

    AI in 2025: Managing regulatory complexity

    Boards must navigate the evolving regulatory landscape of AI with strategic foresight. Here are the critical areas to focus on.

    1. Navigating the AI hype cycle

    In 2024, businesses rushed to procure and deploy generative AI tools to stay competitive. However, many encountered challenges in discovering valuable use cases and integrating these tools with their systems and data, hampering the ability to drive maximum value from these investments. In 2025, focus should shift to understanding the true capabilities of AI and effectively integrating it into core business processes to unlock its full potential while managing the systemic risks it can introduce.

    2. Maturing AI governance practices

    In response to the rapid adoption of generative AI, many organisations developed AI policies to address its usage. As AI embeds into critical business processes, relying on AI policies alone to govern and risk-manage AI will no longer be enough. Directors should ensure that holistic AI governance frameworks are developed, and that there are adequate governance structures and oversight mechanisms in place to identify, escalate, and monitor AI risks effectively.

    3. The arrival of UK legislation and regulatory guidance

    The UK government is finally set to legislate on AI, targeting companies responsible for the most powerful large language models. What will be of interest to Boards that do not fall within that narrow scope is the sector-specific guidance and regulation expected over the coming months, particularly in the financial services sector. The government's pro-innovation approach expects sector regulators to shape AI regulation within their domains. Boards must stay informed about these regulatory developments and take steps to ensure compliance.

    4. International legislation continues to develop

    In contrast to the proposed light-touch regulation and legislation in the UK, other countries have adopted more prescriptive legislation, notably the EU AI Act. This legislation has extra-territorial impact, catching any AI system interacting with the EU and its citizens, making it crucial for multinational companies to identify usage and risk assess AI across their business. Australia's upcoming Privacy Act reform will focus on automated decision-making (ADM) and AI, requiring organisations to map their systems that consume personal data. Even for businesses not caught under the EU's AI Act, the risk classifications used within the legislation are useful tools for assessing risk and meeting other emerging regulatory requirements.

    The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
    Readers should take legal advice before applying it to specific issues or transactions.