Data Bytes 53: Your UK and European Data Privacy update for November 2024
12 December 2024
Welcome to our last edition of Data Bytes of 2024, where the Ashurst UK and European Data Privacy and Cyber Security Team look to summarise the key privacy legal and policy developments of the previous month.
We're not going to attempt to summarise the full raft of data law related updates of 2024 here. Please register for our in-person round-up at our London Ashurst HQ to hear from our team of data, privacy, cyber and litigation experts on the key changes and developments of 2024 and some navel gazing on what to expect in 2025.
In this final update of the year you will find another AI-weighted month, with DSIT launching its AI management essentials tool, Ofcom publishing an open letter to online service providers about how the UK's Online Safety Act will apply to generative AI and chatbots and the ICO issuing an audit outcomes report titled "AI tools in recruitment" and publishing key considerations for organisations looking to procure AI tools for recruitment purposes. It's also been a big month for the AI data team at Ashurst with the launch of the Ashurst AI Assess. In collaboration with our Risk Advisory experts and our Ashurst Advance digital team, we have developed a defensible, repeatable and traceable approach to assessing AI systems and use cases to determine their scope under the AI Act, and outline the resulting compliance requirements. The tool is a standardised, consistent and auditable approach to AI Act compliance, providing documented evidence of the assessment and its results. Contact Matthew Worsfold for a demo and to learn more.
Meanwhile, at the end of November, our Australian and UK data and cyber teams combined forces with a joint event held at the Australian High Commission in London exploring what the latest cyber and privacy reforms mean for organisations operating across the UK and Australia. The session was opened by His Excellency Stephen Smith, Australian High Commissioner, and the panel included Rachael Falk, Chief Executive of the Australian Cyber Security Cooperative Research Centre, Ciaran Martin, Professor of Cyber Security at Oxford University, Sam Smyth Murray, Principal Policy Advisor - Data Protection, Information Commissioner's Office, and our own Amanda Ludlow, Rhiannon Webster and John Macpherson.
Scroll down to our spotlight section for our key takeaways from the event and our take on navigating data protection and cyber security reforms in the UK and Australia.
The ICO announced on 4 November 2024 that it is seeking permission to appeal the judgment of the Upper Tribunal in the DSG Retail Limited (DSG) case to the Court of Appeal, on the basis that it believes the Upper Tribunal incorrectly interpreted the law regarding data security obligations in connection with pseudonymised personal data.
Specifically, the ICO disagrees with the Upper Tribunal's finding that an organisation is not required to take appropriate measures against unauthorised or unlawful processing of data by a third party, where the data is personal data in the hands of the controller but not in the hands of the third party. The case relates to DSG's fine of £500,000 by the ICO in 2020 after a cyber attack involving data exfiltration impacting at least 14 million people. The data in question was the card information of DSG customers, such as the PAN (16 digit credit card number) and expiry date. The Upper Tribunal agreed with DSG's arguments that where the PAN and expiry date for a particular card were disclosed without the name of the card holder, this did not constitute personal data in the hands of third parties who do not hold other identifiable information about the card holder. The ICO disagrees on the basis that the PAN is personal data in and of itself, given it uniquely identifies an individual card holder. The ICO now awaits the Upper Tribunal's decision on its appeal request.
On 6 November 2024, the Department for Science, Innovation & Technology (DSIT) launched a public consultation which introduced its new AI Management Essentials tool (AIME). AIME is a self-assessment tool developed by the DSIT to help businesses implement and demonstrate AI management practices and is based on key principles from existing AI regulations, standards and frameworks (including the EU AI Act).
The tool is mainly intended for SMEs and start-ups; however, it may also be used by larger organisations to assess the AI management systems of individual business divisions, departments or subsidiaries. It focuses on evaluating organisational processes rather than the AI products or services themselves, and does so using a multiple-choice self-assessment questionnaire. DSIT expects the final version of AIME to include a self-assessment questionnaire, a rating system, and recommendations for improvement.
While not mandatory, use of AIME may help organisations identify strengths and weaknesses in their AI management systems and will provide practical solutions. DSIT hopes that the public consultation on AIME will help ensure the tool is fit for purpose – the consultation ends on 29 January 2025.
In collaboration with the Department for Science, Innovation and Technology (DSIT), the ICO released a Privacy Enhancing Technologies (PETs) Cost-Benefit Awareness tool on 7 November 2024 to aid organisations in understanding and assessing the costs and benefits associated with PETs.
PETs are technologies which increase privacy in areas involving data collection, processing or storing. Examples include homomorphic encryption (allowing for data to be processed without being decrypted) or trusted execution environments (secure areas of a main processor).
The tool developed by DSIT and the ICO focuses mainly on emerging PETs and demonstrates a continued effort to drive greater uptake of PETs across the UK. Organisations can utilise the tool as a framework to analyse the costs and benefits associated with a range of PETs. Guidance on compliance costs and benefits is also provided to illustrate the value add of PETs in reducing risk to individuals and driving down long term compliance costs.
On 8 November 2024, Ofcom published an open letter to online service providers about how the UK's Online Safety Act (OSA) will apply to generative AI (GenAI) and chatbots. The letter addresses recent incidents of online harm involving the use of GenAI – for example, where GenAI chatbots have acted as virtual clones of real people and deceased children.
The letter explains how GenAI tools and chatbots will be classified as user-to-user services or as search services, and confirms, amongst other things, how the OSA's duties will apply to such services.
The letter serves as a reminder to organisations to comply with their duties under the OSA, including by completing Illegal Harms Risk Assessments by mid-March 2025. Ofcom warns companies that failure to comply could result in financial penalties. To support risk assessment preparedness, Ofcom will publish its final Illegal Harms Risk Assessment Guidance in December this year, which is currently available in draft form.
The ICO published on 15 November 2024 a joint statement, along with the Financial Conduct Authority (FCA) and The Pensions Regulator (TPR), to provide additional clarity for retail investment firms and pension providers on how direct marketing rules interact with requirements to support customer decision-making.
The statement focused on how firms and pension providers can balance separate communication obligations (as mandated under the FCA's consumer duty and the TPR Code and Guidance) in situations where customers have withheld marketing permissions. The key takeaway from the ICO is that communication obligations can be discharged in a compliant manner so long as the communication does not constitute direct marketing. To achieve this, firms and pension providers should ensure that only facts are communicated and that the language used is neutral, avoiding active promotion or encouragement.
Whilst the guidance goes some way to providing regulatory clarity, legal and compliance teams will need to carefully review proposed communications given the ICO's continued focus on contextual factors such as tone and active 'encouragement'.
On 7 November 2024, the ICO published a report on the regulatory and privacy issues raised by the wider development of genomics. This report builds on the ICO's 2023 Tech Horizons report, which flagged initial areas of concern around polygenic risk scores (assessments derived from genomics that could inform how health care and other services are delivered). The ICO report states that the UK Government is focussing on a drive to deliver a "genomic-focused" healthcare system, and that there is similar interest in the private sector, such as from organisations operating in the wellbeing, direct-to-consumer, and sports sectors.
The ICO notes that organisations should consider personally identifiable genomic data to be personal data and should carefully consider if it is special category data, given the nature of the intended processing. Although this is helpful in principle, it places a large burden on organisations to carefully consider processing genomic data in the context of the organisation's broader aims and ambitions. The ICO's report makes various assumptions in defining genomic data, and further clarity will therefore be required for businesses that process it.
On 6 November 2024, the ICO issued an audit outcomes report titled "AI tools in recruitment" and published key considerations for organisations looking to procure AI tools for recruitment purposes. The report is the culmination of the ICO's engagement with developers and providers of AI powered processes used in recruitment (such as screening and selection tools). The ICO found a number of areas for improvement in the management of privacy risks and compliance with data protection laws. In particular, organisations should keep in mind the ICO's recommendations concerning the completion of data protection impact assessments, having clear contracts in place with AI providers which designate controller and processor roles, and ensuring appropriate transparency and bias mitigation measures are in place.
On 20 November 2024, the Cyber Resilience Act was published in the Official Journal of the European Union. It will be applicable in full from December 2027. This regulation applies to products with digital elements made available on the market and lays down rules for ensuring the cybersecurity of such products. The cybersecurity requirements set out in this regulation include data protection measures, such as ensuring the integrity and confidentiality of the data stored, transmitted or processed.
On 18 November 2024, the new directive on liability for defective products was published in the Official Journal of the European Union. It will become applicable in December 2026. This directive repeals and replaces the original 1985 directive on liability for defective products, to take into account the development of artificial intelligence. The new directive will apply to software, including AI systems.
In a landmark decision, the Irish Data Protection Commission (DPC) has imposed a substantial fine of EUR 310 million on LinkedIn for various breaches of the GDPR, including: unlawful processing of personal data for targeted advertising without explicit consent or another valid legal basis, failure to provide clear and transparent information to users about how their data was being processed (Articles 12-14 GDPR), and inadequate technical and organisational measures (Article 32 GDPR).
In preparation for the Data Act (which will become applicable on 12 September 2025), the EU Commission is offering a series of six webinars with businesses, policymakers, legal and other experts to discuss progress on the draft Standard Contractual Clauses (SCCs) for cloud computing contracts and the draft Model Contractual Terms (MCTs) for data sharing. The webinars are open to Member State authorities, companies, legal practitioners, academics, and other organisations. You can register via the links under Series of webinars: The Data Act in contracts | Shaping Europe's digital future to take part in the discussions and help shape the contracts.
1. A fine of EUR 120,000 is imposed on BBVA for the unauthorised deletion of personal data
2. Breach of the right of access by Telefónica by providing incomplete information in response to a request
On 20 November 2024, the German Federal Court of Justice (BGH) awarded EUR 100 in non-material damages for "loss of control" (even if just for a short period) where a user's personal data (a phone number associated with a user account on Facebook) had been freely accessible on the Internet (BGH, VI ZR 10/24). Unknown third parties had used Facebook's contact import function to associate phone numbers with user accounts through the input of randomised number sequences. As a result, they extracted personal data linking Facebook profiles to the respective phone numbers. This personal data was subsequently published.
The BGH also provides important guidance on how to process personal data within AI systems.
Despite the 17 October 2024 implementation deadline for the NIS2 and CER Directives, the German legislator did not pass the respective implementation acts in time. Although the German government collapsed on 6 November 2024 and Federal elections are scheduled for 23 February 2025, experts agree that the implementation of NIS2 should proceed quickly in any event. Notably, once NIS2 is implemented into national law, providers of essential and important services will have no further transitional period to put all required processes and safeguards in place.
For further guidance, on 17 October 2024 the EU Commission adopted an implementing regulation that lays down the technical and methodological requirements of the measures referred to in NIS2 with regard to DNS service providers, TLD name registries, cloud computing service providers, data centre service providers, content delivery network providers, managed service providers, managed security service providers, providers of online marketplaces, online search engines and social networking services platforms, and trust service providers (C(2024) 7151). The EU Commission provides details on cybersecurity risk management measures for companies providing digital infrastructures and services, as well as on assessing security incidents and identifying notification obligations. For each category of service provider, the implementing regulation specifies when an incident is considered significant, to whom it needs to be reported and in what timeframe.
The regulatory landscape for data protection and cyber security is evolving rapidly, with Australia introducing significant legislative changes aimed at enhancing privacy and cyber security and the UK currently considering reforms to its laws on the same topics.
The UK government's proposed reforms will be divided across two pieces of legislation. Firstly, amendments to existing data protection laws are included in the Data (Use and Access) Bill (DUA Bill), which was introduced to Parliament on 23 October 2024 and is currently being debated in the House of Lords. The DUA Bill is an attempt to optimise the use of data for economic growth and public service improvements (see our Article from last month for full details on this). On the cyber security front, the Government announced in July this year the forthcoming Cybersecurity and Resilience Bill, which is intended to expand the regulatory scope of existing cross-sectoral cyber security obligations so that they apply to more types of businesses, enhance incident reporting obligations, and strengthen enforcement of cybersecurity standards.
From an Australian perspective, privacy and cyber security have been a key regulatory priority following a series of major data breaches in the past few years. With a federal election expected in 2025, the Australian Government accelerated its reform agenda, passing important reforms in the closing Parliamentary sitting weeks of November 2024:
We have summarised below three aspects of the reforms in the UK and Australia which were discussed at our recent event at the Australian High Commission in London and which we believe are likely to have a business impact for organisations operating in both jurisdictions. A more expansive deep dive on these topics can be found in this article.
Automated Decision Making. The first reform relates to automated decision making, where the UK is relaxing certain requirements and Australia is introducing transparency requirements for the first time. These reforms represent an opportunity for organisations operating in both the UK and Australia to identify and assess the types of automated decisions they are undertaking. From an Australian perspective, the transparency measures get the ball rolling – further obligations will apply to these automated decisions in a coming "tranche 2" of reforms, for example, greater privacy impact assessment requirements.
Cookies, tracking technologies and direct marketing. The second reform relates to cookies, tracking technologies and direct marketing, where the UK enforcement landscape is fundamentally changing through the introduction of 'GDPR level' fines for breaches of requirements. Whilst there is currently no standalone legislation in Australia specifically regulating cookies and similar tracking technologies, personal data collected via the use of cookies is subject to existing Australian privacy law (Privacy Act 1988 (Cth)), and in November the Office of the Australian Information Commissioner published guidance which sets out general considerations for private sector organisations that use third-party tracking pixels on their websites. We expect to see additional enforcement action in these key areas from the Australian Privacy Commissioner, particularly given that reforms to the Privacy Act have granted the Privacy Commissioner greater enforcement powers, including the ability to issue infringement notices for 'administrative failures' such as a failure to have a compliant privacy policy.
Cyber security and response management. The third reform is an increased focus on cyber security and response management. Following significant data breaches and attacks, Australia is introducing ground-breaking legislation, including mandatory ransomware reporting where payments are made by organisations (subject to exemptions for smaller businesses), as well as a range of reforms relating to the protection of critical infrastructure assets. The UK is similarly looking to significantly uplift its cross-sectoral cyber security regulation, such as by broadening the scope of application to more digital services and supply chains and introducing stricter incident reporting requirements. The reforms in both jurisdictions demonstrate that governments and regulators are placing greater responsibility on boards and senior leaders to demonstrate robust cyber security measures and meet expectations concerning cyber response management.
Authors: Rhiannon Webster, Partner; Nicolas Quoy, Partner; Alexander Duisberg, Partner; Andreas Mauroschat, Partner; Geoff McGrath, Partner; Cristina Grande, Counsel; Tom Brookes, Senior Associate; Shehana Cameron-Perera, Senior Associate; Antoine Boullet, Senior Associate; Emily Jones, Associate; Lisa Kopp, Associate; David Plischka, Associate; Maria Baixauli, Junior Associate; Anne Wecxsteen, Junior Associate; Hana Byrne, Junior Associate; Andrew Clarke, Solicitor; Prithivi Venkatesh, Solicitor; Rachael Peter, Trainee Solicitor; Nick Hwong, Trainee Solicitor; Sebastian Mayr, Student Assistant.
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.