Data Bytes 52: Your UK and European Data Privacy update for October 2024
07 November 2024
Welcome back to Data Bytes. It's been a bumper month for privacy and cyber security law updates on both sides of the English Channel. On the UK side, the UK Government published its Data (Use and Access) Bill, promising that it will "harness the enormous power of data to boost the UK economy by £10 billion" and "unlock the secure and effective use of data for the public interest". Although tweaked in structure, the DUA mirrors many of the concepts and provisions that were in the previous Government's abandoned Data Protection and Digital Information Bill ("DPDI Bill"), or makes subtle changes to them. See our spotlight section below for our summary of the key legislative changes it proposes.
Over in Europe, the deadline for member states to transpose the European Union's updated Network and Information Systems Directive (Directive (EU) 2022/2555) (NIS 2 or the Directive) into national law passed on 18 October 2024, with many organisations operating in or servicing the EU market now facing significant new cybersecurity obligations.
Finally, the EDPB and CJEU had a busy few weeks, publishing some significant decisions and opinions. Breaking with tradition, we lead this month with those European cases and guidance. Although they are not directly applicable to UK operations, it would be uncharacteristic of the ICO and the UK courts to stray far from their core messages.
On 9 October, following a request from the Danish Data Protection Authority, the European Data Protection Board (EDPB) adopted an Opinion on the responsibility of controllers when relying on processors and sub-processors.
The Danish DPA had posed several questions to the EDPB, focusing on scenarios where a controller engages a processor that in turn engages other (sub-)processors. The questions addressed various aspects of such processing chains and the related obligations of controllers.
The key takeaways for organisations from this opinion are:
Although the opinion is not legislation, it takes effect immediately. We would recommend that controller organisations consider an audit of their processing chains, looking at the requirements of "sufficient guarantees" and "transfers of personal data", and review their processes to ensure these are fully documented, tightening contractual wording where required. Processor organisations should be prepared for more information requests and tighter contractual wording dealing with these requirements.
October was a seminal month for the last, but certainly not least, of the lawful bases. The legitimate interests basis at Article 6(1)(f) of the GDPR is the most flexible of the lawful bases on which controllers can rely for processing personal data. It requires that the processing of personal data is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.
Its use for purely commercial purposes had been thrown into question by the Dutch DPA in the case of the Dutch tennis association Koninklijke Nederlandse Lawn Tennisbond (KNLTB) (C-621/22). The KNLTB had disclosed its members' personal contact data, without their consent, to a tennis equipment sponsor and the Dutch Lottery Organisation so that they could promote their products and games to members. The KNLTB had argued legitimate interests as the legal basis, objecting to a fine imposed by the Dutch data protection authority. In its original fine decision, the Dutch DPA considered that the KNLTB could not rely on the legitimate interests ground to share members' personal data with sponsors, as the KNLTB's purpose was purely commercial, insisting that purely commercial interests cannot qualify as legitimate interests.
On 4 October the CJEU handed down its judgment in this case, pointing to Recital 47 GDPR, which confirms that a legitimate interest does not need to be established in law and that, in general, direct marketing purposes may qualify as legitimate interests.
Following swiftly on from this judgment, on 8 October the EDPB issued draft guidance on the processing of personal data based on legitimate interests. The EDPB stresses that this lawful ground should be interpreted restrictively and not be seen as an "open door" to legitimise any processing that does not fall under one of the other lawful bases. It requires consideration of three "cumulative conditions". First, "only the interests that are lawful, clearly and precisely articulated, real and present may be considered legitimate." Second, organisations must consider the necessity to process personal data and whether there are "less intrusive alternatives", while also examining the principle of data minimisation. Finally, controllers must ensure that the legitimate interest is not overridden by the individual's interests and fundamental rights.
The EDPB also provides examples of how the assessment "should be carried out in practice, including in a number of specific contexts such as fraud prevention, direct marketing and information security."
October also saw the CJEU hand down a decision on whether social media platforms such as Facebook can use personal data obtained outside the platform for personalised ads. The ruling, on a preliminary reference from the Austrian Supreme Court (Oberster Gerichtshof), concerned the legality of Meta Platforms Ireland Ltd's processing of certain personal data of Max Schrems (CJEU C-446/21).
Max Schrems, a Facebook user, claimed that Meta Platforms unlawfully processed his personal data to provide him with targeted advertising, including special category data about his sexual orientation, which Schrems had not disclosed on his Facebook profile, although he had otherwise publicly referred to his homosexuality. The CJEU ruled in his favour. "An online social network such as Facebook cannot use all of the personal data obtained for the purposes of targeted advertising, without restriction as to time and without distinction as to type of data," it said.
The court explained that processing personal data for personalised ads may be permitted if the controller can demonstrate that the use is proportionate and can meet regulatory requirements, such as that personal data is collected and processed lawfully, fairly and in a transparent manner in relation to the data subject; the controller limits the period of the collection of the personal data in question to what is strictly necessary in the light of the objective of the processing; and the personal data are kept only for as long as is necessary for the specific purposes of the collection and processing.
"In any event, the storage of the personal data of the users of a social network platform for an unlimited period for the purpose of targeted advertising must be considered to be a disproportionate interference in the rights guaranteed to those users by the GDPR", the court added.
1. German draft Employee Data Protection Act
2. Supervisory Authority issues discussion paper on AI
1. French Data Protection Authority fined two online clairvoyance companies
1. The Spanish Supreme Court confirms Equifax liability
2. Sanction imposed on HM Hospitales for failing to implement adequate security measures
On 7 October, the ICO published a new audit framework designed to help organisations assess and benchmark their compliance with key requirements under data protection law against ICO expectations.
This new framework extends the ICO's existing Accountability Framework and comprises nine accountability toolkits covering, among other areas, records management, information and cyber security, artificial intelligence and age-appropriate design. The toolkits include a downloadable data protection audit tracker, which may be particularly useful for companies looking to prepare for corporate or finance transactions where their data protection practices may be subject to external legal due diligence.
On 28 October, the ICO released a joint statement with 16 other data protection authorities highlighting that organisations need to comply with privacy and data protection laws when using personal information, including from their own platforms, when developing AI large language models. In particular, the joint statement urged organisations to:
The joint statement follows the publication earlier this year of a consultation by the ICO on the lawful basis for web scraping to train generative AI models where the ICO indicated that legitimate interests may be a valid lawful basis. Although this is helpful in principle, organisations conducting these web scraping activities will still need to ensure they are able to complete an adequate legitimate interest assessment including evidence of how the risks to impacted individuals are meaningfully mitigated.
The National Cyber Security Centre (the "NCSC") issued new guidance on 26 September on the implementation of strong methods of multi-factor authentication ("MFA") for accessing corporate online services. The increasing amount of sensitive corporate data being stored in cloud-based online services was noted by the NCSC as a key driver for releasing the guidance. The NCSC describes five MFA methods in the guidance, noting that FIDO2 credentials are the most secure method and recommending that message-based methods such as email and SMS be used only as a last resort.
On Wednesday 23 October, the UK Government published its Data (Use and Access) Bill ("DUA"), promising that it will "harness the enormous power of data to boost the UK economy by £10 billion" and "unlock the secure and effective use of data for the public interest". Although tweaked in structure, the DUA mirrors many of the concepts and provisions that were in the previous Government's abandoned Data Protection and Digital Information Bill ("DPDI Bill") or makes subtle changes. We have described in this article the key points from DUA and summarised below our pick of the top three areas for organisations to watch as the DUA advances through Parliament:
1. Increased scope for leveraging personal data in AI applications? Although the DUA did not provide any clarity on the application of the UK GDPR to the use of AI, it did limit the general prohibition on automated decision making currently found in Article 22(1) of the UK GDPR to those decisions which are "significant" and based entirely or partly on the processing of special category data. Notably, the DUA also gives the Secretary of State powers to designate new types of special category data, which would in effect add to those listed in Article 9(1) UK GDPR.
The DUA's widening of the permitted use of personal data in automated decision making technologies could, if passed in its current form, open the door for organisations to more easily leverage AI systems and therefore could result in AI products coming to market quicker in the UK in comparison to the EU.
2. EU data access parallels? The DUA draws some parallels with the EU's Data Governance Act and Data Act which came into force in June 2022 and January 2024 respectively (more on those here). For example, the DUA gives powers to the Secretary of State or the Treasury to make provisions on access to customer and business data.
In a bid to avoid divergence from the EU, the provisions could look similar to EU legislation (for example, like the Data Governance Act and Data Act, the DUA also uses the term "data holders"); however, the DUA's data access provisions appear broader in scope than their EU counterparts.
It remains unclear if and when the Secretary of State or the Treasury will implement any new legislation on customer and business data access. It is therefore too early to assess the full impact, but we will be closely monitoring whether any legislation which does eventually emerge aligns with or diverges from the EU regime.
3. Changes to PECR: The DUA also proposes changes to the Privacy and Electronic Communications Regulations (PECR), including strengthening the ICO's enforcement powers and enabling GDPR level fines to be imposed on PECR violations. If the recent increase in PECR enforcement action wasn't enough to motivate organisations to reconsider any potentially questionable direct marketing practices, the DUA should make this a priority in light of the heightened enforcement and penalty proposals.
A new Schedule A1 to PECR would allow cookies to be used without consent in certain circumstances, such as where the technical storage of, or access to, information is strictly necessary to ensure the security of the terminal equipment, to prevent or detect fraud or technical faults, to collect information for statistical purposes in order to make improvements to the service, to adapt the way the website appears or functions to the user's preferences, or to provide emergency assistance.
Authors: Rhiannon Webster, Partner; Nicolas Quoy, Partner; Alexander Duisberg, Partner; Andreas Mauroschat, Partner; Cristina Grande, Counsel; Tom Brookes, Senior Associate; Shehana Cameron-Perera, Senior Associate; Antoine Boullet, Senior Associate; Emily Jones, Associate; Lisa Kopp, Associate; David Plischka, Associate; Maria Baixauli, Junior Associate; Hana Byrne, Junior Associate; Andrew Clarke, Junior Associate; Anne Wecxsteen, Trainee Solicitor; Rachael Peter, Trainee Solicitor
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.