Data Bytes 55: Your UK and European Data Privacy update for February 2025
07 March 2025

Welcome to this month’s issue of Data Bytes. Just over a month ago, the Data Bytes team hosted our annual data round-up event at our London office. Keep scrolling down to our spotlight section where we have summarised the key updates we spoke about at that event and our predictions for the year ahead.
Back to February: it was a big month for the AI Act, with the first operative provisions on AI literacy and prohibited systems starting to apply. We also saw the curtains finally close on the draft ePrivacy Regulation, and we now wait to see whether any e-privacy phoenix rises from those ashes, although it doesn't feel imminent. Here in the UK, the Data (Use and Access) Bill has reached the House of Commons, and it feels more and more likely that, after a number of false starts, we will soon have a new piece of data protection legislation.
The Data (Use and Access) Bill (the Bill), which was introduced to Parliament in October 2024, completed its House of Lords stages on 5 February and is now moving through the House of Commons, where it is currently at Committee Stage.
The Bill aims to "harness the enormous power of data to boost the UK economy by £10 billion" and "unlock the secure and effective use of data for the public interest".
Although tweaked in structure, the Bill, for all intents and purposes, mirrors many of the concepts and provisions of the previously abandoned Data Protection and Digital Information Bill. Notable aspects of the Bill, some of which were introduced or amended during its passage through the House of Lords, are:
We will keep a watching brief on how many of these changes are maintained by the House of Commons.
On 5 February, the ICO launched its 'direct marketing advice generator', a free online tool which aims to help organisations ensure their direct marketing activities comply with the Privacy and Electronic Communications Regulations (PECR) and the UK GDPR.
Following the completion of a short questionnaire, this tool provides tailored compliance advice on marketing activities via email, SMS, direct mail, social media and telemarketing.
Given that marketing activities and PECR non-compliance are the most common grounds for enforcement action and monetary penalties imposed by the ICO, this will be a useful tool – we recommend that:
On 31 January, the UK Government published the Code of Practice for the Cyber Security of AI (the Code), which is accompanied by a supplementary implementation guide (the Implementation Guide).
The Code is a set of voluntary principles focused on the cyber security of AI systems, including generative AI, and is primarily targeted at system operators, developers and data custodians in the AI supply chain. It is centred on 13 principles which map onto five phases of the AI lifecycle: Design, Development, Deployment, Maintenance and End of Life.
The UK Government plans to submit the Code and the Implementation Guide to the European Telecommunications Standards Institute, to help create a future global standard of baseline cyber security requirements for AI. For more details about the Code, see our briefing here.
On 5 February 2025, the ICO published its finalised guidance on the management of employment records to help employers navigate their obligations under UK data protection law. This follows a draft version of the guidance, which had been available on the ICO website since a consultation in 2023.
The guidance covers various aspects of managing workers' personal information and serves as a useful reminder of common issues that can arise, including reliance on consent in an employment context, the application of different retention periods to employee data, and the sharing of employment records with third parties.
One area of focus in the guidance is the sharing of information in the context of corporate transactions. During a merger or acquisition, employers are reminded to carefully consider the lawful bases for information sharing as part of due diligence and to maintain transparency with workers where possible. In addition, the ICO notes that where information shared with prospective employers exceeds TUPE requirements, employers must have another lawful basis for the processing and appropriate safeguards in place to ensure that unsuccessful bidders use the information only in connection with the proposed business transfer.
The Public Authorities (Fraud, Error and Recovery) Bill (the Bill) was introduced to Parliament on 22 January 2025. Under the Bill, the Department for Work and Pensions (the DWP) would have powers to recover losses resulting from fraud and error.
The Bill introduces an eligibility verification measure (EVM) which empowers the DWP to give qualifying financial organisations an eligibility verification notice to identify certain accounts which receive benefits, assess the likelihood of incorrect payments and share those account details with the DWP.
The ICO has published its response to the EVM, and it is worth noting that even the government:
As of 2 February, the first rules under the AI Act have started to apply, including the ban on prohibited practices and the AI literacy obligations.
Prohibited practices include: harmful AI-based manipulation and deception; harmful AI-based exploitation of vulnerabilities; social scoring; individual criminal offence risk assessment or prediction; untargeted scraping of the internet or CCTV material to create or expand facial recognition databases; emotion recognition in workplaces and education institutions; biometric categorisation to deduce certain protected characteristics; and real-time remote biometric identification for law enforcement purposes in publicly accessible spaces (Art. 5 AI Act). For more detail, read here.
Providers and deployers of AI systems must also take measures to ensure that their staff, and anyone using the systems on their behalf, have a sufficient level of AI literacy (Art. 4 AI Act). This includes training on the ethical and safe use of AI systems for all employees who interact with AI systems in their professional roles. Key training topics include AI fundamentals, ethical and societal considerations, corporate responsibility, liability conditions, EU regulations, and potential discrimination by AI.
On 4 February, the European Commission published its draft "Guidelines on Prohibited AI Practices" (Art. 5 AI Act), to provide clear and practical guidance. The Guidelines are designed to ensure the consistent, effective, and uniform application of the AI Act across the EU. The European Commission has approved the draft Guidelines but has not yet formally adopted them.
The Guidelines provide background, objectives and an overview of prohibited AI practices. They give information on the material and personal scope of the AI Act and the exclusions from its scope. The Guidelines also contain detailed information on each prohibited practice (Art. 5 AI Act) in a dedicated chapter, covering the rationale and objectives of the prohibition, its main concepts and components, the interplay between the prohibitions, what falls out of scope, and the interplay with other Union law.
On 3 February, the European Commission published an updated version of its frequently asked questions (FAQs) on the EU Data Act, further developing the Data Act FAQs of 13 September 2024.
The European Commission highlights "the level of enrichment of the data is one of the key factors in achieving a balanced and fair allocation of data value". The European Commission clarifies in particular that the exclusion of "content" from the Data Act (Recital 16) refers to something akin to copyrightable material. For example, "data holders of digital cameras capable of recording, transmitting, or displaying photographs or video are required to share readily available data, such as usage patterns, battery charging levels, timestamps, location, light levels, and event logs." Data holders are, in principle, not obliged to share the audiovisual content itself. Similarly, users do not have the right to request access to and use of a motion picture/film displayed on a smart TV.
The European Commission elaborates that applying privacy enhancing technologies (PETs) to achieve anonymisation or pseudonymisation does not automatically result in such data being treated as 'derived' or 'inferred' data. Anonymisation or pseudonymisation can be relevant, for instance, where the data holder must respond to data access and data sharing requests (Art. 4 or 5 Data Act). This can be of particular relevance where the user requesting access is not the only data subject concerned, or where there are several data subjects who may all be users of the same connected product (e.g. a rented car).
On 13 February, the CJEU ruled that the term "undertaking" (which is used to calculate the maximum amount of administrative fines under Art. 83 GDPR) should be understood in line with the concept used in EU competition law.
"Undertaking" means any entity engaged in economic activity, regardless of its legal status or financing. Data protection authorities are now better equipped to impose fines that reflect the true economic capacity of infringing entities. The CJEU emphasises that fines must consider the economic unit, ensuring they are effective, proportionate, and dissuasive.
On 11 February the European Commission announced, via an annex to its 2025 work program, its intention to withdraw the ePrivacy Regulation, citing that “no agreement is expected from the co-legislators. Furthermore, the proposal is outdated in view of some recent legislation in both the technological and the legislative landscape”.
Whilst this will not be a surprise to most, given that the Regulation was supposed to come into force at the same time as the GDPR in 2018, it means that the existing legislation on electronic communications will continue to be the outdated ePrivacy Directive. Not only will this continue to result in inconsistent application and enforcement across the EU, but it will also mean that data protection lawyers must continue to advise on compliance with legislation last amended in 2009, which simply does not work for current and evolving technologies.
On 19 December 2024, the CNIL (French Data Protection Authority) fined a company in the real estate industry €40,000 for excessive surveillance of its employees, namely:
On 7 February, the CNIL published recommendations on how controllers can respect and facilitate data subjects' rights when personal data is used to develop an AI system. These recommendations offer concrete and proportionate solutions to inform individuals and facilitate the exercise of their rights, recognising that the fulfilment of data subjects' rights should not impede innovation in AI. Key points to note:
The CJEU gave judgment on 27 February 2025 in a case concerning a request for information about the logic applied in an automated credit assessment. The data subject ("CK") was denied a mobile phone contract with a telecom operator based on an automated credit assessment undertaken by Dun & Bradstreet ("D&B"). Following a request from CK, the Austrian Data Protection Authority ordered D&B to provide information on the logic applied in the automated decision-making process, but D&B refused, arguing that the information was protected by commercial secrecy. The matter was ultimately escalated to the CJEU for a preliminary ruling.
The CJEU concluded that under Article 15(1)(h) of the GDPR, individuals have the right to obtain "meaningful information about the logic involved" in automated decision-making processes, including profiling. This right includes an explanation of the procedure and principles applied to process personal data to achieve a specific result, such as a credit score. The Court also determined that if the information includes third-party data or trade secrets, it should be communicated to the competent authority or court, which will balance the rights and interests involved.
Organisations undertaking or relying on automated decisions will need to ensure they have a sufficient understanding of the decision-making models to provide adequate transparency. This includes knowledge of the extent to which a variation in the personal data taken into account would have led to a different result.
In a decision of 18 December 2024, the German Court of Appeal in Schleswig set a new standard, requiring end-to-end encryption where companies send B2C invoices, citing the need for technical and organisational measures to protect consumers against fraudulent third-party access (Art. 32 GDPR). The practical implications could be significant, as end-to-end encryption is not yet standard practice in B2C communication.
The Spanish data protection authority ("AEPD") imposed a fine of €1,200,000 on ORANGE ESPAGNE, S.A.U. ("Orange") following a complaint filed by a data subject regarding the unauthorised duplication of a SIM card. On 15 November 2022, the data subject discovered that their SIM card had been duplicated without proper identity verification, leading to identity theft and a financial loss of €9,000. The duplication was carried out by two agents at an Orange store in Madrid, who used their internal system credentials to activate the duplicate SIM card fraudulently.
The AEPD concluded that Orange failed to implement adequate technical and organisational measures to ensure the protection of personal data and breached Article 6 of the GDPR (lawfulness of processing), because the SIM card was duplicated without the claimant's consent or any other legal justification. In addition, the AEPD found that Orange failed to comply with Article 25 of the GDPR (data protection by design and by default) because the measures in place were not sufficient to prevent the fraudulent duplication of SIM cards.
The decision serves as a reminder for organisations of the importance of having robust measures and documented processes in place to protect personal data and prevent identity theft.
On 30 January, we hosted our annual data protection round-up event at our London office with speakers from our UK, French, German, Spanish and Italian teams. Many of our clients were in attendance to hear our review of the key data protection and privacy developments from the UK and EU in the past 12 months as well as our predictions for the year ahead.
We have summarised below the four key takeaways from our multi-disciplinary data team who spoke at the event. If you would like further details on any of the points discussed below, please get in touch or view our slides below.
1. Cyber security and cyber resilience: in the last 12 months, we have seen an increasing focus on cyber security and cyber resilience. For example, in the UK the ICO has been engaging with organisations that may be subject to the NIS regime as relevant digital service providers (RDSPs), and the UK government, following a consultation, is preparing a cyber resilience bill and consulting on a new ransomware reporting regime. There has also been an increase in enforcement action taken in the UK and Ireland relating to personal data breaches. In the EU, there is a suite of cyber legislation on the horizon, including the national implementation of NIS2, which will continue to bring with it increased obligations and the prospect of large financial penalties.
What does this mean? Given the fast pace of new legislation in this area, businesses should stay informed on regulatory changes and relevant data protection regulator guidance. Businesses should also develop internal cyber readiness strategies and ensure relevant procedures and internal measures are up-to-date and accurate.
2. Heightened expectations around accountability documentation: there has been increased enforcement activity in the EU against controllers of personal data where accountability documentation (e.g. records of processing activities (ROPAs), data protection impact assessments (DPIAs) and legitimate interests assessments (LIAs)) is not in place, especially where that documentation is required due to high-risk processing, for example where businesses process personal data using AI. Our experience is that accountability documentation is one of the first things a data protection regulator will ask for when engaging with businesses.
What does this mean? Up-to-date, complete, and accurate accountability documentation is one of the best tools a business has to mitigate data protection risks. Businesses should therefore consider dusting off any outdated accountability documentation and/or ensuring compliance gaps are plugged by putting in place any documentation which is currently missing.
3. Focus on supply chain responsibilities: over the last few years, a complex web of regulation has created overlapping obligations for the various participants in supply chains. Guidance from the EDPB which clarifies processor and sub-processor obligations, and which sets out the regulators' expectations of a risk-based approach to conducting sub-processor due diligence, was therefore a welcome end to 2024. We discuss this further in our October 2024 edition of Data Bytes.
What does this mean? Businesses that are controllers of personal data should consider auditing their supply chains and reviewing their processes to ensure that both are fully documented and up to date. Businesses that are processors of personal data should be prepared for more information requests from controllers and other processors in the supply chain, and for tighter contractual wording to comply with the EDPB's positions.
4. Future of the regulatory environment: the current regulatory environment is complex, to say the least, and is also showing signs of fracturing. Global businesses are currently facing overlapping regulatory remits, divergent interpretations of data protection laws, and question marks over the longevity of the EU-US adequacy decisions. It is unsurprising, therefore, that pressure is being placed on both the EU and the UK to promote growth and innovation, which has led to calls for 'de-regulation' and to several legislative initiatives in the EU being shelved, such as the AI Liability Directive and the ePrivacy Regulation.
What does this mean? As we look at the next 12 months, the future is uncertain. We will be closely following regulatory developments – if you have any questions, please reach out.
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.