
Data Bytes 55: Your UK and European Data Privacy update for February 2025


    Data Bytes team

    Welcome to this month’s issue of Data Bytes. Just over a month ago, the Data Bytes team hosted our annual data round-up event at our London office. Keep scrolling down to our spotlight section where we have summarised the key updates we spoke about at that event and our predictions for the year ahead.

    Back to February: it was a big month for the AI Act, with the first operative provisions regarding AI literacy and prohibited systems applying. We also saw the curtains finally close on the draft ePrivacy Regulation, and we wait to hear whether any e-privacy phoenix arises from those ashes, although it doesn't feel imminent. Here in the UK, the Data (Use and Access) Bill has hit the House of Commons and it feels more and more likely that, after a number of false starts, we will soon have a new piece of data protection legislation in the UK.

    UK Updates

    1. Status of the Data (Use and Access) Bill

    The Data (Use and Access) Bill (the Bill), which was introduced to Parliament in October 2024, completed its House of Lords stages on 5 February and is moving through the House of Commons, where it is currently at the Committee Stage.

    The Bill aims to "harness the enormous power of data to boost the UK economy by £10 billion" and "unlock the secure and effective use of data for the public interest".

    Although tweaked in structure, it mirrors, for all intents and purposes, many of the concepts and provisions that were in the previously abandoned Data Protection and Digital Information Bill. Notable aspects of the Bill, some of which were introduced or amended during its passage through the House of Lords, are:

    • Legitimate interests: a new list of ‘recognised legitimate interests’ which would remove the need for the balancing test, along with clarification on processing activities that may be deemed ‘legitimate interests’.
    • Relaxing of restrictions on automated decision-making (ADM): the Bill intends to remove the general restriction on ADM with legal or significant effects, which would widen the permitted use of personal data in ADM technologies.
    • PECR: an increase in the maximum fine that the ICO can impose for PECR violations to align with GDPR monetary fines; removal of the consent requirement for certain purposes; and extension of the soft opt-in rule to the charity sector.
    • A focus on youth protections: the ICO would be required to focus on children's specific data needs, and article 25 UK GDPR (data protection by design and default requirements) would be amended to require controllers to take account of "higher protection matters" when processing children's data in online services and when assessing what are appropriate technical and organisational measures.
    • Codes of practice: a mandate for the ICO to develop codes of practice on ADM and AI, and on educational technology (ed-tech).
    • Web crawling: the imposition of stringent obligations on operators of web crawlers and AI developers when using data scraped from the web to develop, train and fine-tune general-purpose AI models; namely:
        • to be transparent about their use of the data, the identity of the web crawlers and the purposes for which each crawler is used, and to disclose information about the web crawling, including the sources of data used;
        • to comply with UK copyright law irrespective of the jurisdiction in which the web scraping takes place; and
        • to ensure that the exclusion of a crawler by a copyright owner does not negatively impact the findability of the copyright owner’s content in a search engine.
    • Scientific research definition: the UK GDPR is amended to introduce a new broad definition of scientific research, covering any research that can reasonably be considered scientific, conducted in the public interest, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity.
    • Deepfakes: the Sexual Offences Act 2003 is amended to make it an offence to create, or to solicit the creation of, a purported intimate image of an adult.

    We will keep a watching brief on how many of these changes are maintained by the House of Commons.

    2. ICO launches new direct marketing advice generator

    On 5 February, the ICO launched its 'direct marketing advice generator', a free online tool which aims to help organisations ensure their direct marketing activities comply with the Privacy and Electronic Communication Regulations (PECR) and the UK GDPR.

    Following the completion of a short questionnaire, this tool provides tailored compliance advice on marketing activities via email, SMS, direct mail, social media and telemarketing.

    Given that marketing activities and PECR non-compliance are the main drivers of ICO enforcement action and monetary penalties, this will be a useful tool. We recommend that:

    • any small business use this as its first port of call for marketing activities; and
    • Data Privacy Teams in any organisation share the tool with their marketing teams to increase awareness of what compliant marketing looks like.

    3. Cyber security and systems safety – UK Code of Practice issued

    On 31 January, the UK Government published the Code of Practice for the Cyber Security of AI (the Code), which is accompanied by a supplementary implementation guide (Implementation Guide).

    The Code is a set of voluntary principles focused on the cybersecurity of AI systems, including generative AI, and is primarily targeted at system operators, developers and data custodians in the AI supply chain. It is centred on 13 principles which map onto five phases reflecting the AI lifecycle: Design, Development, Deployment, Maintenance and End of Life.

    The UK Government plans to submit the Code and the Implementation Guide through the European Telecommunications Standards Institute, to help create a future global standard of baseline cybersecurity requirements for AI. For more details about the Code, see our briefing insight here.

    4. ICO publishes finalised guidance on processing of employment records

    On 5 February 2025, the ICO published its finalised guidance on the management of employment records to help employers navigate their obligations under UK data protection law. A draft version of the guidance had been available on the ICO website since a consultation in 2023.

    The guidance covers various aspects of managing workers' personal information and serves as a useful reminder of common issues that can arise, including reliance on consent in an employment context, the application of different retention periods to employee data, and the sharing of employment records with third parties.

    One area of focus in the guidance is the sharing of information in the context of corporate transactions. During a merger or acquisition, employers are reminded to carefully consider the lawful bases for information sharing as part of due diligence and to maintain transparency with workers where possible. In addition, the ICO notes that where information shared with prospective employers exceeds TUPE requirements, employers must have another lawful basis for processing and appropriate safeguards in place to ensure unsuccessful bidders only use the information in connection with the proposed business transfer.

    5. ICO publishes response to the eligibility verification measure proposed for the DWP

    The Public Authorities (Fraud, Error and Recovery) Bill (the Fraud Bill) was introduced to Parliament on 22 January 2025 and would give the Department for Work and Pensions (the DWP) powers to recover losses resulting from fraud and error.

    The Fraud Bill introduces an eligibility verification measure (EVM) which empowers the DWP to give qualifying financial organisations an eligibility verification notice requiring them to identify certain accounts which receive benefits, assess the likelihood of incorrect payments and share those account details with the DWP.

    The ICO has published its response to the EVM, and it is worth noting that even the government:

    • must determine whether the EVM is necessary, proportionate and fair (the ICO welcomed the narrowed scope of data that can be shared under the EVM); and
    • must balance its legitimate aim (i.e. the recovery of monies lost to fraud and error) against individuals’ rights.

    EU Updates

    1. Rules on prohibited practices and AI literacy under the AI Act now applicable

    As of 2 February, the first rules under the AI Act have started to apply, including the prohibition of certain AI practices and the AI literacy obligations.

    Prohibited practices include: harmful AI-based manipulation and deception, harmful AI-based exploitation of vulnerabilities, social scoring, individual criminal offence risk assessment or prediction, untargeted scraping of the internet or CCTV material to create or expand facial recognition databases, emotion recognition in workplaces and education institutions, biometric categorisation to deduce certain protected characteristics, and real-time remote biometric identification for law enforcement purposes in publicly accessible spaces (Art. 5 AI Act). For more detail read here.

    Providers and deployers of AI systems must also take measures to ensure that their staff, and anyone using the systems on their behalf, have a sufficient level of AI literacy (Art. 4 AI Act). This includes training on the ethical and safe use of AI systems for all employees who interact with AI systems in their professional roles; key training topics include AI fundamentals, ethical and societal considerations, corporate responsibility, liability conditions, EU regulations, and potential discrimination by AI.

    2. The European Commission issues its Guidelines on Prohibited AI Practices

    On 4 February, the European Commission published its draft "Guidelines on Prohibited AI Practices" (Art. 5 AI Act), to provide clear and practical guidance. The Guidelines are designed to ensure the consistent, effective, and uniform application of the AI Act across the EU. The European Commission has approved the draft Guidelines but has not yet formally adopted them.

    The Guidelines provide background, objectives and an overview of prohibited AI practices, and give information on the material and personal scope of the AI Act and the exclusions from its scope. Each prohibited practice (Art. 5 AI Act) is then addressed in detail in a dedicated chapter covering the rationale and objectives of the prohibition, its main concepts and components, the interplay between the prohibitions, what falls out of scope, and the interplay with other Union law.

    3. European Commission publishes an update to its EU Data Act FAQs

    On 3 February, the European Commission published an updated version of its frequently asked questions (FAQs) on the EU Data Act, further developing the version published on 13 September 2024.

    The European Commission highlights that "the level of enrichment of the data is one of the key factors in achieving a balanced and fair allocation of data value". The European Commission clarifies in particular that the exclusion of "content" from the Data Act (Recital 16) refers to something akin to copyrightable material. For example, "data holders of digital cameras capable of recording, transmitting, or displaying photographs or video are required to share readily available data, such as usage patterns, battery charging levels, timestamps, location, light levels, and event logs." Data holders are, in principle, not obliged to share the audiovisual content itself. Similarly, users do not have the right to request access to and use of a motion picture/film displayed on a smart TV.

    The European Commission elaborates that applying privacy enhancing technologies (PETs) to achieve anonymisation or pseudonymisation does not automatically result in such data being treated as ‘derived’ or ‘inferred’ data. Anonymisation or pseudonymisation can be relevant, for instance, where the data holder must respond to data access and data sharing requests (Art. 4 or 5 Data Act). This can be of particular relevance where the user requesting access is not the only data subject concerned, or where there are several data subjects who may all be users of the same connected product (e.g. a rented car).

    4. Concept of "undertaking" in Art. 83 GDPR (CJEU C-383/23)

    On 13 February, the CJEU ruled that the term "undertaking" (which is used to calculate the maximum amount of administrative fines under Art. 83 GDPR) aligns with, and should be understood in line with, the concept used in EU competition law.

    "Undertaking" means any entity engaged in economic activity, regardless of its legal status or financing. Data protection authorities are now better equipped to impose fines that reflect the true economic capacity of infringing entities. The CJEU emphasises that fines must consider the economic unit, ensuring they are effective, proportionate, and dissuasive.

    5. ePrivacy Regulation to be withdrawn by the European Commission

    On 11 February, the European Commission announced, via an annex to its 2025 work programme, its intention to withdraw the ePrivacy Regulation, citing that “no agreement is expected from the co-legislators. Furthermore, the proposal is outdated in view of some recent legislation in both the technological and the legislative landscape”.

    Whilst this will not be a surprise to most, given that the Regulation was supposed to come into force at the same time as the GDPR in 2018, it means that the existing legislation on electronic communications will continue to be the outdated ePrivacy Directive. Not only will this continue to result in inconsistent application and enforcement across the EU, but it will also mean that data protection lawyers must continue to advise on compliance with legislation from 2009, which simply does not work for current and evolving technologies.

    Updates from France

    1. CNIL cracks down on employee surveillance

    On 19 December 2024, the CNIL (French Data Protection Authority) fined a company in the real estate industry €40,000 for excessive surveillance of its employees, namely:

    • Excessive employee surveillance: the video surveillance system, consisting of two cameras, continuously captured real-time audio and video of employees in their workspace, which also served as a break area. The company was found to be unable to justify the continuous nature of the surveillance, which violated the data minimisation principle.
    • Implementation of workstation surveillance software: the software was used to measure working time and productivity, tracking periods of "inactivity" in which there had been no keyboard or mouse activity for 3 to 15 minutes. These periods were used to justify pay deductions unless the employee could account for them, even though the software could not recognise that periods without computer use might still be productive (e.g. meetings, phone calls). The software, which was intrusive in nature and captured regular screenshots containing personal data, was found to have violated employees' privacy rights and to lack an appropriate legal basis.
    • Failure to inform employees: the company did not provide sufficient written information to employees about the surveillance software; informing them verbally was deemed insufficient due to its undocumented nature, meaning the company had not fulfilled its transparency obligations.
    • Data security failure: The company allowed shared access to an administrator account for viewing surveillance data, which compromised traceability and security (Article 32 GDPR).
    • Failure to conduct a data protection impact assessment.

    2. CNIL publishes new recommendations on facilitating data subject rights in AI systems

    On 7 February, the CNIL published recommendations on how controllers can respect and facilitate data subjects’ rights when their personal data is used to develop an AI system. These recommendations offer concrete and proportionate solutions to inform individuals and facilitate the exercise of their rights, recognising that the fulfilment of data subjects’ rights should not impede innovation in AI. Key points to note:

    • The CNIL acknowledges that responses to data subject rights requests will differ depending on whether they relate to training data or to data ingested by the model once deployed, and recommends that responses address this distinction.
    • When personal data is used to develop and train an AI model and may potentially be memorised by it, the individuals concerned must be informed. The CNIL acknowledges that methods of informing individuals about the use of their personal data in AI models can be adapted based on risks and operational constraints. The GDPR allows for general information to be provided in certain cases, such as when AI models are built from third-party data and direct contact with individuals is not feasible. For models using multiple data sources, providing overall information about the sources is generally sufficient.
    • The CNIL acknowledges that whilst a controller may not be able to identify the data subjects on its own (due to the nature of training sets), this does not mean that such identification is impossible; controllers should therefore allow data subjects to provide additional information (e.g. an image or a pseudonym such as a username under which the data subject has online activity) to assist with the identification process.
    • When web scraping, the CNIL encourages technical solutions such as opt-out mechanisms, which would facilitate compliance with objection requests (see the illustrative sketch after this list).
    • Stakeholders must pay particular attention to personal data in training datasets, by striving to anonymise models where this does not conflict with the intended objective, and by developing innovative solutions to prevent the disclosure of confidential personal data by the model.
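    By way of illustration only, and not something the CNIL prescribes, a minimal sketch of the kind of opt-out mechanism envisaged is a scraping pipeline that honours robots.txt exclusions before collecting pages for AI training. The crawler name and URLs below are hypothetical; only the Python standard library is used:

        # Minimal sketch: honour robots.txt opt-outs before scraping for AI training.
        # "ExampleTrainingBot" and the URLs below are hypothetical.
        from urllib import robotparser

        USER_AGENT = "ExampleTrainingBot"

        def may_collect(page_url: str, robots_url: str) -> bool:
            """Return True only if the site's robots.txt permits fetching page_url."""
            rp = robotparser.RobotFileParser()
            rp.set_url(robots_url)
            rp.read()  # fetch and parse the site's robots.txt
            return rp.can_fetch(USER_AGENT, page_url)

        if may_collect("https://example.com/profiles/jdoe", "https://example.com/robots.txt"):
            print("permitted: fetch and add to the training corpus")
        else:
            print("excluded by robots.txt: respect the opt-out and skip this page")

    Honouring exclusions at collection time in this way would also make it easier to evidence compliance if an objection request is received later.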

    3. The CJEU rules on automated data decisions and the right to meaningful information

    The CJEU gave judgment on 27 February 2025 in a case concerning a request for information about the logic applied in an automated credit assessment. The data subject (“CK”) was denied a mobile phone contract by a telecom operator based on an automated credit assessment undertaken by Dun & Bradstreet (“D&B”). Following a request from CK, the Austrian Data Protection Authority ordered D&B to provide information on the logic applied in the automated decision-making process, but D&B refused, arguing that the information was protected by commercial secrecy. The matter was ultimately escalated to the CJEU for a preliminary ruling.

    The CJEU concluded that under Article 15(1)(h) of the GDPR, individuals have the right to obtain "meaningful information about the logic involved" in automated decision-making processes, including profiling. This right includes an explanation of the procedure and principles applied to process personal data to achieve a specific result, such as a credit score. The Court also determined that if the information includes third-party data or trade secrets, it should be communicated to the competent authority or court, which will balance the rights and interests involved.

    Organisations undertaking or relying on automated decisions will need to ensure they have sufficient understanding of the decision making models to provide adequate transparency. This includes knowledge of the extent to which a variation in personal data taken into account would have led to a different result.

    Updates from Germany

    1. German court requires end-to-end encryption for B2C invoices sent by email (OLG Schleswig, 12 U 9/24)

    In a decision of 18 December 2024, the German Court of Appeal in Schleswig set a new standard, requiring end-to-end encryption where companies send B2C invoices by email, pointing to the need for technical and organisational measures to protect consumers against fraudulent third-party access (Art. 32 GDPR). The practical implications could be significant, as end-to-end encryption is not yet standard practice in B2C communication.

    Updates from Spain

    1. AEPD fines Orange EUR 1,200,000 for ID verification failures

    The Spanish data protection authority (“AEPD”) imposed a fine of EUR 1,200,000 on ORANGE ESPAGNE, S.A.U. (“Orange”) following a complaint filed by a data subject regarding the unauthorised duplication of a SIM card. On 15 November 2022, the data subject discovered that their SIM card had been duplicated without proper identity verification, leading to identity theft and a financial loss of €9,000. The duplication was carried out by two agents at an Orange store in Madrid, who used their internal system credentials to activate the duplicate SIM card fraudulently.

    The AEPD concluded that Orange failed to implement adequate technical and organisational measures to ensure the protection of personal data and breached Article 6 of the GDPR (lawfulness of processing), because the SIM card was duplicated without the claimant's consent or any other legal justification. In addition, the AEPD found Orange had failed to comply with Article 25 of the GDPR (data protection by design and by default), because the measures in place were not sufficient to prevent the fraudulent duplication of SIM cards.

    The decision serves as a reminder for organisations about the importance of having robust measures and documented processes in place to protect personal data and prevent identity theft.

    Spotlight

    On 30 January, we hosted our annual data protection round-up event at our London office with speakers from our UK, French, German, Spanish and Italian teams. Many of our clients were in attendance to hear our review of the key data protection and privacy developments from the UK and EU in the past 12 months as well as our predictions for the year ahead.

    We have summarised below the four key takeaways from our multi-disciplinary data team who spoke at the event. If you would like further details on any of the points discussed below, please get in touch or view our slides below. 

    1. Cyber security and cyber resilience: in the last 12 months, we have seen an increasing focus on cyber security and cyber resilience. For example, in the UK the ICO has been engaging with organisations which may be subject to the NIS regime as relevant digital service providers (RDSPs), and the UK government, following a consultation, is preparing a cyber resilience bill and consulting on a new ransomware reporting regime. There has also been an increase in enforcement action in the UK and Ireland relating to personal data breaches. In the EU, a suite of cyber legislation is on the horizon, including the national implementation of NIS2, which will bring with it increased obligations and the prospect of large financial penalties.

    What does this mean? Given the fast pace of new legislation in this area, businesses should stay informed on regulatory changes and relevant data protection regulator guidance. Businesses should also develop internal cyber readiness strategies and ensure relevant procedures and internal measures are up-to-date and accurate.

    2. Heightened expectations around accountability documentation: there has been increased enforcement activity in the EU for controllers of personal data where accountability documentation (e.g., ROPAs, DPIAs, and LIAs) is not in place, especially if that documentation is required due to high-risk processing, for example businesses which process personal data using AI. Our experience is that accountability documentation is one of the first things a data protection regulator will ask for when engaging with businesses.

    What does this mean? Up-to-date, complete, and accurate accountability documentation is one of the best tools a business has to mitigate data protection risks. Businesses should therefore consider dusting off any outdated accountability documentation and/or ensuring compliance gaps are plugged by putting in place any documentation which is currently missing.

    3. Focus on supply chain responsibilities: over the last few years a complex web of regulation has created overlapping obligations on various participants in supply chains. Guidance from the EDPB which clarifies processor and sub-processor obligations, and which sets out the regulators' expectations of a risk-based approach to conducting sub-processor due diligence, was therefore a welcome end to 2024. We discuss this further in our October 2024 edition of Data Bytes.

    What does this mean? Businesses that are controllers of personal data should consider auditing their supply chains and reviewing their processes to ensure that both are fully documented and up to date. Businesses that are processors of personal data should be prepared for more information requests from controllers / other processors in the supply chain and tighter contractual wording to comply with the EDPB's positions.

    4. Future of the regulatory environment: the current regulatory environment is complex, to say the least, and is showing signs of fracturing. Global businesses currently face overlapping regulatory remits, divergent interpretations of data protection laws, and question marks over the longevity of the EU-US adequacy decision. It is unsurprising, therefore, that pressure is being placed on both the EU and UK to promote growth and innovation, which has led to calls for 'de-regulation' and to several legislative initiatives in the EU being shelved, such as the AI Liability Directive and the ePrivacy Regulation.

    What does this mean? As we look at the next 12 months, the future is uncertain. We will be closely following regulatory developments – if you have any questions, please reach out.

    Data Protection Round Up 2024


    Download Report

    The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
    Readers should take legal advice before applying it to specific issues or transactions.