Data Bytes 40: Your UK and European Data Privacy update for September 2023
04 October 2023
Welcome to our September edition of Data Bytes, where the Ashurst UK and European Data Privacy and Cyber Security Team summarise the key privacy law and policy developments of the previous month.
Dominating the privacy headlines this month is the news that companies in the UK finally have, in the words of Simon & Garfunkel and many privacy commentators, a "bridge over troubled water", allowing compliant transfers of personal data to the US where the recipients are certified under the UK extension of the EU's Data Privacy Framework (DPF). Meanwhile in Europe, the challenges to the DPF have already begun, courtesy not only of Max Schrems, who announced his intention to challenge it back in July, but also of a French member of parliament, who is challenging it in his personal capacity. The ICO itself, in welcoming the UK-US Data Bridge, has also expressed concerns about the protections offered to UK citizens by US law and asked the UK Secretary of State to keep this under review. With all this in mind, this is unlikely to be the end of the story for ensuring the compliance of transfers of personal data to the US.
Staying on the theme of international transfers, our spotlight section this month looks at the proposed new Indian data protection laws. We asked Samuel Mani, partner at Indian law firm Mani Chengappa & Mathur, what these proposed laws will mean for those companies transferring personal data to India and whether it's time to dust off those transfer risk assessments for Indian transfers and update them with some more positive news. Get your byte-sized updates here.
On 21 September 2023, the UK government laid before parliament the Data Protection (Adequacy) (United States of America) Regulations 2023 (SI 2023/1028), which will enter into force on 12 October. The Regulations constitute an adequacy decision for the US, known as the UK-US Data Bridge, which allows UK organisations to transfer personal data to US organisations listed as participating in the UK extension to the DPF without further protections, such as standard contractual clauses, in place. This is very welcome: for two months the UK has been lagging behind Europe, whose agreement with the US has been in place since July.
The ICO has highlighted in an opinion four areas of risk with the UK-US Data Bridge. In particular, the ICO noted that not all categories of special category data as defined under the UK GDPR are explicitly covered by the definition of "sensitive information" under the UK-US Data Bridge, and so are not automatically afforded additional protections. In practice, this means that UK organisations will need to identify genetic data, biometric data and sexual orientation data as "sensitive information" when sending it to US-certified organisations so that it is afforded the appropriate protection. The opinion welcomes the recommendation of the Department for Science, Innovation and Technology (DSIT) to provide guidance for organisations on this point.
The remaining three areas of risk highlighted by the ICO relate to a number of rights and protections offered to data subjects under UK law which are not offered under US law (the right not to be subject to decisions based solely on automated processing; the Rehabilitation of Offenders legislation, which places limits on the processing of spent convictions; the right to be forgotten; and the right to withdraw consent). The ICO offers no practical solutions to these, but has recommended that the Secretary of State keep these risks under review.
The practical consequence of the UK-US Data Bridge is that, from 12 October, organisations subject to the UK GDPR will be able to transfer personal data to US companies certified under the UK extension of the DPF without having to implement additional safeguards, such as the standard contractual clauses, or complete a full transfer risk assessment. Organisations can check whether a US data recipient participates in the DPF and the UK extension by searching the list at www.dataprivacyframework.gov. From an administrative perspective, UK organisations that start to rely on the UK-US Data Bridge for transfers will likely need to update privacy notices, records of processing activities and/or internal policies to take account of this.
Due to the tumultuous history of the Data Bridge's previous incarnations (Safe Harbor, Privacy Shield), we anticipate reluctance from organisations in the UK to rely on it without a back-up plan. Many organisations are likely to build triggers into their contracts whereby alternative transfer mechanisms (standard contractual clauses) spring into life if the UK-US Data Bridge is brought down, with obligations on US importers to assist in producing transfer risk assessments.
The UK government published a statutory instrument, the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023, to amend references in the UK GDPR and the UK DPA 2018 to "fundamental rights and freedoms" so that they refer to ECHR rights within the meaning of the Human Rights Act 1998, rather than retained EU law rights; the change will take effect in January 2024.
The SI recognises that retained EU law rights will no longer be recognised in UK law after the end of December 2023. Whether this will have a meaningful and detrimental impact on data protection rights from January is a moot point: there is no direct equivalent to the right to the protection of personal data in UK law (albeit it falls within the right to respect for private and family life under Article 8 of the European Convention on Human Rights, which is enshrined in UK law by the Human Rights Act 1998). For the time being, however, this is an administrative drafting point rather than any substantial change in the law, and not something for a data protection practitioner's immediate worry list.
On 12 September, the ICO signed a memorandum of understanding (MoU) with the National Cyber Security Centre (NCSC) setting out how the NCSC and the ICO plan to cooperate with one another. When organisations are hit by a cyber attack, notifying the NCSC is currently voluntary and there are no clear benefits to doing so beyond "doing the right thing". One of the key elements of the MoU is a commitment by the ICO to incentivise engagement with the NCSC, including recognising organisations affected by significant cyber incidents that report to and work with the NCSC. The ICO states that it will consider whether it can be more specific on how such engagement might factor into the calculation of regulatory fines.
It remains to be seen whether the ICO will release specific guidance on this point, for example by updating its regulatory action policy, or will instead specify in press releases and monetary penalty notices where fines are reduced due to proactive engagement with the NCSC. Organisations should consider whether their cyber response playbooks and procedures include steps to engage with law enforcement where appropriate in connection with certain cyber incidents.
After a recent ICO poll revealed that women were concerned about the security of their data on period and fertility tracking apps, the ICO announced that it would review the most popular apps to investigate how they process users' personal information. Other concerns included a noticeable increase in targeted baby/fertility-related adverts after signing up to the apps (17% of respondents described receiving these adverts as distressing) and a potential lack of transparency over the usage of their data.
The ICO is now conducting a user survey and contacting popular apps to identify if there is the potential for harm and negative impact on users.
This is a reminder of the importance that individuals and the ICO place on security practices and transparency of processing, particularly where special category data, namely health data, is involved in conjunction with marketing practices. It is also demonstrative of public awareness of targeted advertising – an area which is receiving more and more scrutiny from regulators. If you intend to undertake targeted advertising, be sure to carry out a DPIA and consider the potential for harm and negative impact on users.
On 31 August, the UK House of Commons Science, Innovation and Technology Committee published an interim report setting out its findings to date from its inquiry into the governance of AI. Whilst the report welcomes the AI White Paper published in March 2023, it finds the proposed approach to be "already risking falling behind the pace of development of AI… our view is that a tightly-focussed AI Bill in the next King’s Speech would help, not hinder, the Prime Minister’s ambition to position the UK as an AI governance leader".
Ultimately the report recommends the immediate implementation of a regulatory regime for AI (namely for legislation to be put to Parliament during its next session in November); failure to do so, it argues, would hinder the UK's positioning as a centre of AI research and practice and as a leader of world thinking on AI governance. The report also urges greater international cooperation to address the 12 challenges it identifies (discussed below), including at the upcoming AI summit at Bletchley Park taking place in November.
The report also highlights 12 essential challenges of AI governance. One of these is the "Privacy" challenge, described by the report as the use of AI to identify individuals and use their personal information "in ways beyond what the public wants". There is also the "Black Box" challenge: some AI models and tools cannot explain why they produce a particular result, which poses a challenge to transparency requirements.
This month Ashurst is holding a roundtable on "What role do privacy professionals play in the governance of AI?" and we promise to feed back the themes and discussions from this in our next edition of Data Bytes.
Thinking of using the 'Bcc' feature to send your organisation's next mass email? Think again. On 30 August the ICO issued a warning to organisations about using 'Bcc' for bulk email: failure to use it correctly is frequently among the top 10 non-cyber breaches reported to the ICO each year.
Whilst we have all grown accustomed to using 'Bcc', the ICO suggests alternatives such as bulk email services, mail merge (illustrated in the sketch at the end of this item) or secure data transfer services; the warning also accompanied new guidance on email security.
As best practice, we recommend that this guidance be shared with relevant stakeholders, such as your information security, marketing and client-facing teams, to ensure that your internal practices and methods of communication have been adequately risk assessed and deemed secure. To minimise the risk of data breaches, we further recommend that onboarding and data protection training be updated to remind colleagues of (i) the security risks that can arise in relation to email communications and (ii) your organisation's approach to sending sensitive information, including special category data, by email.
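For readers curious what the mail merge alternative looks like in practice, below is a minimal, illustrative sketch in Python using only the standard library. The SMTP host, credentials and addresses are placeholders of our own, and a production bulk email service would add rate limiting, bounce handling and unsubscribe management on top.

```python
# A minimal sketch of the "mail merge" alternative to 'Bcc': each recipient
# receives an individually addressed copy, so there is no recipient list that
# can leak if a field is mis-used. Host, sender and recipients are placeholders.
import smtplib
from email.message import EmailMessage

SENDER = "newsletter@example.org"                 # placeholder sender address
RECIPIENTS = ["a@example.com", "b@example.com"]   # placeholder recipient list

def send_individually(subject: str, body: str) -> None:
    with smtplib.SMTP("smtp.example.org", 587) as smtp:  # placeholder host
        smtp.starttls()
        # smtp.login("user", "app-password")  # credentials omitted in this sketch
        for addr in RECIPIENTS:
            msg = EmailMessage()
            msg["From"] = SENDER
            msg["To"] = addr              # one visible recipient per message
            msg["Subject"] = subject
            msg.set_content(body)
            smtp.send_message(msg)        # no 'Bcc' field, so no list to expose

send_individually("October update", "Dear subscriber, ...")
```

The design point is simply that each message carries only its own recipient's address, removing the most common Bcc failure mode: a list of addresses pasted into the 'To' or 'Cc' field by mistake.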
On 31 August 2023, the ICO published guidance addressing data protection obligations where employers are handling the health data of their workers. The guidance is divided into two parts, with the first part providing an overview of compliance basics such as the purpose of processing, transparency, lawful basis and the applicable conditions under Article 9 UK GDPR. The second part contains deep dives and checklists on common processing scenarios involving workers' health data, such as handling sickness and injury records, occupational health schemes, health monitoring and the sharing of worker health data.
To help organisations understand the law and good practice in this area, the ICO specifies throughout the guidance the legislative requirements that must be complied with, the good practice that should be followed unless there is a reason not to, and the suggested options for compliance that could be considered. This signposting of different regulatory expectations is a new approach for the ICO, and we expect a similar format to be used in future ICO guidance.
On 1 September 2023, the Irish Data Protection Commission (DPC) issued its final decision against TikTok Technology Limited (TTL) for the unlawful processing of personal data collected through children's user accounts. The DPC issued a fine of €345 million and ordered TTL to bring its data processing into compliance within three months. The administrative fine follows one by the ICO of £12.7 million in April 2023 for the unlawful processing of the data of children under 13. In the current case, the DPC imposed the fine as lead supervisory authority for TTL's operations in the EU. It held that TTL – over a six-month period from July to December 2020 – had failed to provide sufficient privacy information and effective age verification measures, and had manipulated user choices by implementing dark patterns. In its decision, the DPC criticised certain TikTok platform settings, including the weak "Family Pairing" feature and the fact that children's accounts were made public by default. The fine is in line with increasingly stringent enforcement action by the DPC against social media providers: in May 2023, the DPC issued a fine of €1.2 billion against Meta for the unlawful processing of user data and, earlier, in September 2022, it issued a €405 million fine against Instagram.
On 30 August 2023, the German Federal Government issued its new data strategy paper "Progress through Data Usage", describing a series of legislative and other actions building on EU digital economy legislation for the period until the end of 2024.
The government emphasised the need to increase the availability of open data on its GovData.de portal, including through the standardisation of metadata for easier accessibility. It also plans to improve the enforceability of open data access rights against government institutions at all levels through a Federal Transparency Act, to provide more data about public infrastructure, and to enact a Mobility Data Act for better availability and quality of data on travel information and traffic infrastructure, together with better data quality and clear data provision and usage rules for government and private sector stakeholders. Government institutions are to follow the "FAIR principles" ("findable", "accessible", "interoperable" and "re-usable").
Publicly funded research and development projects will generally be subject to increased transparency obligations, making the data collected and obtained in research and science available to the wider public (including raw data, processed data and metadata), in alignment with an EU-wide standard for making such data available through a largely automated collection of metadata. The federal government will also legislate a Research Data Act to improve access to and usability of health data, including greater ease in using pseudonymised data in protected environments on an opt-out basis. In addition, it will issue guidelines on the anonymisation and pseudonymisation of personal data for research purposes and for sector-specific research, to facilitate the cross-combination and correlation of data for research purposes – supplementing the general guidelines on anonymisation and pseudonymisation that the EDPB announced some time ago but which, unfortunately, are still outstanding.
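For context on what pseudonymisation involves at a technical level, the sketch below shows one common approach: replacing a direct identifier with a keyed hash (HMAC-SHA256). This is our own illustrative example, not a method prescribed by the strategy paper or the EDPB, and the key handling and field names are placeholders.

```python
# An illustrative pseudonymisation sketch: a direct identifier is replaced by
# a keyed hash (HMAC-SHA256). The secret key is held separately from the
# research dataset; without it, pseudonyms cannot be re-linked to individuals.
# The key and record fields below are placeholders.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-key-from-a-secure-store"  # placeholder key

def pseudonymise(identifier: str) -> str:
    """Return a stable pseudonym for a direct identifier (e.g. a patient ID)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"patient_id": "ID-1234567", "diagnosis": "..."}
record["patient_id"] = pseudonymise(record["patient_id"])  # same input, same pseudonym
```

Because the same input always yields the same pseudonym, records can still be correlated across datasets for research purposes – which is also why pseudonymised data remains personal data under the GDPR.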
The federal government will also amend German data protection law to formally institutionalise the Data Protection Conference (the working group of the Federal and Laender data protection authorities) and to improve alignment between the various data protection authorities. Greater use of opt-out settings and further emphasis on the self-regulatory activities of data protection officers are also among the intended measures within the framework.
The federal government emphasised the need to improve data sharing, such as through the planned European Health Data Space, as well as in other data spaces such as Catena-X and Manufacturing-X, which are currently being established.
Lastly, the Federal Government intends to pass a Federal Employee Data Protection Act by the end of 2023.
Overall, the package of measures promises a significant boost for the digital economy in Germany and may well trigger follow-on effects in other EU jurisdictions.
On 18 September 2023, the European Commission (EC) and China held their second "High-level Digital Dialogue", discussing data regulation, artificial intelligence and the cross-border flow of industrial data. Both parties agreed an action plan with a particular focus on the safety of products sold online, fostering consumer protection and relaunching the Information Communication Technologies (ICT) dialogue. The EC emphasised the importance of the ethical use of AI technologies in full respect of universal human rights and urged the Chinese authorities to ensure a fair, reciprocity-based business environment in the digital field. In particular, it expressed concern about recent Chinese legislation that prevents EU companies in China from using their industrial data.
On 18 September 2023, the Danish data protection authority (Datatilsynet) published guidance to prevent civil servants and other employees of public authorities from accessing the personal data of citizens without proper authorisation. The guidance is intended to underpin the general public's confidence in the legality and proper handling of data processing by public authorities. Datatilsynet requires public authorities to conduct a risk assessment and determine the necessary security measures before assigning access rights to members of staff. Additionally, Datatilsynet requires those public authorities to continuously log their employees' use of IT systems and to maintain controls to detect abusive conduct by such employees. Datatilsynet emphasises the need to inform employees about the existing control measures and the possible enforcement actions and disciplinary measures that could follow from not observing the guidance.
On 6 September 2023, Mr Latombe, a Member of the French Parliament and a member of the French Data Protection Authority (CNIL), instituted proceedings under Art. 263 para. 4 TFEU directly before the Court of Justice of the European Union (CJEU) for annulment of the EU-US Data Privacy Framework. Mr Latombe said he was acting "in personal capacity, as a simple citizen of the Union, and not as a French MP, Law Commissioner or CNIL Commissioner." He criticised the lack of guarantees of an effective remedy and the lack of transparency of the newly created Data Protection Review Court (DPRC). Furthermore, he highlighted infringements of GDPR principles: in his view, the principles of data minimisation and proportionality are violated by the bulk data collection of the US intelligence services. He also criticised the lack of translation of the Privacy Framework into EU national languages. The CJEU's decision is expected to be a cornerstone of its case law, with a crucial impact on transatlantic data transfers.
We asked Samuel Mani, partner at Indian law firm Mani Chengappa & Mathur, some questions around the potential impact of the proposed Digital Personal Data Protection Act on companies based outside India and those international organisations with establishments in India.
Does the Digital Personal Data Protection Act have any extra-territorial effect and, if so, what are the key trigger points?
Yes, the Digital Personal Data Protection Act has extra-territorial reach. The Act applies to the processing of digital personal data outside India if the processing is in connection with any activity related to the offering of goods or services to Data Principals who are in India.
What are the practical changes that organisations with establishments in India will need to implement?
For the most part, the Digital Personal Data Protection Act casts the obligation of compliance on "Data Fiduciaries".
Data Fiduciaries will have to take steps to demonstrate that they have implemented measures to comply with the provisions of the Act. Such measures will include adopting data privacy policies that specifically cover the requirements of the Act (including the requirement to implement reasonable security safeguards) and ensuring that compliance with those policies is tracked and periodically audited. Larger Data Fiduciaries will also find it helpful to establish functions within their organisation tasked with administering the requirements of the Act.
Data Fiduciaries will also have to ensure that they conduct adequate data privacy due diligence of the Data Processors they appoint and will have to contractually flow down compliance obligations to the Data Processors.
Our clients have been obliged to undertake transfer risk assessments in relation to transfers of personal data to outsourced providers in India. How will these new laws impact those assessments?
All existing transfer risk assessments will have been conducted against the existing data privacy regime in India, which is very bare-bones. The new Act contains definitive and comprehensive compliance requirements. For these reasons, we believe that existing data transfer risk assessments will have to be updated. The bottom line is that the new Act should instil a lot more confidence in anyone looking to transfer personal data to India.
Author: Samuel Mani, Partner, Mani Chengappa & Mathur.
Authors: Rhiannon Webster, Partner; Alexander Duisberg, Partner; Shehana Cameron-Perera, Senior Associate; Tom Brookes, Associate; David Plischka, Associate; Emily Jones, Solicitor
The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
Readers should take legal advice before applying it to specific issues or transactions.