Data revolutions and privacy scares
In-house counsel have been kept on their toes over the past year by numerous global data protection and privacy challenges, not least the growth of artificial intelligence. In-House Perspective takes stock of what’s on the agenda.
The EU’s General Data Protection Regulation (GDPR) has become the international standard on privacy. Yet in the six years since its implementation, the global data protection landscape hasn’t necessarily become easier to navigate. The challenges posed by new technologies, the cybersecurity threats inherent to a digital economy and evolving legal requirements mean that the debate about how best to protect citizens’ data continues.
‘The global privacy landscape is very complex,’ says Elisa Henry, Vice Chair of the IBA Technology Law Committee and Director of Global Privacy at WSP Global in the Netherlands. ‘To be properly navigated, a good understanding of the underlying technology and a true risk-based approach are necessary, since total compliance with all privacy laws is extremely complicated to achieve and requires true specialists, dedicated to this particular area of law.’
The coming revolution
The past year has seen artificial intelligence (AI) dominate the global data protection and privacy landscape. As AI has evolved, presenting both opportunities and risks, companies, governments and regulators have responded.
Given that AI technology is trained on huge amounts of data, including personal data, the output of AI models will have a bearing on the privacy and data protection rights of individuals. As a result, in-house lawyers will have to get to grips with the interplay between data privacy and AI laws, as well as their own internal company policies on the use of the technology. Henry explains that a top priority for in-house counsel is quickly adapting compliance frameworks to reflect the specific challenges AI brings.
The AI revolution has dominated the privacy discussion in the past year, says Matthias Orthwein, a partner at SKW Schwarz in Munich. He outlines the issues involved, which include the use of personal data for training AI models and systems; its use for automatic or automatically supported decision making; potential infringements of privacy rules caused by the output of AI applications; the need for more explainable AI in order to fulfil the legal rights of data subjects to transparency and control of the use of their personal data; and the use of AI tools for GDPR compliance work. ‘All of these discussions have primarily evolved since December 2022 and during the past year,’ explains Orthwein.
In terms of how legislators and regulators are responding to AI, the EU Artificial Intelligence Act, which aims to foster the responsible development and deployment of AI in a manner that respects fundamental rights, entered into force on 1 August 2024. Several years in the making, it’s considered to be the first piece of AI-focused legislation in the world. The EU Council believes that the legislation is capable of setting a new global standard for AI regulation, while promoting the European approach to tech regulation on a global scale.
The AI Act seeks to address potential risks to citizens’ fundamental rights, including the rights to privacy and to the protection of personal data, by setting out obligations concerning specific uses of AI and requirements placed on providers, importers and users of such systems according to the level of risk posed.
Elsewhere, in July Brazil’s National Data Protection Authority ordered Meta to temporarily suspend the training of its AI models with the personal data of Brazilian users, after the regulator found preliminary indications of violations of the country’s General Personal Data Protection Law. In response, Meta described the Authority’s decision as ‘a step backwards for innovation, competition in AI development and further delays bringing the benefits of AI to people in Brazil’. The company added that its approach complies with local privacy laws.
Also in July, the European Data Protection Board (EDPB) adopted Statement 3/2024, which relates to the role of data protection authorities in the AI Act framework. It makes clear that EU data protection law is fully applicable to the processing of personal data involved in the lifecycle of AI systems and that the EDPB has already begun an examination of AI’s interplay with EU data protection law. To develop an enforcement framework, the EDPB requests that the national data protection authorities (DPAs) of each Member State also be designated as the competent national authorities within the meaning of the AI Act, given their experience and expertise in developing guidelines and best practices, and in carrying out enforcement actions on AI-related issues involving the processing of personal data, at both national and international level.
More specifically, the EDPB recommends that DPAs should be designated by Member States as market surveillance authorities (MSAs) within the meaning of the AI Act for the high-risk AI systems mentioned in Article 74(8) of the legislation, as well as for those listed in its Annex III. The EDPB Statement also highlights the need for the EU AI Office, which is part of the European Commission, to cooperate with national DPAs and the EDPB on issues relating to the processing of personal data.
Lisandro Frene, Chair of the IBA Platforms, E-Commerce and Social Media Subcommittee and a partner at Richards, Cardinal, Tutzer, Zabala & Zaefferer in Buenos Aires, says that all companies are technology companies now and that data is the fuel for such technology. ‘Thus, in-house counsel will have to look beyond data privacy laws and start diving into other fields of law that combine with data privacy laws when dealing with data in particular industries,’ he says.
“In-house counsel will have to start diving into other fields of law that combine with data privacy laws when dealing with data in particular industries
Lisandro Frene
Chair, IBA Platforms, E-Commerce and Social Media Subcommittee
This view is reiterated by the EDPB’s Statement 3/2024, where it says, ‘in fact, the processing of personal data (which is often strictly intertwined with non-personal data) along the lifecycle of AI systems ‒ and particularly along the lifecycle of those AI systems presenting a high risk to fundamental rights ‒ clearly is (and will continue to be) a core element of the various technologies covered under the umbrella of the AI definition, as enshrined in Article 3(1) AI Act.’
Picking up the Shield
Another notable privacy-related development is the adoption of the new EU–US Data Privacy Framework to normalise transfers of personal data between the two jurisdictions. The Framework was created to rectify issues identified by the Court of Justice of the European Union (CJEU) when it declared the European Commission’s Privacy Shield invalid in a ruling in 2020.
The European Commission finalised its much-anticipated Implementing Decision pursuant to Regulation (EU) 2016/679 on the adequacy of data privacy arrangements between the EU and the US in summer 2023, specifically addressing the shortcomings in protection against surveillance by US intelligence and providing judicial redress for EU residents. Notably, the new Framework places limits on the ability of US intelligence agencies to access the data of EU citizens when it’s transferred to the US and gives EU citizens scope to raise complaints about the way their data is handled before the Data Protection Review Court.
The EDPB published an information note on data transfers to the US under the GDPR after the adoption of the adequacy decision, seeking to provide clarity on the implications for data subjects in the EU and entities transferring data from the bloc. The information note addresses five broad questions on the practicalities of the new regime and the redress mechanism, as well as details of the first review of the effectiveness and implementation of the decision a year after its entry into force.
‘Luckily, we have gained the final and long-awaited piece of clarity and assurance for users and providers of cloud-based solutions that privacy is not meant to block innovation and new business models,’ says Orthwein. ‘The introduction of the EU–US Data Privacy Framework has cleared up the issues and ended a lot of discussions pertaining to the transfer of personal data to the US.’ He adds that while there are still a number of questions to be answered, the debate has substantially calmed down.
“The introduction of the EU–US Data Privacy Framework has cleared up the issues and ended a lot of discussions pertaining to the transfer of personal data to the US
Matthias Orthwein
Partner, SKW Schwarz
Cybersecurity and other tales
With numerous elections taking place around the world in 2024, the interplay between democracy and data protection and cybersecurity measures has been highlighted. In July, the UK Information Commissioner’s Office (ICO) issued a reprimand to the Electoral Commission for an incident that took place in 2021, in which hackers gained access to servers containing the personal data of approximately 40 million people. The ICO’s statement explains that the hackers had access to the data for over a year, after exploiting a known software vulnerability in the Electoral Commission’s Microsoft Exchange Server that had not been secured. The ICO’s investigation found that the Electoral Commission didn’t have ‘appropriate security measures in place’ to protect the personal information it held, didn’t ensure that the latest security updates were implemented and didn’t have sufficient password policies in place at the time of the attack. The Electoral Commission, the ICO added, has since undertaken numerous remedial steps to improve its security, including implementing a plan to modernise its infrastructure, as well as password policy controls and multi-factor authentication for all users.
‘Cybersecurity incidents are affecting almost all companies worldwide,’ says Frene. ‘You have the hacked companies and the ones that will be hacked: it is almost a certainty that it will happen to your company. The position of in-house counsel is crucial because, when a cybersecurity incident happens, they have to decide the course of action in a matter of hours.’
“The position of in-house counsel is crucial because, when a cybersecurity incident happens, they have to decide the course of action in a matter of hours
Lisandro Frene
Chair, IBA Platforms, E-Commerce and Social Media Subcommittee
The threats faced by companies and governments are relentless, increasing in both number and sophistication. In the UK in July, the new government announced during the King’s Speech – which opens parliament – a Cyber Security and Resilience Bill, which aims to strengthen the UK’s defences against cyber threats and keep critical infrastructure and digital services secure. The background notes to the King’s Speech highlight that in the last 18 months, hospitals, universities, local authorities, democratic institutions and government departments have been targeted in cyberattacks.
Also in the UK, the country’s Data Protection and Digital Information (DPDI) Bill, which was due to update the UK GDPR and the Data Protection Act 2018, had been nearing the final stage of its passage through Parliament. However, the draft legislation didn’t pass during the two-day ‘wash-up’ period, which enables the government to push through the bills it deems essential before Parliament is dissolved in readiness for a general election. The DPDI Bill was intended to simplify obligations, in what was believed to be a move away from the EU’s more onerous requirements on data protection. Notably, it proposed introducing ‘smart data’ provisions that would open up consumer data flows in various sectors.
The future of the DPDI Bill now rests with the new government, which had opposed many of the Bill’s provisions, and its absence from the King’s Speech is significant.
On the horizon for businesses and in-house counsel is the EU’s Data Act, which will apply as of 12 September 2025. It aims to improve access to data in the EU market for individuals and businesses. Significantly, the Act will enable the public sector to access and use data held by private industry to help respond to emergencies. Another key provision involves additional protection for European businesses from unfair contractual terms in data sharing contracts. Orthwein explains that ‘the use and sharing of product related data (which will include personal data as well as non-personal data) will need to be considered and caution should be taken with regard to the design of future products and services’.
The EU’s Digital Operational Resilience Act (DORA), meanwhile, will apply as of 17 January 2025 and is aimed at bolstering the IT security of financial entities. In addition to this, EU Member States must implement Directive (EU) 2022/2555 on measures for a high common level of cybersecurity across the Union (known as ‘NIS2’) by 17 October 2024. NIS2 aims to improve cybersecurity including by increasing the level of harmonisation of security requirements and reporting obligations; encouraging Member States to introduce new areas of interest such as supply chain, vulnerability management, core internet and cyber hygiene into their national cybersecurity strategies; and expanding the types of sectors covered, meaning that more entities will be obliged to take measures to bolster their cybersecurity. These new rules will increase the pressure on data controllers to improve the security of personal data and will require in-house counsel to make the relevant changes within their company’s processes and documentation.
‘The new regulations, such as the EU Data Act, AI Act and the IT security regulations (DORA and NIS2) are very much linked to the compliant use of personal data,’ says Orthwein. ‘Unfortunately, the wording and concepts in these new regulations are not aligned with the privacy requirements in the GDPR. This creates a number of uncertainties, unclear and even opposing requirements, as well as room for misunderstandings. Privacy professionals will continue to be very busy trying to keep track of all the requirements.’
Zeroing in on key concerns
For Frene, the key issues related to data protection faced by in-house counsel currently are cybersecurity, employee data and AI. ‘Employee data privacy is particularly important at present,’ he says. ‘New technologies processing employees’ biometric data and pre-existing technologies like video surveillance are being used as never before to monitor employees’ behaviour. This forces in-house lawyers to deal with data protection law, labour law and sometimes with other laws that regulate these technologies.’
A particular challenge is presented by companies’ collection and use of employee data, for example where technological devices process this data and monitor the behaviour of staff. ‘In-house counsel are often in a unique and uncomfortable position,’ says Frene. ‘On one hand, they need to defend and provide grounds for the company concerning the use of such technologies’, while, on the other hand, many applicable data privacy and labour laws prevent or restrict the use of such technologies, particularly where they affect or invade the privacy or other related rights of employees, he explains.
For Henry, a significant challenge comes from implementing a true privacy by default and by design approach and, generally, developing and implementing new processes related to privacy, which aren’t always seen as a priority for many businesses operating in the business-to-business (B2B) space. ‘Coordinating with multiple stakeholders internally and externally, who do not always have a clear understanding of how important privacy is for the organisation and see it as an impediment to quick innovation/business operations, can also be difficult,’ she adds. ‘Finally, in the B2B space, securing the necessary budget to automate compliance efforts and to retain external counsel can be tricky, as well.’
“In the B2B space, securing the necessary budget to automate compliance efforts and to retain external counsel can be tricky
Elisa Henry
Vice Chair, IBA Technology Law Committee
Sophie Cameron is a freelance journalist and can be contacted at sophiecameron2@googlemail.com