Do you agree with the analysis presented in the call for evidence?
Short answer
Yes, we do. This call for evidence is clearer, and addresses the legal and conceptual issues more accurately, than the others in this series. There remain some areas of inconsistency and a number of gaps, which our response below addresses. We argue that the law should be modified to create a new designation of controller specifically for the Gen AI ecosystem.
More detailed explanation and analysis
We agree with the ICO’s analysis in many respects. It is accurate, sensible and methodical. For the most part we do not challenge the ICO’s analysis.
There are some imperfections that should be addressed.
Joint Controllership
The ICO’s call for evidence accurately states that not all organisations frame their relationships correctly, identifying themselves as controllers or processors when in fact they are closer to joint controllers. This problem is particularly acute in the Gen AI ecosystem.
Our primary concern regarding the joint controller designation in the Gen AI ecosystem relates to the practical implementation of joint controllership and the carving up of the associated responsibilities between developers and deployers on the one hand, and the myriad other entities in the chain on the other. The ICO’s call for evidence states that the parties “…must clearly set out their respective roles and responsibilities for each processing activity by means of an arrangement”, but this process has the potential, in some circumstances, to be problematic or even unworkable in practice. In particular, whether a model is closed-source or open-source, we think deployers are likely to have a limited understanding of the technology underlying the models they adopt, as well as little control over how it functions, making it difficult for them to accurately recognise their responsibilities and articulate these in a joint controller arrangement. When coupled with the fact that the balance of power will, in many cases, favour the developer, and that the responsibilities of the deployer are likely to have been pre-determined and not capable of meaningful negotiation, a designation of independent controllers may, in some circumstances, be more appropriate.
In addition to the above, we believe that there is a potential conflict between the ICO’s analysis and the European Data Protection Board (“EDPB”) Guidelines 07/2020 on the concepts of controller and processor in the GDPR. The EDPB’s guidance stipulates at paragraph 34 that, in order to be a controller, the relevant party “…must decide on both purpose and means of the processing…”. The guidance states at paragraph 31 that the dictionary definition of the term “means” has been adopted for the purposes of the Board’s analysis, namely “how a result is obtained or an end is achieved”. If this is correct then, in the majority of circumstances, a deployer of AI is unlikely to have a significant degree of influence over the means of processing and cannot, in most instances, be considered a controller. Where this is the case, it is possible that neither joint controllership nor independent controllership will be appropriate. A new designation may be more appropriate, on which see below.
The designation of joint controller also gives rise to potential joint liability and accountability issues which could be undesirable in the Gen AI ecosystem, especially when combined with significant power imbalances (see below). Where there is a material imbalance in decision-making ability (i.e. company A makes more decisions than company B), we consider that independent controllership may be a better fit.
A Modified or New Designation Specifically For Gen AI
In several consultation responses to the ICO we have made the point that the law which the ICO is seeking to apply was not designed with Gen AI in mind. This cannot be stressed enough, and its effect is significant. We believe it is right to question whether the designations of controller, processor and joint controller work in the context of Gen AI and, if they do not, the ICO and the Law Commission should consider a revision to existing data protection laws in the United Kingdom to address the shortcomings and gaps.
The consultation group considered this issue in detail. Our analysis can be summarised here:
- Part 1: Unworkable Duties: the starting point is to acknowledge that the GDPR controller and joint controller obligations are extensive and are likely to be unworkable in complex Gen AI ecosystems. Examples include the duty on each controller to inform data subjects (Articles 13 and 14), the right of erasure (Article 17) and the right to restrict processing (Article 18), all of which involve significant complexity in Gen AI models. This is by no means easier where the entities are joint controllers. We estimate that there may be in excess of 10 to 20 different entities between the development and deployment of the more popular Gen AI models, although some contributors believe this may be a significant underestimation. If, as the ICO’s call for evidence suggests, those entities are joint controllers, there is little likelihood that they will be able to honour the requirements described above vis-à-vis data subjects; nor are they likely to be able to comply with the duties in Article 26 regarding joint controllers, owing to the imbalance of power between the parties that make up the Gen AI ecosystem.
- Part 2: Modifications to Designation Types: the next part of the analysis is to recognise that the controller duties could still be workable with modification to make them fit for purpose in the Gen AI ecosystem. Modifications could arise from a change in the law or from a flexible, context-based approach by the ICO when exercising its regulatory duties. That flexibility has been seen in previous ICO investigations; it is also noticeably absent in many CJEU decisions on this topic.
- Part 3: Articulation of Modifications: the next part of the analysis, if parts 1 and 2 are agreed upon, is to properly identify and articulate the extent of the modifications to the controller/joint controller duties. This could include diluting or entirely removing certain data subject rights identified above for Gen AI providers which play a small part in the ecosystem. Alternatively, it could arise from the ICO’s guidance clarifying that it will take a pragmatic approach to understanding how those providers operate and how they manage their compliance duties.
- Part 4: A New Designation: if part 3 is not achievable, then the case must be made for creation of a new designation for Gen AI providers which offers adequate protection to data subjects whilst removing outdated, unworkable and ineffective regulatory burdens for controllers and joint controllers in this emerging AI technological environment. We believe this could be a “minority controller” (or alternative name) which operates as a half-way house between controller and processor, where:
(a) the minority controller is not required to meet all of the Chapter III GDPR duties, but must at the same time be able to comply with the essence of the principles in Article 5. The detail of this may require further discussion and debate;
(b) the minority controller remains responsible for its actions and omissions for personal data within its control, but its duty to identify a purpose is modified such that it forms part of the overall Gen AI system to which it is contributing;
(c) the minority controller may have a new lawful basis of processing, being “development of an artificial intelligence system for training or pre-deployment operations as part of a recognised technology infrastructure”;
(d) in exchange for the dilution of duties described above, the minority controller should have enhanced reporting obligations to the ICO. The aim here is to elicit as much information as possible from that entity and to share that in the format of a public register, giving greater visibility to the functions of each entity within the supply chain;
(e) the minority controller has a defence to standard sanctions (per Chapter VIII GDPR) if it has relied on the compliance steps of a Gen AI provider or deployer; and
(f) the designation of minority controller cannot apply to a Gen AI provider. It should also not apply to a deployer where that deployer is providing services that have a significant legal effect on data subjects such as health services.
Security Measures and Risk Mitigation
The ICO’s call for evidence makes the point that “deployers are unlikely to have sufficient influence or control if they are unable to change or understand the decisions behind the processing” (emphasis added). This potentially provides entities that may be controllers with an excuse to avoid compliance. If an entity does not understand a model, that does not mean it cannot later develop such an understanding. If an entity decides to make use of any AI, it should be held accountable for its subsequent deployment and use, including by developing sufficient knowledge, skills and understanding of the model itself.
Power Imbalance With Small Or Micro Entities
We are concerned about significant power imbalances between the very large (predominantly US) technology companies in the Gen AI field and the small or micro entities with which they interact. Our consultation group posed the question: are smaller entities truly equipped, and do they have sufficient bargaining power, to understand and negotiate contracts that ensure compliance, and to classify themselves correctly as controllers, joint controllers or processors? We appreciate that factors such as these were not persuasive in Court of Justice of the European Union (“CJEU”) decisions such as Facebook & Fashion ID[1] but, in our opinion, the use of generative AI creates risks that distinguish the analysis in that case (a case in which the court arguably reached a wrong and unworkable decision). Ultimately it is in nobody’s interests, including the ICO’s, for small or micro entities to be unnecessarily exposed to risk under the UK GDPR in their dealings with US tech giants.
Unpicking or Unravelling Historic and Existing Gen AI
Given that Gen AI is no longer new, how does the ICO propose that developers, deployers and users of existing Gen AI systems now ensure compliance in respect of past processing, in light of guidance that was not in existence at the time of the processing activities? This is especially pressing given that older Gen AI models may have been used to train newer models.
Open Loop and Closed Loop Systems
Finally, we are of the opinion that any ICO guidance on this topic should distinguish between open and closed loop systems, the former being those that utilise inputs for further training and the latter being those that do not. Such a distinction fundamentally changes the roles of developers and deployers as, in an open loop system, it is unlikely that a developer will ever be considered a processor. Support for this can be found in the CJEU decisions in the Fashion ID and Jehovan Todistajat cases, which concluded that gaining an economic advantage and furthering an organisation’s aims, respectively, could give rise to joint controllership.
We also suggest that open-source models should be subject to greater use of responsible AI licences (“RAILs”) or usage monitoring, because organisations acting as controllers must be able to protect personal data. Open-source Gen AI released without usage parameters would increase privacy risks for actors (users or external entities who interact with the systems) and the likelihood of regulatory violations.