Out-Law News 3 min. read
29 Aug 2024, 3:52 pm
Developers of generative AI (gen-AI) models and the businesses that use them have been advised to consider and clarify their role in determining how personal data is processed by those tools.
The recommendation made by the UK’s data protection authority, the Information Commissioner’s Office (ICO), is particularly relevant to businesses buying “off the shelf” AI models for use in developing their own systems and services, according to data protection law expert Kathryn Wynn of Pinsent Masons. She said those businesses must ensure that they understand when and how they, rather than the developer, will be responsible as controller in relation to personal data within the model, in prompts used to interact with the model, or in outputs from it.
The ICO made its recommendation in the context of a new consultation exercise pertaining to the allocation of responsibilities under data protection law within the gen-AI supply chain. Its call for evidence centres on the concept of ‘controllership’ in UK data protection law.
‘Controllers’ of personal data are organisations that exercise overall control over the purposes and means of the processing of personal data, while ‘processors’ of that data are organisations that merely process the data on behalf, and under the instruction, of controllers. Determining whether they are ‘controllers’ or ‘processors’ in respect of personal data is important for organisations, because each role faces different duties under data protection law, with the bulk of obligations falling on controllers.
The ICO said, however, that “allocation of accountability is complicated” in the context of generative AI “because of the different ways in which generative AI models, applications and services are developed, used and disseminated, but also the different levels of control and accountability that participating organisations may have”.
While it is seeking evidence from stakeholders on matters pertaining to controllership in the gen-AI supply chain, the ICO also set out its current thinking on the issues. In doing so, it reflected on the different ways in which gen-AI tools currently come into use in the market.
The ICO said distribution occurs across an ‘open-access’ to ‘closed-access’ “spectrum”. At the ‘open’ extreme, the gen-AI models are made public and deployers can freely shape the way they operate; at the other extreme, the models are private and it is the developers that set the parameters within which they can be used by third parties.
Where organisations have scope to adopt and modify gen-AI models “at the most ‘open’ end of the spectrum using their own computing resources”, they will “likely be defining the purposes” of data processing and, in turn, “may be seen as distinct controllers, separate to the initial controller who developed the system”, the ICO said.
The ICO said, however, that it is common for those wishing to deploy gen-AI models for their own purposes to face constraints in shaping the way those tools process data. This might be because the means of processing during deployment is, at least in part, pre-determined by decisions taken by the developer when developing the tool. The ICO said the developer and deployer could be ‘joint controllers’ in such a scenario and it advised organisations in this position to clarify their respective roles for the purposes of enabling “clear accountability” for data protection law compliance.
“In practice, the relationship between developers and third-party deployers in the context of generative AI will mean there are often shared objectives and influence from both parties for the processing, which means it is likely to be a joint controllership instead of a processor-controller arrangement,” the ICO said.
“Developers may be joint controllers for some aspects of deployment and processors for others. Determining with clarity the different processing activities will help all entities demarcate which processing they are controllers, joint controllers or processors for and justify why,” it said.
“Different processing activities should not be lumped together when they serve different objectives or have distinct data protection risks. For example, search engines built on top of a variety of algorithmic systems or lately LLMs [large language models] can have different capabilities, functions and risks than ‘traditional’ search engines mainly using ranking systems. Distinct DPIAs [data protection impact assessments] may help demarcate the boundaries between them,” the ICO added.
Wynn said: “Although developers may seek to sell access to AI models on non-negotiable standard terms and conditions under which the customer is assumed to be the controller and the developer the processor, a ‘one size fits all’ approach is not likely to produce accurate results, fuelling the ICO’s concerns about gaps in accountability.”
“Customers looking to buy in and deploy AI models as controllers should ensure that the developer provides sufficient information to allow them to meet data protection obligations. Where, on proper analysis, the developer and customer are joint controllers, they must ensure that responsibilities are fully identified and allocated to the party best placed to manage compliance. In those ‘joint controllership’ cases, the developer and customer will also be obliged to communicate the essence of their arrangement to individuals whose personal data they are processing,” Wynn added.