EU's Top Data Protection Board Issues Guidance on AI Development and Deployment

Starfolk

December 18, 2024 · 5 min read

The European Data Protection Board (EDPB) has published an opinion on how AI developers can use personal data to develop and deploy AI models, such as large language models (LLMs), without falling foul of the EU's privacy laws. The guidance is significant, as it provides clarity on the application of the General Data Protection Regulation (GDPR) to AI development and deployment.

The EDPB's opinion covers several key areas, including whether AI models can be considered anonymous, whether a "legitimate interests" legal basis can be used for lawfully processing personal data, and whether AI models developed with unlawfully processed data can be deployed lawfully. The Board's views are important, as they support regulatory enforcement and provide guidance to developers on how to ensure compliance with EU privacy laws.

The question of which legal basis is appropriate for AI models under the GDPR remains contentious and unresolved. It has already generated controversy: OpenAI's ChatGPT faces complaints in Italy, Poland, and Austria over its lawful basis for processing people's data. Failure to comply with the privacy rules can result in penalties of up to 4% of global annual turnover and/or orders to change how AI tools work.

The EDPB's opinion is intended to help oversight bodies with their decision-making. Responding to the opinion, Ireland's Data Protection Commission (DPC) suggested that it will "enable proactive, effective and consistent regulation" of AI models across the region. The DPC is set to lead on GDPR oversight of OpenAI following a legal switch late last year.

The opinion offers developers some indication of how privacy regulators may come down on crux issues such as lawfulness. However, the main message is that there won't be a one-size-fits-all solution to the legal uncertainty they face. For instance, on the question of model anonymity, the Board stresses that this must be assessed on a case-by-case basis. The document provides a non-prescriptive and non-exhaustive list of methods by which model developers might demonstrate anonymity, such as careful source selection for training data, data minimization, and filtering steps during the data preparation phase.

The opinion also looks at whether a legitimate interest legal basis can be used for AI development and deployment. This is important because there are only a handful of available legal bases in the GDPR, and most are inappropriate for AI. The Board's view is that DPAs will have to undertake assessments to determine whether legitimate interest is an appropriate legal basis for processing personal data for the development and deployment of AI models.

The EDPB's opinion leaves the door open to it being possible for AI models to meet all the criteria for relying on a legitimate interest legal basis. However, assessments must look at whether the processing actually achieves the lawful purpose and whether there is no less intrusive way to achieve the aim. The opinion also discusses measures for mitigating risks associated with web scraping, which the Board says raises "specific risks".

The opinion also weighs in on the sticky issue of how regulators should approach AI models that were trained on data that was not processed lawfully. The Board recommends regulators take into account "the circumstances of each individual case". However, the opinion appears to offer a sort of get-out clause for AI models that may have been built on shaky legal foundations, provided developers take steps to ensure that any personal data is anonymized before the model enters the deployment phase.

Independent consultant Lukasz Olejnik, whose GDPR complaint against ChatGPT remains under consideration by Poland's DPA, warned that "care must be taken not to allow systematic misuse schemes". He noted that the EDPB's opinion may unintentionally legitimize the scraping of web data without proper legal bases, potentially undermining GDPR's core principle that personal data must be lawfully processed at every stage.

The EDPB's opinion is significant, as it provides much-needed clarity on how AI developers can use personal data without violating EU privacy laws. While it gives developers some direction, it also highlights the complexity of the issue and the need for case-by-case assessments. As the use of AI models continues to grow, the opinion will play an important role in shaping the regulatory landscape and ensuring that AI development and deployment comply with EU privacy laws.

Copyright © 2024 Starfolk. All rights reserved.