AI implementation & data protection regulation: German authorities publish guidelines for implementing AI in compliance with the GDPR
The German federal and state data protection authorities have published guidelines for the implementation and use of AI in compliance with the European Union's General Data Protection Regulation ("GDPR") (the "Guidelines").1
The Guidelines are primarily addressed to deployers of AI applications in the private and public sectors alike. They identify several inherent risks associated with the use of AI, including the unlawful processing of personal data and discrimination through biased data, and offer practical guidance on how to mitigate and prevent these risks by introducing a set of preemptive measures and practices, such as documentation, impact assessments and AI-specific employee training.
AI’s rapid rise in the face of data protection
With AI systems increasingly becoming part of everyday life and being implemented in a wide range of work environments, their relationship to the legal protection of personal data has come to the attention of the authorities, while many practical compliance questions have so far remained largely unresolved.
The Guidelines, published by the joint conference of the German data protection authorities, aim to ease some of this uncertainty by providing practical notes on the use of AI applications.
Guidance on pre-implementation considerations
The Guidelines feature practical notes on what to consider from the standpoint of data protection compliance before using AI applications. These include:
- Identifying the purpose and use cases of the AI application;
- Assessing the legal basis for the processing of any personal data in the context of the use of the AI application;
- Ensuring compliance with transparency requirements in order to collect and provide information to data subjects and/or authorities on automated decision-making, including profiling;
- Ensuring the system is sufficiently flexible to allow data protection-compliant settings, e.g., disabling prompt history or fine-tuning data sets so that data subjects' requests for information, rectification or erasure under the GDPR can be properly fulfilled.
Additionally, involving data protection officers as well as employee representatives before implementation is highly recommended.
Compliance requirements for the implementation process
The Guidelines make several suggestions for ensuring compliance with data protection law when using AI applications, particularly:
- Defining clear responsibilities (e.g., controllership and/or joint controllership), especially if a cloud-based AI application is used;
- Implementing policies for the use of AI;
- Conducting a data protection impact assessment (Art. 35 GDPR);
- Training employees;
- Implementing sufficient technical and organizational measures (data protection by design, Art. 25 GDPR), e.g., through data security tools, company accounts or a data-protective design.
It is further advisable to closely monitor current developments in AI technology and legal guidance, ideally as part of a company's regular data protection routine.
Using the AI model with caution
Finally, the Guidelines set out a few general practices to be followed when using AI applications in accordance with data protection law. Primarily, they advise taking a cautious approach to prompting and to using AI output whenever personal data is involved. Any output of the AI application should be continuously monitored and critically examined to ensure data integrity and to prevent biases in the data that might otherwise lead to unlawful discrimination.
Outlook
The Guidelines are yet another step towards drawing clearer lines on implementing and using AI. They show that the authorities are aware of the practical obstacles to compliance.
Following the discussions as to which German authority will be designated as the national supervisory authority required under the EU AI Act,2 the German data protection authorities have issued a position paper3 stating that, given their tasks and expertise, they are prepared to take on this role, thus positioning themselves to become Germany's AI regulators.
1 Guidelines of the German Data Protection Conference of May 6, 2024, https://content.mlex.com/Attachments/2024-05-06_K3DN3E452H0TH6YG%2f20240506_DSK_Orientierungshilfe_KI_und_Datenschutz_web.pdf.
2 See https://www.whitecase.com/insight-our-thinking/ai-watch-global-regulatory-tracker-germany.
3 Decision of the German Data Protection Conference of May 3, 2024, https://www.datenschutzkonferenz-online.de/media/dskb/20240503_DSK_Positionspapier_Zustaendigkeiten_KI_VO.pdf.
White & Case means the international legal practice comprising White & Case LLP, a New York State registered limited liability partnership, White & Case LLP, a limited liability partnership incorporated under English law and all other affiliated partnerships, companies and entities.
This article is prepared for the general information of interested persons. It is not, and does not attempt to be, comprehensive in nature. Due to the general nature of its content, it should not be regarded as legal advice.
© 2024 White & Case LLP