The Catalan Data Protection Authority has published Recommendation 1/2023 to warn the Generalitat, municipalities, schools, universities and the other entities within its scope of action about the doubts that exist in Europe regarding the compliance of the ChatGPT artificial intelligence tool with the GDPR.
The Catalan Data Protection Authority (APDCAT) has published Recommendation 1/2023 to warn entities and bodies within its scope of action about the risks that using the ChatGPT artificial intelligence tool poses to compliance with data protection requirements. The document is addressed to the Generalitat, town councils, schools, public and private universities in Catalonia and professional associations, among others, and recommends that the ChatGPT tool not be incorporated into the exercise of public functions and the provision of public services where personal data is processed, until the European Data Protection Board (EDPB) rules on the matter.
The recommendation reflects the manifest and growing concern in recent weeks about the impact that the use of artificial intelligence, and more specifically the ChatGPT service, can have on the rights and freedoms of the natural persons affected.
Among other things, it refers to the EDPB's announcement that it will create a task force to cooperate and exchange information on the actions data protection authorities can take in this matter, in accordance with the consistency principle laid down in the General Data Protection Regulation. This follows the blocking of the tool by the Garante per la Protezione dei Dati Personali, the Italian data protection supervisory authority.
The APDCAT warns that, today, the use of ChatGPT generates significant doubts about the personal information it processes, both that of the people who use it directly and that of third parties whose data it may be using. This may include the processing of sensitive information, or information relating to vulnerable groups such as minors, without control and without an adequate legal basis.
The APDCAT also warns of doubts arising from the lack of control that affected persons have over this processing, as well as the practical impossibility of exercising their rights of informational self-determination (access, rectification, erasure, etc.). It further notes a clear lack of information about security conditions, data retention periods and the communication of data to third parties, among other issues.
Privacy-friendly AI
In the recommendation, the APDCAT recognizes artificial intelligence as a tool to support decision-making in any field, but at the same time demands that its development respect the regulatory framework on data protection and not harm the rights and freedoms of citizens.
In this regard, it recalls that technology currently makes it possible to draw profiles and patterns from personal data, which can be used to directly influence people, and warns that, in the face of the massive and intensive use of data, it is necessary to find the right tools to protect those rights and freedoms.