Italian Data Protection Authority Issues Warning Against AI-Based Services
The Italian Data Protection Authority (Garante per la protezione dei dati personali) has adopted a warning measure directed at users of artificial intelligence-based services, such as Grok, ChatGPT, and Clothoff, which allow users to generate and share content starting from real images or voices. The move follows an ex officio investigation that revealed these services often make it extremely easy to unlawfully use the images and voices of third parties without any legal basis.
Violations of Rights and Fundamental Freedoms
The Authority notes that the use of such tools, and the dissemination of the content they generate, can lead to serious violations of the rights and fundamental freedoms of the persons involved. This occurs, in particular, when the data subjects' consent has not been obtained and the content is disseminated via social media, other digital services, or messaging applications. The Garante recalls that such conduct may constitute criminal offences and attract the sanctions provided for by European data protection legislation.
Design and Development of AI-Based Services
The Garante also reminded the providers of these services of the need to design, develop, and make available applications and platforms in a way that ensures users can operate them in compliance with privacy regulations. In practice, services should be designed to prevent the unlawful use of third parties' images and voices, and to make users aware of the potential consequences of their actions.
The provision is being published in the Official Journal, and the Garante has already taken action against some of these services, including Clothoff, which was the recipient of a blocking measure last October. The warning serves as a reminder to both users and providers of AI-based services to respect the rights and fundamental freedoms of individuals and to comply with European legislation on the protection of personal data.

