OpenAI slapped with GDPR complaint: How do you correct your work?
Privacy activist group noyb (None of Your Business) has filed a complaint against OpenAI, alleging that the ChatGPT service violates GDPR rules since its information cannot be corrected if found inaccurate.
In the filing [PDF] with the Austrian data protection authority, the group alleges ChatGPT was asked to provide the date of birth of a given data subject. The subject, whose name is redacted in the complaint, is a public figure, so some information about him is online, but his date of birth is not. ChatGPT, therefore, had a go at inferring it but returned the wrong date.
The subject then asked OpenAI to erase the incorrect data, according to noyb's complaint, which alleges the company said it had no way to stop an inaccurate date of birth from being returned short of blocking all information held on the subject.
According to the complaint: "The controller does not seem to have any option to actually correct false information, but can only 'hide' at the final output stage of the processing." The data controller in question here is OpenAI.
This situation is less than ideal where Europe's data protection law, the General Data Protection Regulation (GDPR), is concerned. The law is quite clear on the matter – personal data must be accurate, and individuals have a right to have it corrected if it is not.
Maartje de Graaf, data protection lawyer at noyb, said: "Making up false information is quite problematic in itself. But when it comes to false information about individuals, there can be serious consequences.
"It's clear that companies are currently unable to make chatbots like ChatGPT comply with EU law when processing data about individuals. If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. The technology has to follow the legal requirements, not the other way around."
This is not the first run-in AI models have had with privacy laws. A 2023 paper highlighted the issue, noting that large language models, such as those underpinning ChatGPT, struggle to stay compliant because of how they process and store information. In early 2023, Italy imposed a temporary restriction on the use of ChatGPT over data privacy concerns.
A spokesperson for noyb told The Register that the complaint had been filed in Austria as that was where the complainant lives. However, the group expected it to be forwarded to the Irish authority since OpenAI is located there.
A fine is also a possibility, as is a request that authorities order OpenAI to comply with the complainant's access request. The noyb spokesperson said: "We would expect the sanction to be 'effective, proportionate and dissuasive,' as requested by the GDPR. The assessment of the amount of the fine should also take into account OpenAI's worldwide turnover."
Under GDPR's fine framework, an org may be hit with a penalty of up to €20 million or four percent of its total worldwide turnover of the preceding fiscal year, whichever is higher, depending on the severity of the breach.
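As a back-of-the-envelope illustration of that cap – the function name and turnover figure below are our own, not anything from the complaint – GDPR Article 83(5) sets the ceiling at the greater of €20 million or four percent of worldwide annual turnover:

```python
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Upper bound on a GDPR Article 83(5) fine: the greater of
    EUR 20 million or 4 percent of worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# A hypothetical company with EUR 2 billion in turnover:
# 4% of 2 billion (EUR 80 million) exceeds the EUR 20 million floor.
print(gdpr_max_fine(2_000_000_000))  # 80000000.0
```

For smaller outfits, the €20 million floor dominates: four percent of €100 million is only €4 million, so the cap stays at €20 million.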
The Register contacted OpenAI to get its opinion on the complaint, and the Sam Altman-run super lab has yet to respond. ®