OpenAI now offers better options for keeping your data to yourself. Your conversations are still stored for a time, but you can now opt out of having them used for training, and they are deleted after 30 days.

One of the many things ChatGPT has been criticized for is its handling of users' data. This became especially evident back in March, when a major data leak made it possible, among other things, to see the titles of other users' conversation histories and thereby what had been discussed.

There are also concerns about personal data, which has led the Italian Data Protection Authority to investigate whether OpenAI complies with the GDPR, an investigation that could lead to fines in the millions and, not least, a demand to shut down access to ChatGPT in Italy entirely.

What does OpenAI use data for?

It is a common misconception that ChatGPT learns from the conversations users have with the system. It does not, because the language model that drives the chat system is already fully trained. That is only partly true, however: OpenAI does use conversation data to train the chatbot itself, so that conversations feel better and the safeguards around what ChatGPT will help with improve. In other words, you cannot teach ChatGPT new knowledge, but your conversations are used to improve the system in other ways.

When OpenAI uses our data for this kind of training, it is not only machine learning algorithms that access the data. Data engineers at OpenAI do too, which means other people may see and read your conversations with the system. When you hear stories about engineers at Tesla, with access to the video data the cars collect, sharing funny or unusual videos for entertainment, you may worry that the same could happen to data shared with ChatGPT. Samsung offers another example of the problems that arise when there is no control over data: employees used ChatGPT to process confidential trade secrets, which were thereby leaked from the company.

New ways to keep track of your data

OpenAI now offers new ways to keep better track of your data. First, it is now possible to turn off conversation history. This means that OpenAI will not use your data for training, and you will no longer see the history in the interface. However, OpenAI states that it continues to store conversations for 30 days in order to review them if abuse is suspected.

In addition, ChatGPT now has an export function, which makes it easier to get your data out of the system and thereby better understand what the system collects.

Finally, OpenAI is working on a subscription offering for professional users that will provide better data control for businesses.

In education, we must continue to be aware that ChatGPT requires a login and, thus, an email address. This is considered personal data, which means that you, as an educational institution, must have a data processing agreement in place before you can use the system. Read also the article about the ethical use of artificial intelligence in education.

Sources

New ways to manage your data in ChatGPT
ChatGPT users can now turn off chat history, allowing you to choose which conversations can be used to train our models.
ChatGPT banned in Italy over privacy concerns
The country’s data-protection regulator has serious privacy concerns over the technology.
OpenAI says a bug leaked sensitive ChatGPT user data | Engadget
OpenAI announced Friday that the chat history bug from earlier in the week might have also leaked user and payment data…
Special Report: Tesla workers shared sensitive images recorded by customer cars
Between 2019 and 2022, groups of Tesla employees privately shared via an internal messaging system sometimes highly invasive videos and images recorded by customers’ car cameras.
Oops: Samsung Employees Leaked Confidential Data to ChatGPT
Employees submitted source code and internal meetings to ChatGPT just weeks after the company lifted a ban on using the chatbot.