ChatGPT and sensitive data: what your company can no longer ignore.

With over 13 million monthly visitors in France, ChatGPT has firmly established itself in daily usage. Used as a virtual assistant, search engine, or writing tool, it is now an integral part of work environments—often without any clear supervision.
But this accessibility comes at a cost: data confidentiality. Behind the smooth interface lies a machine-learning system that can use the queries you enter to refine its responses. In other words, what you type in may be recorded, stored, and reused for training.
A critical question: what does ChatGPT do with your data?
If you share sensitive information in a query—client names, internal procedures, confidential files—you risk seeing that data resurface in responses generated for other users. OpenAI does state that it does not sell this data to third parties and uses it mainly to improve the model's performance. But the line between "improvement" and "leak" is a thin one.
The textual data you enter—as well as geolocation, account info, browsing history, and cookies—is stored on secure servers… located in the United States. In the age of GDPR and digital sovereignty, this raises clear legal questions for European businesses.
The illusion of confidentiality: when AI talks too much.
In March 2023, a security flaw exposed the titles of users' private conversations and payment information linked to ChatGPT Plus subscriptions. The incident lasted only about nine hours, but that was enough to expose data from roughly 1.2% of Plus subscribers. No full credit card numbers, true, but names, email addresses, billing addresses, and the last four digits of card numbers were revealed: highly sensitive, highly confidential data.
And this isn’t an isolated case. A recent study reveals that ChatGPT, when prompted with certain trick queries, can reveal data from its initial training. Among the exposed items: names, phone numbers, emails—and sometimes even the contact details of business executives.
For instance, in 2023, Samsung engineers pasted confidential source code into ChatGPT to identify bugs. The result: that code left the company's perimeter and landed on external servers, where it could potentially feed the model's training data. The episode triggered an internal crisis and led the Korean giant to ban the tool outright.
In the same vein, a June 2025 IFOP-Talan survey shows that 68% of employees using ChatGPT at work do not inform their management. This autonomy may seem efficient—until it exposes legal documents, confidential agreements, or business data to an AI connected to external servers.
Even anonymized, this data can feed into other responses generated by the tool. Imagine: what your team types today could, tomorrow, help a competitor craft their strategy. Your method, your organization, your expansion plans—these are all competitive advantages you might unintentionally give away.
An analysis of a 100,000-user sample revealed over 400 sensitive data leaks in just one week: legal documents, internal memos, source code… Once exposed, this content cannot be erased from the network. The consequences go far beyond embarrassment: GDPR fines, potential lawsuits, loss of trust, damaged reputation.
How to protect your data without giving up AI?
At ATI4, we made a clear choice: to strictly regulate the use of generative AI internally, to protect our data and that of our clients.
This involves training, security charters, but also technical restrictions: allowed data types, access control, isolated environments. We choose to never send sensitive data to a non-sovereign model.
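To make the idea of technical restrictions concrete, here is a minimal sketch of an outbound-prompt filter that redacts common sensitive patterns (emails, phone numbers) before a query is allowed to leave the company network. The pattern names, the `redact` and `guarded_send` functions, and the regexes themselves are illustrative assumptions, not ATI4's actual tooling; a real deployment would rely on a proper DLP or named-entity-recognition solution rather than regexes alone.

```python
import re

# Illustrative patterns only; real sensitive-data detection needs
# far more robust tooling (DLP, NER, allow-lists of data types).
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d .-]{8,}\d"),
}

def redact(prompt: str) -> str:
    """Replace detected sensitive substrings with placeholder tags
    before the prompt leaves the internal environment."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

def guarded_send(prompt: str, send_fn):
    """Forward only the redacted prompt to the external model."""
    return send_fn(redact(prompt))
```

The design choice matters more than the regexes: redaction happens at a single choke point (`guarded_send`), so no raw prompt can reach an external model without passing through the filter.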
We are also exploring alternatives that better respect European frameworks. Mistral AI, a French startup backed by roughly €400 million in funding, has released open-source models that companies can host on their own private infrastructure: generative AI designed to be compatible with GDPR requirements.
Don’t fall into the trap of thinking a tool is “without consequences.” Every message you write is data transmitted, stored, and analyzed. In sensitive sectors—finance, energy, defense, healthcare—even a poorly phrased query can become a cybersecurity vulnerability. Don’t leave your strategy in the hands of an opaque algorithm. Confidentiality is not optional—it’s a responsibility.