2 min read
Saved February 14, 2026
Do you care about this?
The article advises users of Free, Plus, and Pro ChatGPT accounts to disable the "Improve the model for everyone" setting, which is on by default. With this setting enabled, your conversations may be used to train the model, a privacy concern that grows when external tools are connected. It recommends that businesses use Business or Enterprise accounts instead to prevent unintentional data sharing.
If you do, here's more
If you have a Free, Plus, or Pro ChatGPT account, check your settings immediately. Navigate to Settings > Data Controls and make sure "Improve the model for everyone" is turned off. This setting is enabled by default, which means your conversations may be used to train the model for all users. Many people are unaware the toggle exists, so they unknowingly contribute to the model's training every time they interact with it. The concern grows if you use Connectors, which link external tools and data sources: if the setting is on, information pulled in through them can also be used for training.
For organizations, the article recommends Business or Enterprise accounts instead of personal ones. These accounts have the data-sharing option disabled by default, which prevents accidental sharing. Relying on personal accounts also creates other risks, such as unclear data ownership and loss of access when an employee leaves the company. The article stresses that everyone on your team, and in your personal life, should check this setting.
The article also notes that new settings may appear silently when you create a custom GPT with a knowledge file, opting you in to training on chats with that GPT, which raises concerns about transparency in the product's design. The author is uncertain whether data is genuinely safe when Connectors are in use, pointing to a broader problem of control over personal data in AI applications. It is a gap in awareness that leaders should close before it turns into a data leak.