
Stop letting ChatGPT and other AI chatbots train on your data. Here’s why—and how

Why This Matters

This article highlights the privacy risks associated with AI chatbots, emphasizing that user data can be used for training and potentially expose sensitive information. It underscores the importance for consumers and the tech industry to adopt privacy-preserving practices to safeguard personal and confidential data. Implementing user controls can help maintain trust and security in AI interactions.

Key Takeaways

Whatever you feed an AI chatbot can become public. Keep that in mind, and take these steps to protect yourself. When you interact with a chatbot, there's a good chance that everything you type, every prompt and every message, isn't used only to generate replies to your queries. Nearly every chatbot company on the planet also uses the information you provide to train its AI models. This can leave your privacy, and even your employer's confidential information, exposed. But you can mitigate these privacy risks by telling chatbots not to use your data for training. Here's how.