
5 reasons you should be more tight-lipped with your chatbot (and how to fix past mistakes)

Why This Matters

As chatbots become more integrated into daily life, understanding the risks of sharing personal information is crucial for both consumers and the tech industry. Over-sharing can lead to privacy breaches, data leaks, and other unintended consequences, underscoring the need for better user awareness and stronger safeguards. Protecting personal data in AI interactions is essential to maintaining trust in an increasingly digital world.


How personal do you get with your chatbot?

Does it interpret your lab results? Help you sort out your finances? Offer advice at 2 a.m. when your worries are particularly existential?

Without thinking about it too deeply, you might be revealing a whole trove of personal information about yourself -- and that could be a problem.

At a time when people are increasingly integrating chatbots into their everyday lives, researchers are trying to work out the implications of feeding AI personal information.

Also: 43% of workers say they've shared sensitive info with AI - including financial and client data

By now, you've likely heard stories of people forming romantic relationships with chatbots or using them as life coaches and therapists. In fact, just over half of US adults use large language models, according to a 2025 study from Elon University. What's more, chatbots are designed to be friendly and to keep people chatting -- and talking about themselves.

"The ultimate problem is that you just can't control where the information goes, and it could leak out in ways that you just don't anticipate," said Jennifer King, privacy and data policy fellow at Stanford Institute for Human-Centered Artificial Intelligence.

As abstract as that risk may sound, researchers like King say it's worth considering exactly what you're telling chatbots, and what repercussions that information might have in the future.
