It's Staggeringly Easy for Hackers to Trick ChatGPT Into Leaking Your Most Personal Data
OpenAI's ChatGPT can easily be coaxed into leaking your personal data with nothing more than a single "poisoned" document.

As Wired reports, security researchers revealed at this year's Black Hat hacker conference that highly sensitive information can be stolen from a Google Drive account using an indirect prompt injection attack. Instead of typing malicious prompts directly into the chatbot, hackers hide them inside a document that the AI, which has access to your data, is then asked to process. It's a twist on the prompt injection, one of the most common ways AI chatbots can be exploited.