Skip to content
Tech News
← Back to articles

Document poisoning in RAG systems: How attackers corrupt AI's sources

read original get AI Source Integrity Kit → more articles

I injected three fabricated documents into a ChromaDB knowledge base. Here's what the LLM said next.

Document Poisoning in RAG Systems: How Attackers Corrupt Your AI's Sources

I injected three fabricated documents into a ChromaDB knowledge base. Here’s what the LLM said next.

In under three minutes, on a MacBook Pro, with no GPU, no cloud, and no jailbreak, I had a RAG system confidently reporting that a company’s Q4 2025 revenue was $8.3M, down 47% year-over-year, with a workforce reduction plan and preliminary acquisition discussions underway.

The actual Q4 2025 revenue in the knowledge base: $24.7M with a $6.5M profit.

I didn’t touch the user query. I didn’t exploit a software vulnerability. I added three documents to the knowledge base and asked a question.

Lab code: github.com/aminrj-labs/mcp-attack-labs/labs/04-rag-security

git clone && make attack1 — 10 minutes, no cloud, no GPU required

This is knowledge base poisoning, and it’s the most underestimated attack on production RAG systems today.

The Setup: 100% Local, No Cloud Required

... continue reading