
Show HN: Agent-cache – Multi-tier LLM/tool/session caching for Valkey and Redis

Why This Matters

Agent-cache is a multi-tier caching layer for AI agents that puts LLM responses, tool results, and session state into a single cache backed by Valkey or Redis. Existing options typically cover only one tier or one framework; agent-cache supports all three tiers behind one connection and ships adapters for multiple frameworks, with streaming support planned. That combination makes it a practical building block for developers shipping scalable AI applications.

Key Takeaways

- Multi-tier exact-match cache for AI agents, backed by Valkey or Redis
- LLM responses, tool results, and session state behind one connection
- Framework adapters for LangChain, LangGraph, and the Vercel AI SDK
- OpenTelemetry and Prometheus instrumentation built in
- No modules required: works on vanilla Valkey 7+ and Redis 6.2+
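Because the cache is exact-match, key derivation is the crux: identical requests must hash to identical keys, and any parameter change must miss. A minimal sketch of how per-tier keys might be derived — the `cacheKey` helper, the `agent-cache:` prefix, and the SHA-256 scheme are assumptions for illustration, not agent-cache's actual internals:

```typescript
import { createHash } from "node:crypto";

// Hypothetical tiers mirroring the three cache types the package describes.
type Tier = "llm" | "tool" | "session";

// Derive a deterministic key from a tier name and a request payload.
// NOTE: a production implementation would canonicalize key order in the
// payload before hashing; JSON.stringify alone is order-sensitive.
function cacheKey(tier: Tier, payload: unknown): string {
  const canonical = JSON.stringify(payload);
  const digest = createHash("sha256").update(canonical).digest("hex");
  return `agent-cache:${tier}:${digest}`;
}

// Identical prompt + params produce the same key; any difference misses.
const a = cacheKey("llm", { model: "gpt-4o", prompt: "hi", temperature: 0 });
const b = cacheKey("llm", { model: "gpt-4o", prompt: "hi", temperature: 0 });
const c = cacheKey("llm", { model: "gpt-4o", prompt: "hi", temperature: 1 });
console.log(a === b, a === c); // true false
```

Namespacing the tier into the key is one way a single Valkey/Redis connection can serve all tiers without collisions.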

Shipped v0.1.0 yesterday, v0.2.0 today with cluster mode. Streaming support coming next.

Existing options locked you into one tier (LangChain = LLM only, LangGraph = state only) or one framework. This solves both.
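To illustrate the "all tiers behind one connection" idea, here is a hypothetical sketch of a unified cache with independent per-tier TTLs, using an in-memory `Map` as a stand-in for a Valkey/Redis client. The `MultiTierCache` class and its TTL values are invented for illustration and are not agent-cache's API:

```typescript
type Tier = "llm" | "tool" | "session";

// One store (think: one Redis/Valkey connection), three isolated tiers,
// each with its own expiry policy.
class MultiTierCache {
  private store = new Map<string, { value: string; expiresAt: number }>();
  constructor(private ttlMs: Record<Tier, number>) {}

  set(tier: Tier, key: string, value: string): void {
    this.store.set(`${tier}:${key}`, {
      value,
      expiresAt: Date.now() + this.ttlMs[tier],
    });
  }

  get(tier: Tier, key: string): string | undefined {
    const entry = this.store.get(`${tier}:${key}`);
    if (!entry || entry.expiresAt < Date.now()) return undefined;
    return entry.value;
  }
}

// Illustrative TTLs: long-lived LLM responses, short-lived tool results,
// day-long session state.
const cache = new MultiTierCache({
  llm: 3_600_000,
  tool: 60_000,
  session: 86_400_000,
});
cache.set("llm", "prompt-hash", '{"completion":"..."}');
console.log(cache.get("llm", "prompt-hash"));  // the cached completion
console.log(cache.get("tool", "prompt-hash")); // undefined: tiers are isolated
```

The point of the sketch is the contrast with single-tier tools: one client and one key scheme cover what would otherwise take a separate LLM cache and a separate state store.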

npm: https://www.npmjs.com/package/@betterdb/agent-cache
Docs: https://docs.betterdb.com/packages/agent-cache.html
Examples: https://valkeyforai.com/cookbooks/betterdb/
GitHub: https://github.com/BetterDB-inc/monitor/tree/master/packages...

Happy to answer questions.