
Ask HN: What's Your Useful Local LLM Stack?


What I’m asking HN:

What does your actually useful local LLM stack look like?

I’m looking for something that provides you with real value — not just a sexy demo.

---

After a recent internet outage, I realized I need a local LLM setup as a backup — not just for experimentation and fun.

My daily (remote) LLM stack:

- Claude Max ($100/mo): My go-to for pair programming. Heavy user of both the Claude web and desktop clients.
- Windsurf Pro ($15/mo): Love the multi-line autocomplete and how it uses clipboard/context awareness.
- ChatGPT Plus ($20/mo): My rubber duck, editor, and ideation partner. I use it for everything except code.

Here’s what I’ve cobbled together for my local stack so far:

Tools

- Ollama: for running models locally
- Aider: Claude Code-style CLI interface
- VSCode w/ continue.dev extension: local chat & autocomplete
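
If you want to sanity-check that the local half of the stack actually works before wiring in Aider or continue.dev, here's a minimal sketch of talking to Ollama's HTTP API directly. It assumes Ollama's default port (11434) and a model name like llama3 — swap in whatever you've actually pulled:

```python
# Minimal sketch: one chat turn against a locally running Ollama server.
# Assumes the model has already been pulled, e.g. `ollama pull llama3`.
import requests

def ask_local(prompt: str, model: str = "llama3") -> str:
    """Send a single chat message to the local Ollama API and return the reply text."""
    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # get one complete JSON response instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

if __name__ == "__main__":
    print(ask_local("Summarize the tradeoffs of running LLMs locally."))
```
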
