I want everything local — no cloud, no remote code execution.
That’s what a friend said. That one-line requirement, simple as it sounds, needs several pieces working in tandem.
What does a mainstream LLM (Large Language Model) chat app like ChatGPT or Claude provide at a high level?
- Chat with a cloud-hosted LLM,
- Run the code it generates, mostly on their cloud infra, sometimes locally via a shell,
- Access the internet for fresh content and services.
With so many LLMs being open source / open weights, shouldn't it be possible to do all of that locally? A local LLM alone isn't enough, though: we also need a truly isolated environment to run code.
So: an LLM for chat, Docker to containerize code execution, and finally browser access of some sort for content.
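The Docker piece of that trio can be surprisingly small. As a minimal sketch (not the article's actual setup; the image name, resource limits, and helper function are illustrative), the idea is to run LLM-generated code in a container with no network and capped resources:

```python
# Hypothetical sketch: build a locked-down `docker run` invocation for
# LLM-generated code. Image name and limits are assumptions, not from the post.
import shlex


def sandbox_command(script_path: str, image: str = "python:3.12-slim") -> list[str]:
    """Return a docker command that runs the script in an isolated container."""
    return [
        "docker", "run", "--rm",
        "--network", "none",   # no internet access from inside the sandbox
        "--memory", "256m",    # cap memory
        "--cpus", "1",         # cap CPU
        "-v", f"{script_path}:/app/main.py:ro",  # mount the code read-only
        image, "python", "/app/main.py",
    ]


print(shlex.join(sandbox_command("/tmp/generated.py")))
```

You would hand the resulting command to `subprocess.run` (or just paste it into a terminal); `--network none` is the key flag for the "no remote code execution can phone home" guarantee.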
🧠 The Idea
We wanted a system where: