In many conversations at BSD conferences I noticed that people would love to see a chatbot that provides precise information on FreeBSD, for users, admins, and developers alike.
I strongly believe there should not be an official chat.freebsd.org. Local chatbots work well and can be tweaked to fit personal needs.
This documentation is written for macOS with Apple Silicon (because of the GPU support), but should work on other OSes as well.
Step 1: Install Ollama (API for Multiple LLMs)
brew install ollama
ollama pull gemma3:latest
You can try deepseek-r1:latest or even deepseek-r1:70b on more powerful GPUs.
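Both ollama pull and the HTTP API talk to the local Ollama server, so it has to be running first; with Homebrew it can usually be started as a service, or you can run ollama serve in a separate terminal. A quick sanity check is to query the API on its default port 11434. This is a minimal sketch assuming the gemma3:latest model pulled above; the prompt is just an example:

# start the Ollama server (assumes the Homebrew service; alternatively: ollama serve)
brew services start ollama
# ask the model a FreeBSD question via Ollama's generate endpoint on the default port 11434
curl http://localhost:11434/api/generate -d '{"model": "gemma3:latest", "prompt": "How do I update the FreeBSD ports tree?", "stream": false}'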
Step 2: Install Open-WebUI for a UI and Built-in Vector Database
curl -LsSf https://astral.sh/uv/install.sh | sh
DATA_DIR=~/.open-webui uvx --python 3.11 open-webui@latest serve
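Open-WebUI looks for the Ollama API at http://localhost:11434 by default, so with the setup above it should find your models automatically. If Ollama runs on a different host or port, you can point Open-WebUI at it with the OLLAMA_BASE_URL environment variable. A sketch with a hypothetical remote address:

# point Open-WebUI at a non-default Ollama instance (192.168.1.10 is a made-up example host)
OLLAMA_BASE_URL=http://192.168.1.10:11434 DATA_DIR=~/.open-webui uvx --python 3.11 open-webui@latest serve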
Now browse to http://localhost:8080/
Welcome to your own local chatbot!