
My go-to LLM tool just dropped a super simple Mac and PC app for local AI - why you should try it


Jack Wallen / Elyse Betters Picaro / ZDNET

ZDNET's key takeaways

Ollama's developers have released a native GUI for macOS and Windows.

The new GUI greatly simplifies using AI locally.

The app is easy to install and lets you pull different LLMs.

If you use AI, there are several reasons to work with it locally instead of from the cloud.

First, it offers much more privacy. When using a Large Language Model (LLM) in the cloud, you never know whether your queries or results are being tracked, or even saved, by a third party. Running an LLM locally also saves energy: the amount of energy consumed by cloud-based LLMs keeps growing and could become a serious problem in the future.

Ergo, locally hosted LLMs.

Also: How to run DeepSeek AI locally to protect your privacy – 2 easy ways

Ollama is a tool for downloading and running different LLMs on your own machine. I've been using it for some time and have found that it greatly simplifies the process of pulling and working with various models. Although it requires serious system resources (you wouldn't want to run it on an aging machine), it runs fast and lets you switch between models easily.
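Beyond the new GUI, Ollama also exposes a local REST API (on port 11434 by default), so you can script against the same models the app manages. Here's a minimal sketch in Python using only the standard library; the model name `llama3.2` is just an example and assumes you've already pulled that model:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    # stream=False asks for a single JSON response instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama model and return its reply."""
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires the Ollama app (or `ollama serve`) running locally,
    # with the example model already pulled via `ollama pull llama3.2`.
    print(ask("llama3.2", "In one sentence, why run an LLM locally?"))
```

Nothing here leaves your machine, which is exactly the privacy argument above: the request goes to localhost, not to a third-party cloud.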
