
Here’s why I ditched ChatGPT and moved to local AI


Dhruv Bhutani / Android Authority

I was one of the first people to jump on the ChatGPT bandwagon. The convenience of having an all-knowing research assistant available at the tap of a button has its appeal, and for a long time, I didn’t care much about the ramifications of using AI. Fast forward to today, and it’s a whole different world. There’s no getting around the fact that you are feeding an immense amount of deeply personal information, from journal entries to sensitive work emails, into a black box owned by trillion-dollar corporations. What could go wrong?

There’s no going back from AI now, but there are use cases where it genuinely works as a productivity multiplier. That’s why I’ve been going down the rabbit hole of researching local AI. If you’re not familiar with the concept, it’s fairly simple: it is entirely possible to run a large language model, the brains behind a tool like ChatGPT, right on your computer or even a phone. Of course, it won’t be as capable or all-knowing as ChatGPT, but depending on your use case, it might still be effective enough. Better still, no data leaves your device, and there’s no monthly subscription fee to consider.

And if you’re concerned that pulling this off requires an engineering degree, think again. In 2025, running a local LLM is shockingly easy, with tools like LM Studio and Ollama making it as simple as installing an app. After spending the last few months running my own local AI, I can safely say I’m never going back to being purely cloud-dependent. Here’s why.
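To give a sense of how simple this is under the hood: once Ollama is installed and a model has been pulled, it serves a plain HTTP API on your own machine. Here’s a minimal sketch of talking to it from Python. It assumes Ollama’s default local endpoint on port 11434 and a model named "llama3.2" (the model name is my assumption; use whichever one you pulled):

```python
import json
import urllib.request

# Ollama's default local endpoint; requests to it never leave your machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON reply instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_llm(model: str, prompt: str) -> str:
    """Send a prompt to the locally running model and return its reply."""
    data = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (requires `ollama pull llama3.2` beforehand):
# print(ask_local_llm("llama3.2", "Explain RAM vs. storage in one sentence."))
```

That’s the entire round trip: one POST to localhost, one JSON response back. No account, no API key, no data leaving the device.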

Have you considered running a local AI on your computer? (37 votes)
- Yes, I'm already using it: 24%
- Yes, I've considered it: 38%
- No, I didn't know it was possible: 24%
- No, I don't want to; ChatGPT and Gemini work for me: 14%

Privacy


We’ve all pasted something into ChatGPT that we probably shouldn’t have. Perhaps it was a code snippet at work that you were trying to make sense of. Or perhaps a copy of a contract, maybe some embargoed information, or even just a really personal journal entry that you don’t feel comfortable exposing to our corporate overlords. Every time you hit send on a cloud-based AI, that data is processed on an entirely opaque server that will inevitably use your data for the greater good of AI-kind.

Here’s the deal. Alongside my journalistic endeavors, I run a business where I’m regularly exposed to NDA-protected information. Beyond the obvious privacy risk, it would be illegal for me to share this information with a public AI tool. Running a local LLM flips the script entirely. I’ve tried many tools, but these days I’m testing out AnythingLLM. It’s a fantastically simple desktop tool that lets you chat with your documents entirely on your own computer. That lets me feed it tax statements, invoices, bank statements, and even NDA-protected documents, and ask it to summarize expenses or flag clauses I should keep an eye on. Because the entire LLM is running on my computer, I know this data isn’t being beamed to offshore servers for processing, and I get fast AI-driven analysis of data that is strictly not meant to be shared. Effectively, it’s like running the enterprise version of ChatGPT or Copilot, but on your own computer and with little to no cost attached.
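The core pattern that AnythingLLM wraps in a friendly interface, load a document, hand it to a local model, ask a question, can be sketched in a few lines. To be clear, this is my own illustrative sketch, not AnythingLLM’s actual code: it naively stuffs the whole file into the prompt, assumes the same local Ollama endpoint as before, and uses a hypothetical "contract.txt" file and "llama3.2" model:

```python
import json
import urllib.request

# Local-only endpoint: the document never leaves this machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_doc_prompt(document: str, question: str) -> str:
    """Embed the document in the prompt so the model answers from it alone."""
    return (
        "Answer the question using only the document below.\n\n"
        f"--- DOCUMENT ---\n{document}\n--- END DOCUMENT ---\n\n"
        f"Question: {question}"
    )

def ask_about_document(path: str, question: str, model: str = "llama3.2") -> str:
    """Read a local file and ask a locally served model a question about it."""
    with open(path, encoding="utf-8") as f:
        document = f.read()
    payload = json.dumps(
        {"model": model,
         "prompt": build_doc_prompt(document, question),
         "stream": False}
    ).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (hypothetical file and question):
# print(ask_about_document("contract.txt", "Flag any exclusivity clauses."))
```

Tools like AnythingLLM go further than this sketch, chunking and indexing documents so that large files fit within the model’s context window, but the privacy property is the same: every step happens on localhost.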

Grammar checking

