Latest Tech News

Stay updated with the latest in technology, AI, cybersecurity, and more

Filtered by: ollama

Finding thousands of exposed Ollama instances using Shodan

The rapid deployment of large language models (LLMs) has introduced significant security vulnerabilities due to misconfigurations and inadequate access controls. This paper presents a systematic approach to identifying publicly exposed LLM servers, focusing on instances running the Ollama framework. Utilizing Shodan, a search engine for internet-connected devices, we developed a Python-based tool to detect unsecured LLM endpoints. Our study uncovered over 1,100 exposed Ollama servers …
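The paper's own tool isn't reproduced in the excerpt; the following is a minimal Python sketch of the same idea, assuming a Shodan API key and the shodan and requests packages. The search query and the read-only /api/tags probe are illustrative choices; Ollama's default port is 11434.

```python
# Minimal sketch: find hosts Shodan has indexed as Ollama and probe them read-only.
# Assumes `pip install shodan requests` and a SHODAN_API_KEY environment variable;
# the query string is illustrative, not the exact one used in the paper.
import os
import shodan
import requests

api = shodan.Shodan(os.environ["SHODAN_API_KEY"])

# Ollama's HTTP API listens on port 11434 and greets with "Ollama is running".
results = api.search('port:11434 "Ollama is running"')

for match in results["matches"]:
    host = match["ip_str"]
    try:
        # /api/tags lists installed models; an unauthenticated 200 response
        # means the instance is publicly exposed.
        r = requests.get(f"http://{host}:11434/api/tags", timeout=5)
        if r.ok:
            models = [m["name"] for m in r.json().get("models", [])]
            print(host, models)
    except requests.RequestException:
        pass  # host unreachable or filtered
```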

This is the fastest local AI I've tried, and it's not even close - how to get it

ZDNET's key takeaways: the gpt-oss:20b model is very fast; you'll get blazing-fast answers to your queries with gpt-oss:20b; and with the latest version of Ollama installed, you can use this model. Let's talk about local AI and speed. There are a lot of factors that go into getting the most speed out of your AI, such as whether you have a dedicated GPU, the context length you use (the smaller, the faster), the complexity of your query, and the LLM you use …

Topics: 20b gpt ollama oss use
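To illustrate the context-length trade-off the article mentions, here is a minimal sketch against a local Ollama server. It assumes gpt-oss:20b has already been pulled (for example with `ollama pull gpt-oss:20b`); the prompt and the num_ctx value are illustrative.

```python
# Minimal sketch: query gpt-oss:20b on a local Ollama server and time the response.
# Assumes Ollama is running on its default port and the model has been pulled.
import time
import requests

payload = {
    "model": "gpt-oss:20b",
    "prompt": "Explain what a context window is in one paragraph.",
    "stream": False,
    # Smaller context windows generally respond faster, as the article notes.
    "options": {"num_ctx": 4096},
}

start = time.time()
resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=600)
resp.raise_for_status()
print(resp.json()["response"])
print(f"Elapsed: {time.time() - start:.1f}s")
```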

Turn your Mac into a local ChatGPT with OpenAI’s new free model

This week, OpenAI released its long-awaited open weight model, gpt-oss. Part of the appeal of gpt-oss is that you can run it locally on your own hardware, including Macs with Apple silicon. Here's how to get started and what to expect. Models and Macs: gpt-oss comes in two flavors, gpt-oss-20b and gpt-oss-120b. The former is described as a medium open weight model, while the latter is considered a heavy open weight model. The medium model is what Apple silicon Macs with enough resources …

Topics: 20b gpt model ollama oss
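Once the model is installed (for example via `ollama pull gpt-oss:20b`), a short chat can also be scripted. This is a minimal sketch using the ollama Python client, which must be installed separately with pip; the prompt is arbitrary.

```python
# Minimal sketch: chat with gpt-oss:20b through the ollama Python client.
# Assumes `pip install ollama` and that the model has already been pulled.
import ollama

response = ollama.chat(
    model="gpt-oss:20b",
    messages=[
        {"role": "user", "content": "Summarize the difference between gpt-oss-20b and gpt-oss-120b."},
    ],
)
print(response["message"]["content"])
```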

My go-to LLM tool just dropped a super simple Mac and PC app for local AI - why you should try it

ZDNET's key takeaways: the Ollama developers have released a native GUI for macOS and Windows; the new GUI greatly simplifies using AI locally; and the app is easy to install and lets you pull different LLMs. If you use AI, there are several reasons why you would want to work with it locally instead of from the cloud. First, it offers much more privacy: when using a large language model (LLM) in the cloud, you never know if your queries or results are being …

Topics: ai app gui ollama use

Ollama's new app

Ollama’s new app is now available for macOS and Windows. An easier way to chat with models: Ollama’s macOS and Windows apps now include a way to download and chat with models. Chat with files: the new app supports file drag and drop, making it easier to reason over text or PDFs. For processing large documents, Ollama’s context length can be increased in the settings (note: this will require more memory). Multimodal support: building on Ollama’s new multimodal engine, images can be sent to models …
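The same multimodal flow is also available through the API. Here is a minimal sketch using the ollama Python client; the vision model (llava), the image file name, and the num_ctx value are all illustrative assumptions, with num_ctx mirroring the context-length setting the app exposes.

```python
# Minimal sketch: send an image to a vision-capable model via Ollama's chat API.
# Assumes `pip install ollama`, a pulled vision model (llava is illustrative),
# and a local image file named photo.png.
import ollama

response = ollama.chat(
    model="llava",
    messages=[
        {
            "role": "user",
            "content": "What is in this picture?",
            "images": ["photo.png"],
        }
    ],
    # Larger documents or images may need a bigger context window, at the cost
    # of more memory, just like the setting in the app.
    options={"num_ctx": 8192},
)
print(response["message"]["content"])
```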

Show HN: Elelem, a tool-calling CLI for Ollama and DeepSeek in C

Elelem Chat Client with Tool-Calling Support: an interactive, C-based command-line chat client that enables AI models to execute real-world actions through a comprehensive tool-calling system. Quick Start prerequisites:
Ubuntu/Debian: sudo apt-get update && sudo apt-get install build-essential libglib2.0-dev libjson-glib-dev libsoup2.4-dev libreadline-dev
Fedora/RHEL: sudo dnf install gcc glib2-devel json-glib-devel libsoup-devel readline-devel
macOS: brew install glib json-glib libsoup readline

Topics: ai api ollama tool tools
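Elelem itself is written in C; purely as an illustration of the tool-calling flow it builds on, here is a minimal Python sketch against Ollama's chat API. The model (llama3.1), the toy get_time tool, and its JSON schema are illustrative assumptions, not part of Elelem.

```python
# Minimal sketch of a tool-calling round trip against Ollama, in Python rather
# than Elelem's C. Assumes `pip install ollama` and a pulled, tool-capable model.
import ollama

def get_time(timezone: str) -> str:
    """Toy tool: return a canned answer for the requested timezone."""
    return f"It is currently 12:00 in {timezone}."  # placeholder result

tools = [{
    "type": "function",
    "function": {
        "name": "get_time",
        "description": "Get the current time in a given timezone",
        "parameters": {
            "type": "object",
            "properties": {"timezone": {"type": "string"}},
            "required": ["timezone"],
        },
    },
}]

messages = [{"role": "user", "content": "What time is it in UTC?"}]
resp = ollama.chat(model="llama3.1", messages=messages, tools=tools)
message = resp["message"]

if message.tool_calls:
    # Run each requested tool and hand the results back to the model; this
    # assumes the model fills in the declared "timezone" argument.
    messages.append(message)
    for call in message.tool_calls:
        result = get_time(**call.function.arguments)
        messages.append({"role": "tool", "content": result})
    final = ollama.chat(model="llama3.1", messages=messages)
    print(final["message"]["content"])
else:
    print(message.content)
```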

Show HN: Tool to Automatically Create Organized Commits for PRs

Git Smart Squash: use AI to transform your messy commit history into clean, logical commits that reviewers will love. Why use Git Smart Squash? Ever spent 30 minutes reorganizing commits before a PR? We've all been there. Git Smart Squash uses AI to automatically organize your changes into logical, well-structured commits in seconds. What it does. Before (your typical feature branch):
* 7f8d9e0 fix tests
* 6c5b4a3 typo
* 5a4b3c2 more auth changes
* 4d3c2b1 WIP: working on auth
* 3c2b1a0 updat…
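The excerpt doesn't show Git Smart Squash's actual implementation or command-line interface; purely as a rough sketch of the underlying idea, the snippet below collects a branch diff and asks a locally hosted model (served via Ollama here, an assumption on our part) to propose a commit plan. The base branch name and model are illustrative, and nothing is applied to the repository.

```python
# Rough sketch of the idea only, not Git Smart Squash's implementation:
# read the branch's diff and ask a local model to propose logical commits.
import subprocess
import ollama

diff = subprocess.run(
    ["git", "diff", "main...HEAD"],  # base branch "main" is an assumption
    capture_output=True, text=True, check=True,
).stdout

prompt = (
    "Group the following diff into a few logical commits. For each commit, "
    "give a one-line message and list the hunks it should contain.\n\n" + diff
)

resp = ollama.chat(model="llama3.1", messages=[{"role": "user", "content": prompt}])
print(resp["message"]["content"])  # proposed commit plan; applying it is left manual
```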