
Show HN: Apfel – The free AI already on your Mac

Why This Matters

Apple's integration of an on-device language model in macOS 26 (Tahoe) marks a significant step toward privacy-focused, offline AI capabilities. The release of apfel makes this powerful technology accessible to developers and users, enabling new applications and interactions without relying on cloud services. This development signals a shift toward more secure, local AI solutions in the tech industry, with potential benefits for consumers seeking privacy and performance.

Key Takeaways

Apple ships an on-device LLM

Starting with macOS 26 (Tahoe), every Apple Silicon Mac includes a language model as part of Apple Intelligence. Apple exposes it through the FoundationModels framework, a Swift API that gives apps access to SystemLanguageModel. All inference runs on the Neural Engine and GPU. No network calls, no cloud, no API keys. The model is just there.
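For a sense of what that looks like in practice, here is a minimal sketch of calling the on-device model through FoundationModels. It assumes macOS 26 with Apple Intelligence enabled; the availability check and prompt are simplified for illustration.

```swift
import FoundationModels

// Check that the on-device model is actually usable on this machine
// (it can be unavailable if Apple Intelligence is off or still downloading).
let model = SystemLanguageModel.default
guard model.availability == .available else {
    fatalError("On-device model unavailable")
}

// A session holds conversation context; respond(to:) runs inference
// entirely on the Neural Engine and GPU.
let session = LanguageModelSession()
let response = try await session.respond(to: "Summarize Swift concurrency in one sentence.")
print(response.content)
```

This is the entire surface a wrapper like apfel builds on: create a session, send a prompt, read the response.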

But Apple only uses it for Siri

Out of the box, the on-device model powers Siri, Writing Tools, and system features. There is no terminal command, no HTTP endpoint, no way to pipe text through it. The FoundationModels framework exists, but you need to write a Swift app to use it. That is what apfel does.

What apfel adds

apfel is a Swift 6.3 binary that wraps LanguageModelSession and exposes it in three ways: as a UNIX command-line tool reading stdin and writing stdout, as an OpenAI-compatible HTTP server (built on Hummingbird), and as an interactive chat with context management.
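Because the server speaks the OpenAI wire format, any OpenAI client should be able to point at it. A sketch of both interfaces follows; the exact flags, port, and model name are assumptions, not documented apfel options.

```shell
# CLI mode: pipe text through the on-device model (invocation illustrative)
echo "Explain mutexes in one paragraph" | apfel

# Server mode: a standard OpenAI-style chat completions request
# (port and model name are placeholders)
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "system-language-model",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

The OpenAI-compatible endpoint is the interesting part: existing tooling built for cloud APIs can be redirected to a local, offline model with a one-line base-URL change.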