Apfel – The free AI already on your Mac

GitHub: https://github.com/Arthur-Ficial/apfel
Comments URL: https://news.ycombinator.com/item?id=47624645
Points: 297 | Comments: 57


The free AI already on your Mac.

Every Mac with Apple Silicon has a built-in LLM.

Apple locked it behind Siri.

apfel sets it free - as a CLI tool, an OpenAI-compatible server, and an interactive chat.

Apple Silicon · macOS Tahoe · Apple Intelligence enabled
Real commands.

Real outputs.

All running on Apple Silicon.

The AI is already installed on your Mac.

Apple ships it with macOS.

apfel just gives you a way to talk to it - from your terminal, from your code, from anywhere.

CLI tool, HTTP server, or interactive chat.

Pick the one that fits.

Pipe-friendly and composable.

Works with jq, xargs, and your shell scripts.

stdin, stdout, JSON output, file attachments, proper exit codes.

Drop-in replacement at localhost:11434.

Point any OpenAI SDK at it and go.

Streaming, tool calling, CORS, response formats.
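Streaming in the OpenAI-compatible format means server-sent events carrying `chat.completion.chunk` deltas (the standard OpenAI wire format; apfel's exact output is assumed to match it, per the compatibility claim). A minimal Python sketch of a client-side consumer of such a stream:

```python
import json

def deltas(sse_lines):
    """Yield text deltas from OpenAI-style SSE lines.

    Each event line looks like 'data: {...}' and the stream
    ends with 'data: [DONE]' (standard OpenAI streaming format).
    """
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Example with two chunks followed by the terminator:
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    'data: [DONE]',
]
print("".join(deltas(sample)))  # prints "Hello"
```

Because the format is the standard one, any existing OpenAI streaming client should consume apfel's stream unchanged.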

Multi-turn conversations with automatic context management.

Five trimming strategies.

System prompt support.

All on your Mac.
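The five trimming strategies themselves aren't spelled out here, but one common strategy - keep the system prompt, drop the oldest turns until the history fits the token budget - can be sketched like this (an illustration only, not apfel's actual code; the word-count token counter is a crude stand-in for the SDK's real counting):

```python
def trim_drop_oldest(messages, budget,
                     count_tokens=lambda m: len(m["content"].split())):
    """Keep the system prompt plus the most recent turns that fit the budget.

    Illustrative sketch of one trimming strategy for a small context
    window (the on-device model's is 4096 tokens).
    """
    system = [m for m in messages if m["role"] == "system"]
    turns = [m for m in messages if m["role"] != "system"]
    used = sum(count_tokens(m) for m in system)
    kept = []
    for m in reversed(turns):  # walk newest -> oldest
        cost = count_tokens(m)
        if used + cost > budget:
            break
        kept.append(m)
        used += cost
    return system + list(reversed(kept))

history = [
    {"role": "system", "content": "Be brief."},
    {"role": "user", "content": "first question about something old"},
    {"role": "assistant", "content": "an old answer"},
    {"role": "user", "content": "latest question"},
]
trimmed = trim_drop_oldest(history, budget=10)
# The oldest user turn no longer fits and is dropped.
```

Walking newest to oldest guarantees the most recent turns survive, which is usually what a multi-turn chat wants.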

Apple built an LLM into your Mac.

apfel gives it a front door.

Starting with macOS 26 (Tahoe), every Apple Silicon Mac includes a language model as part of Apple Intelligence.

Apple exposes it through the FoundationModels framework - a Swift API that gives apps access to SystemLanguageModel.

All inference runs on the Neural Engine and GPU.

No network calls, no cloud, no API keys.

The model is just there.

But Apple only uses it for Siri
Out of the box, the on-device model powers Siri, Writing Tools, and system features.

There is no terminal command, no HTTP endpoint, no way to pipe text through it.

The FoundationModels framework exists, but you need to write a Swift app to use it.

That is what apfel does.

apfel is a Swift 6.3 binary that wraps LanguageModelSession and exposes it three ways: as a UNIX command-line tool with stdin/stdout, as an OpenAI-compatible HTTP server (built on Hummingbird), and as an interactive chat with context management.

It handles the things Apple's raw API does not: proper exit codes, JSON output, file attachments, five context trimming strategies for the small 4096-token window, real token counting via the SDK, and conversion of OpenAI tool schemas to Apple's native Transcript.ToolDefinition format.
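The OpenAI side of that tool-schema conversion is well defined; what a conversion step has to extract can be sketched as below. The output dict here is a placeholder - Apple's Transcript.ToolDefinition is a Swift type, so this only illustrates the mapping, not apfel's code:

```python
def to_native_tool(openai_tool):
    """Flatten an OpenAI tool schema into name/description/parameters.

    The returned dict stands in for a native tool definition; the real
    conversion targets Apple's Transcript.ToolDefinition, a Swift type.
    """
    fn = openai_tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "parameters": fn.get("parameters", {"type": "object", "properties": {}}),
    }

# A standard OpenAI function tool (the hypothetical get_weather is
# just for illustration):
tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}
native = to_native_tool(tool)
```

The `parameters` field is plain JSON Schema on the OpenAI side, which is why a generic converter like this is possible at all.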

Shell scripts in the demo/ folder.

Install apfel first, then grab the ones you want.

Natural language to shell command.

Say what you want, get the command.

Pipe chains from plain English.

awk, sed, sort, uniq - generated for you.

Narrates your Mac's system activity like a nature documentary.

Explain any command, error message, or code snippet in plain English.

What's this directory?

Instant project orientation for any codebase.

Summarize recent git commits in a few sentences.

Change one URL.

Keep your code.

apfel speaks the OpenAI API.

Any client library, any framework, any tool that talks to OpenAI can talk to your Mac's AI instead.

Just change the base URL.
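With the official OpenAI Python SDK that change is one constructor argument (`base_url`). The same idea with only the standard library - building the request without sending it, since sending needs a running apfel server, and the `/v1/chat/completions` path is assumed from the OpenAI compatibility claim:

```python
import json
from urllib import request

def chat_request(base_url, messages, model="default"):
    """Build an OpenAI-style chat completions request for a local server.

    The model name is a placeholder; apfel serves the single
    on-device model regardless.
    """
    body = json.dumps({"model": model, "messages": messages}).encode()
    return request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Point the request at the local apfel server instead of api.openai.com:
req = chat_request(
    "http://localhost:11434",
    [{"role": "user", "content": "Say hello"}],
)
# request.urlopen(req) would send it once apfel is running.
```

No API key, no network beyond localhost - everything else in an existing OpenAI client stays as it was.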

From zero to 507 stars and counting.

Two viral spikes: 123 stars on March 31, 295 on April 3.

The first public release of Apple's on-device LLM as a command-line tool.

Two commands.

Ten seconds.

You're in.

Tools built on top of Apple's on-device AI.

Native macOS SwiftUI debug GUI.

Chat with Apple Intelligence, inspect requests and responses, logs, speech-to-text, text-to-speech - all on-device.

Source: This article was originally published by Hacker News
