Apple Can Create Smaller On-Device AI Models from Google's Gemini
thm
25 points
7 comments
March 25, 2026
Related Discussions
Found 5 related stories in 54.0ms across 3,471 title embeddings via pgvector HNSW
- Gemini 3.1 Flash-Lite: Built for intelligence at scale meetpateltech · 51 pts · March 03, 2026 · 63% similar
- Gemma 4: Byte for byte, the most capable open models meetpateltech · 21 pts · April 02, 2026 · 61% similar
- Google releases Gemma 4 open models jeffmcjunkin · 1306 pts · April 02, 2026 · 58% similar
- Gemini 3.1 Flash Live: Making audio AI more natural and reliable meetpateltech · 12 pts · March 26, 2026 · 57% similar
- Apple AI servers unused in warehouses due to low Apple Intelligence usage _____k · 85 pts · March 02, 2026 · 54% similar
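The related-story lookup described above (title embeddings searched via a pgvector HNSW index) could be sketched roughly as follows. The table and column names (`stories`, `title_embedding`), the embedding dimension, and the use of cosine distance are illustrative assumptions, not the site's actual schema.

```python
# Hypothetical sketch of a pgvector HNSW related-story query.
# Assumes a Postgres table stories(id, title, title_embedding vector(384))
# with an HNSW index on the embedding column; all names are illustrative.

def related_stories_sql(limit: int = 5) -> str:
    """Build the similarity query; `<=>` is pgvector's cosine-distance operator."""
    return (
        "SELECT title, 1 - (title_embedding <=> %(query_embedding)s) AS similarity "
        "FROM stories "
        "ORDER BY title_embedding <=> %(query_embedding)s "  # HNSW index serves this ORDER BY
        f"LIMIT {limit};"
    )

# The supporting index would be created once, e.g.:
# CREATE INDEX ON stories USING hnsw (title_embedding vector_cosine_ops);
```

Ordering by the distance operator lets Postgres use the HNSW index for an approximate nearest-neighbor scan, which is what makes a sub-100ms search over thousands of embeddings plausible.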
Discussion Highlights (3 comments)
MBCook
Wouldn’t it be interesting if Apple provided different models to different iPhones? So due to hardware capabilities the iPhone 20 Pro gets an X-billion-parameter version but the regular 20 only gets (2/3 * X) billion? That would provide an interesting point of hardware differentiation between the regular and pro models, as well as between each model year.
ZeroGravitas
Does it make sense for a single model to be used for all on-device LLM tasks, or for each app to provide its own customized one? My gut feeling is the former, but I'm not sure if that's actually true.
jerrythegerbil
The announcement of FunctionGemma, the announcement of Apple partnering with Google’s Gemini, and now Apple can create smaller on-device AI models. It’s been clear since December of last year what the planned trajectory and partnerships would be.