Grove: Distributed ML Training over AirDrop
swar_ja
32 points
1 comment
March 25, 2026
Related Discussions
Found 5 related stories in 57.1ms across 3,471 title embeddings via pgvector HNSW
- Language model teams as distributed systems jryio · 87 pts · March 16, 2026 · 46% similar
- Autoresearch: Agents researching on single-GPU nanochat training automatically simonpure · 82 pts · March 07, 2026 · 44% similar
- NanoGPT Slowrun: Language Modeling with Limited Data, Infinite Compute sdpmas · 147 pts · March 04, 2026 · 44% similar
- Floci – A free, open-source local AWS emulator shaicoleman · 122 pts · March 21, 2026 · 43% similar
- Ollama is now powered by MLX on Apple Silicon in preview redundantly · 95 pts · March 31, 2026 · 43% similar
Discussion Highlights (1 comment)
swar_ja
Python library for training models across MacBooks over wireless. Uses AWDL for peer discovery and as a fallback transport, upgrading to WiFi when available, so it works on networks with client isolation (eduroam etc.). Run grove start train.py -n 4 on one Mac and grove join on the others. Uses DiLoCo for communication-efficient sync, with optional SparseLoCo compression (~30x) for slower links. A Swift sidecar handles the AWDL layer, since Network.framework isn't accessible from Python. macOS/Apple Silicon only, built on MLX.
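For readers unfamiliar with DiLoCo: each worker runs many local optimizer steps between syncs, and only the averaged "pseudo-gradient" (old shared params minus averaged local params) crosses the network, driving an outer momentum step. A minimal pure-Python sketch on a toy 1-D quadratic; the function names and hyperparameters here are illustrative, not Grove's actual API:

```python
# Sketch of a DiLoCo-style outer loop: H local SGD steps per worker,
# then average, then an outer momentum update on the pseudo-gradient.
# Toy problem: minimize f(x) = (x - 3)^2 with 4 simulated workers.

def local_sgd(theta, grad_fn, lr, steps):
    """Run `steps` plain SGD updates starting from the shared parameters."""
    for _ in range(steps):
        theta = theta - lr * grad_fn(theta)
    return theta

def diloco_round(theta_global, n_workers, grad_fn,
                 inner_lr, inner_steps, outer_lr, momentum, velocity):
    """One communication round: local training, average, outer momentum step."""
    local_params = [local_sgd(theta_global, grad_fn, inner_lr, inner_steps)
                    for _ in range(n_workers)]
    avg = sum(local_params) / n_workers
    pseudo_grad = theta_global - avg          # the only thing workers exchange
    velocity = momentum * velocity + pseudo_grad
    theta_global = theta_global - outer_lr * velocity
    return theta_global, velocity

grad = lambda x: 2.0 * (x - 3.0)
theta, v = 10.0, 0.0
for _ in range(200):                          # 200 rounds, 10 local steps each
    theta, v = diloco_round(theta, 4, grad, 0.1, 10, 0.7, 0.9, v)
# theta converges to ~3.0 despite syncing only once per 10 local steps
```

The communication savings come from `inner_steps`: with 10 local steps per round, workers talk 10x less often than lock-step data parallelism, which is what makes slow links like AWDL viable; SparseLoCo-style compression would then shrink each exchanged pseudo-gradient further.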